[Binary tar archive (ustar format, owner core:core) — gzip payload not recoverable as text; only the archive listing is preserved below.]

var/home/core/zuul-output/
var/home/core/zuul-output/logs/
var/home/core/zuul-output/logs/kubelet.log.gz  (gzip-compressed kubelet.log; binary contents omitted)
M?+8ѝ﨑~;9Z4SZ㸸KY/rMZ4[ί'-Re +*AY]$R7 ^دI^*OyK]ϥ[Z/mT"AGL|9떰11wUp4yG\Mj# y4UOy:Z}|m/'e46w]C?Ye`=O=]/l{ć1e#wj@~čmt+;ޭtC%QJ$sFelOyh@?yt:"tՉ?"kUw|f56xV$MHē(b1NW+Y$/ Jy>D[gv7wo%S7Q5]ssi—ֿր(_k9ׅCΪ.Xf=RGdCa9Cz!=􉦇">P#9O 6=ioǒe/,$ЧĘ"}g!)r(ĈD4P`NgD!0† 4f%@ AHRVa X1cײ7&Z܊a 1p6WU7ƱOizqNvf -}] GuMI"5]W]oFE%] {ȕ1 $AhQ0ڵy1Ze7s]{kvU{Y&b&=ϵ\~E+ȳ|Mބ~nQM:)Ps2I՝+<xꍻTW?w.wHZOKMT7EZ̋RO $ e`^uZlsI$qwR69?Iq|ߛqHa?O0{>SxvG\ߟ߾ Z Ca7( a] D"v BXgj8\'gR|\A)}g)x[FUEmmy \鲼8Cwol^gY[|z&1e7`QM3zÒ7?Jc\7s 63?LAMu`/J?/o=UTio 4\֘" ,bm%6Nl^>'K^0~ e?>+4|=L RdRE"a|_GB+.LkL= :qAiN R(r6 CDRهtʼDJJHf3躠)L̹8YVA3XEd b·l6hT;=S I 9t>t%XtХtK({Trڕ]%?vG 'hI|WI?v\܏ؤL5 KB(9ɢ]'՜bw;mn+d_÷}=ofdd'ٜɮyUax%Ş3ۻ۞߿.Byw)N]'phmnN~)M32x܅9nKQZeq^̠S wA|!{Y嶽l]qz[]/͠zRZ%U6ÿb<>3Y\O?mA>KbA} |6t9L؈$w@=E$;YsoȄ&"Q&D+A-tO SHw zprָftM;7>g0O 0bJcD2Y+냉KMhA #tG_c"ΆW;ۂ%C.7:p32l6nw%KJ5h&AgNXEeȱ[iA{2 ov$o|ߪ~OQ0[FIa&!4{Gyu+nq_Of1grH8 YT=,ݵu1*&&VQp.($< t FXp46sk1WϤGPђ6ك>?_'>_x(ëe ߊzO$q$1>"O0SGGG=J5_C129X_`+{լUڴ7Q (Y9L(* KaU7J&n`B`&*/bNcJR@-(%"*yhlDق{{bpR<xMW7 .&CRd.[ܠf<*ltwˇ?F)`kDH 3ͬT -By͕nG?gTG+b>B>VDBB]M2N 6 tO)`/N֠yZx  )4Ey`#T[1b1PLq(|ͷ'0eI՗4#+(?zԈ U*3"iTYtm/2Aq,$`,fڻ8FX#w(d1_-i+߲Zry$"F "G|g B99Pa^ġ@tΓu /s[,hq>18~E)#@F&WB#spSe: Q LqR}cT?%ĺ(ZGC iͮc ?Ȇ&x".E>LلGݐcy?%YO²dYiX )jE41 B7(ٻQ2{G۞wzwu~A: :r_O;\4ۢHU#-x0 FL Ro )Y۟쌮_e\͇S7ss |*=”X6Zŏao\_-+ݪFJ( L3PX5HLn7'br}!md1٨)-6s;'É ou]Glҕ:ZW%gy]--A}Vqo6{ v)wYVSɪشۣy.gұ"2(bQbRe| 03LJS5 3՞A,* *|zCpMm7 Lt$8r3X@U=M3h7#ުކ%aEv튁&jNV9p Cxk;<>B $(4ԩ ZQBJJ-Q !,$iA1Q@'Z}#s0!/<^{R>2jbD䀲='aJT%H"vRJ>z: LȜ&ĠDqQ"熓=y!I%D$/Tj BERcib$&M7)p֋Z%aHS7gOtW'&yӫi9j̀ ȹ̥4.rRix`,(3xn2 F$ŤMK9VG<泭Zb8/A0M?S9l1ג;cd"Jd5Xb+t@Hm t/^ wE%KɍUa@clC:$oF+mi^T޼6'$B}__[}ñH: vD`BT|O2|,s5){R_bwz&ޔ3tl\QsrRbr)h[]Vނ*Ңr18J#Er\DR=`7:6 ɐTdgqgǐlZEC3]]]]*␠J Eˀ'a'$%g Dx,ޑ_F|Kbg_zs2Xb9eۓ;1Q?ƷUXru}*G˜ &c~ )2@JԒ)E 0Ov.vEpiJ,Q26eIhz}8i ۨ\tyor`~6]b)]hw]:t!vǔ ɇ^YQ\9v>(esӽ,(h_oN6Gɼ3bfA\ W77H$h"P3DDR3= r+=AGT1 J~~@~2iQaNKf-D~! @r(f_Q̙5lnҭ@3B#ARP. \ǀG'McB)7j<l#"z;X+Tk%O,5e#gi^ }EiG o>|ܟ>RM}rB`d>7mx|^Q?3bR֥ LKRZ sPz9<(!r`ˁe4B2}Hq2 6 K+N03PH!FVqLG ezwZ D佖Ul5ͭ09 /պ߶:9Z}T{_ý:`kd*)d. Wg=Nos^7Kü-l9[ys<퍟ϑ=2fak&t;K|?3wt'9զw'l;sW:^[^p>klG ?BHlǟ&9R+RB4TRjصLt&zG'X(H7#N 2vbAf7ZH=_]Aٹ9G )[ReɄؕQifDz#IA$lKK%)b2ϳV8XKC+I) i+dBF9;>Yd*~kRH:B>?G68 q$De f-"x _ 50$5AZXnqF9N]`$9ǰ[)rW"gB W*=wUɱW-~@rޣ ˪hPdAJSdE$aHTZڈAf U3 %6#TOp).}Ԗx93Tn͘1a2IƶPfօ ^Y}r=MxI*aã%V\Ҵݻvk05v8vCL{gq`N+=f*RH*8{'CȀ 3P&ҁ*\2mVt`ZD"rR従v;~<M:Ek˓>!ؕjii"ǠIBhMNHj0,QY*>\H,BdXю DEGQc}904&>Fz}X>I1F%jDY#FAz *tBvn}X=bm"*iAB*jVˈ%FVP͂K`ūp8Kǝ55ZH y" '3(2o (FJ g@^(+ ~ZHy6Ayy w1Tߝq Hth<"WaXt*_U@Ǒ)àg\x~|0g].'+y?? O <؇bΘLArd3HthaY]IA6rgf07l0%xsں= @P&o\:$UXLnCZcuԹ(t|Ň w~|aN?/ETd%FU|x;Q DTuw>l\q& ؍SAoaNS(UI{c]ԯSg~SzYpsc>:ݺyK3"Λat&m вfft'tnf-,D:JGN=ٺi W cWV:^bA:2 5: >]jv+[>c(лQ2nթջq/o?|ۗ>|#&受vR%XOH,(jd~(~~Z`Stx9Ug`nQ~ S;[X3c~kΖ]x_i5ù]SŚtsz&C^_Mfү Q!¯&~u6.)1[Se7F2J2㲃#JaL,4GJQ&ϗ:fpK(5ƧmI T:i[k(4#N -qN8a;_!D'ړQ#cMXVKCx%ӭ!JÑ19*8l^'tQ2P#T9 V!TCA%P6LlyFmiۣxuܠS x }P,C`'<4CMT֖)<􉅇pt:}۩\\Ԧ3{1wծ=&z#O\"ͼ-LȬԆ2xgPP)ϑo dEM/Ab~ڀ"(+^=* ےiTjkS=TƯ?/6XLCp@aMja2ʤL}(~|XG%U}4G%h?+M 0 NZ홽zBW( _*޿,n^3UBKU|GkA_ ,ϊ'g@ucFlKSfG"HE B©./߂:k= LX7[2TG J|jntraDaD`1rX,QIOX)b1Jbb5#X-bo!n;V~u{SF3np0%sH[]FEtbeP庞k{[Ƣc\W`M( ز:^Λl DkBUǵk83 ?hOs5}@]{ݴjt\{ɱ8H(򥖜NAfxhN8L0+^;L^>WNyK:e0B+]]55)W.c/4Fu{m,D:2|7,YFM/gkvg#26Hv6S~"P>yߘ^z*I%KghC=eցJ{ӧ#xY Yd~M>54i+ǦzŘ~Qz6泷aYnr?T{i^yiL͜r[X] z=(Yxጮs7 2 aj 0݋fV. 
@49fኜl#7Hba xPp"WMlVC@%'00cRW@0aן=uRt*Q)I]=Au%J+}4 JR}*Q)I]=EuJ$-)3YիgE}*su_jMLܛtl7:%>@*3^8њ ak }8fWrVD Bxچ S0ŴFX.i{uXQqMmH*+^$"Y;^Eo бvFÂޕ0^]Taxy+ : ,Z%]4+P6&rSjCu)N<9/M[CN=0RzgOmUĕ{B#3NhˬtKg83:i ޑ};ՌM(VB5b̒ R=)`@eb^~38Z:LJk9kltag.4BuЅ[Յ32>YMRNp]fQ7\?Tr5vVq''%#fY PsyX֠4:(Ɲш]Ǧ;ֈxЈx/rc<>P8â ێR$AZd0uzשFӆdJ'۬ рEB"ȍL!iJ%c9kOg^'8&µKθdX.BxzA/̨tS (RbPX\zZ J!Fÿ́ ;ˢM +3}8T,[fX8]gNJߩ #;?Q(YG?ޟǵJa݇=6<$Tqۢ.&oٵ}ߗx Z )K>&df̅He ~q[[|,ˋZW2'껝o@I(wѢ&'h2f+P f%J7 >EAGo${;A}ϟuN$ BfEFj=h]b% dJT!\mf5IYMҷ=Pa.sWjMUю3_"[-<8EQ{ɥ jR:%(! kC3h4"No0A*RB(Ɓhx*阄yOhY3rN3>ߕ䲹~>-T*0FZZ >~q>mHv!wΝ@];r[CVZB8Ũsg(ǂ 6*)i#\̑IB/: -hɚ:ë٥]:AS c dDr, IDr$IKU$JNX314e4vIXᩅuI xKaWdn8ZZHJ, Kx:)x8Ѵ doQɀ*C< 5h(!jsԭ m#;#"Z?|O?5;V AE~l\-ˆ_fOu֔U"M{'>0" Ku ޯ͟.Ty)DLd[o LרThcsc\2q|Ɏ- c7 ߿9I ^&Ϧy.0K%#^ 6qZgYlm3fCHXuxUWʽ?Ǽ7ln 1gPMVP#0SzʤoQ(178-_nl __:1\[V C4tjԪUO6|ymKnf6o A!63u/I[)bė]G\/lŜ98+J(}yk~v1,d?c mNdϲHSG(U0V LB.1.<{P\BrWg: 6p2F[^E|G\--D^VOo+ܛ~Dd{үPv_ҊCK?aK?4WOgjnCB'@] [Úyr~ wĂ78k9>OTAyULXj0'_!/Jd'tsKB,l gb1ͬނ!ud@&#Uv*VbS YBg"xncum7$B[MΏ_nN 218=CuÊu]*w(+0=|{S3欇p\:S֙{br_,le ReF5@L{qĴ:^ y.pJLk2iϷ>_EГQK$ٲfyJ:&F@t!* IN$P_4s[5Gg \ f,!?IS+[i4a[c0=j[y@ 椖઱}|͢!6+Hif@]}{vHxF;юm)~#Z<)56#4g5wAS8٨'v SegC)ɲA 9}E2l >IP,CA]FgwefnYNH Yr.<+r.{;kg-wru.18]Q;W7E5dACvK{W;WG?6u^7IDto̡vv~XA;Rgp?\^en Kb(r2*_ xR Rd(=> [>oktCՃЃjڔa׼ߓ!܊n5[?@'~oGχ@+TA}đ_KF$ qGV)ӻ)-[fw{u;Ic]"']~> h?~_߼Sٕݨcބ17o^Jr^/dSx'7Bx?)ͳ–j>7e{wrtrGPicězI2IR$"JՌ LU3֪#9./oS{4QǽKyJdA T/౤Bjџ/o_.^EpFb,NbKn:W7=qu }o uu 'vteF4krFdн[r;_G~jII ]IU+V^y[eJ MƒororX-գ2&Nnl.ih.i0b Rad:,O1Ld,ZIk嵃ZnepiV̼Y\O t1} TzM\>c۫o/AK~K텓rܗ9X.NnwI_}X<si!'Y'k;f \d$&" aw0 ;ݺs7ڱmENDBڻ%Yft~06@7wQ3f>3:nťE>PmԿ:8!Vʹڡۍ~BZ|[$;]Jp5pݚqSIe+9gVS & GNsm.Ip D>q1qC Z:X;#a ^igj8綂Lg{+_Mv\qO4ΑeR_*:@LN# rE, jqG#3c+Fc1hA'.hì=Wݶ$Jq!rAZ"Jyΐ5 EĐ΢c۳3r=ciɾw|DbpSGU)w. 'v6sLϘ?^#eq\{~97m $C~ yBdIJI ]=lkזlÙ!9$C?Syˏi%5$Hez E뽥jT9 VQ&|'D"$Zɸ `؃`>h?_,THc4mD'QhUNVÃe3:[Tv`4怷)bM]lfՔ5HZS iu_[Grϕ{8_qqMURO2So"d##SMC-s 9Ovw bjrA7u·GΏXotK#@SE 86$8$~iCW-ίx̅~5T+"+Ʃ 7S>O"f!|Wdl~Nx׻PtA鼋\,ć2Z~-f u (Wϛ Tan 6ʎ4[`F 2Ȧ2eZhYyoJltɫϫٚ.s=rMAqK3:4ʁI5_tR`aŇRXǗI]5 O`Ln>ϋ0'}H2WΦӞϞ<>3)ַB)Kb BqVAr  Etm3@I2,{S&IŗgWwJfNgTnoZkN9LjR"W^bxs;[\=>'sOC"Wa S~r? 1!ɩ0"g2,`!H#!퉄~0j.m%8[07{ x^JyB-_{ʹtfp PkjY|a}t:{VJO~5Olj6z8ْ~o"Ge'Rfw}aV^1!M >EI~%tkۼnP7Kf֌mZB$h\7_jo_I^LEf@Y\p '68x`f)&Q=&jr=С٠zeJ %رr7^3Au=Y1o 1"U 'ot4 vvK~DWGwT Z0%NnHԊr)5, D)QXG `*Ӗ,ZgH/QMXqi802l APʃ.JNjITzD.`jOG~y}[k^_ƍ[Y/GeMm{fmQ/"hL n;gy[lVUgիC ޒG?as0R_f]hJj^7%EUAy%jzS p簉Y0AQ M,gbGdTaÀ%2y$TG+o"X tXXޙJȋ;,;E/#k?۪(R YCQ3U_V@cy-; N.TLF[c(!fH jRz\TknXe[ nCvi&13ayj7dbXW#ܱp[;mTn]Y&eiG{ҎEJߏ?m[PB䌂|CsKFD§N9گrS@ ^9Fa_OuqF:('{fo k%dxx`:b EXshc4y?^~T,z J1ў2␠J Eˀ'a'$esGʎy[aZ{F z4"ݿka^rV距}'t~EE!aWSe|-E!zQȄP]Qo(rZs9s.{ק˜8Oƣ~gVq# $g`Zr;Afx疹[E+Vd]A(y@A9z6*W]ޛ(/BIilCWp%e>eCwe׿ \9͠à`}U2t[&>8f٦mipUٚ}|p[4'Hj6X~K5oVY,@JE(zI""7}0Y/ZP@2ih 5(O+Lw }(XZϵأkqf qn51 :rQh,t:<<)ݺsL(QmGs tdXDA.sGk+bD hl5:]77PPqHk^'9 >E8՛`Ia\=7rڇկ&ZPsIAZ63-In%h=;U `Ny\s9X);\\ >J$8HqJ%T'SUH!"] P`B<UX-л3jX$",`hnDΖ*|H]/ &9pqʾkʋ^fZC0õ[ܐ۷W\&x4zޅ=MJuS1 $עVaK#Y#SI!@X%h_z8Ϋ-. slʶqn[>r==5fo7Ȯ~ vaW7&\7Ý.ҨѝlnN,mՑ;W:k˛r!Y˖[g6W"Ezt|Ezt]GEzt*}T*{ a-- Kv`jXu1R\.0.0 LӺ.00c.0 LӺ.0 LӺ.0 LӺ.0 Lf ~_gUsT, E>DQ0׭$H)'==˥^,R;uݤnjzҸ_||Ez)Qk셥 ǰ1NhaR"eYk"JnɓλzqS/nM7lhϛ3K#e (ÙrIނ5\#r&CBb$YޥFqǓ)7տvޔK~@tz c2_ˎc 9::mLQKۿ:*& /B_o785iatH]f% &|g`X΋g&]F$,>\xr9V/O";#U+r$qA( BlJژY<]{ytrQ?8]VV #PD| bw~Q-Ջ]r ^N'ͣ 3On? 0zZZћ=1˦nHc7n0a)`,|8_&XٻM c+A{ צR4Z01e8v ?G8V'|@Tzuz ~ϟgO:Df`\0. 
P(- rd^N?>{EUdtx9JK|: X30Wzi7L|4}FY%A4N@[]zr6f]SŶ隷9z·W9~oRmK_:&tpurF5Y|k7uN3(&a8;y)O<=gMc?Df ;)Ep87`oNJLeIZC@qRhsQ A>KNoΨ{lL%:D#q99!\΀P060nYM#GeA=Q;٫=G"'vFA؟CW%9ku0Д5]b +..QOQ;vƚ/eLû?l~_H8&O#?1$ŠD#obTڗLM7P @jj/P)WNrat*%H"9P9P\NXt^هn~rUTcA{|A@vʛK,b`ect4hl pp2Be B*D_Ȃor1ԜF)t,jS&t!W&sX+p(koYk\%IΌٜang7ιЍpdž_lf_fx/Nձp0D( %9(&BKYV A @< NF[AC#{'b4ִi}H f|8'm(&#fDq̈cF)&&. m>vDUj ,b PWb]3"V)_XMu<sLEx`J¤ *!:t3gsFMW%/%i'/9/ҘǼxǫT,dXaohR!2fRʘ%v}ȇ|ֱA,?)ۊbk75/wMEc>Q!`88bäDz-=6 &}wU:왻J~zi?粻=L{ &Ps*f y` -I!sU{ M^>oGm]CK_Z#hhWet(?gח(L:Ed%mY8k,Ye9Rb=LrbMLr ЎfG֔=ds.,7j-.݁ٸVlXEC )4Tl׎u}np8d]|I)C.ӵjHi,[7= zafl~Lu.=Vv|r|_DmaގPP7ˆb/}g@;`\c P (AН)|JՎ;ܴGBĶԊ^ߴ5d 'y+*JFAPUԵ dk3;6:҈o{~r?fyxfqr{鷿Oߞ=^Wd9ׁj y)*M1j X.:͆VT]#X2#/'bc!LX1uƸ-KPդzWw3gs߈@OOON6?yO7sE=z^121KougT<,[M~*K8hyB0SD!/H!A!ÞSHw2 {A!ST-+jU T'zET({%e2֭s1?o#AeCYc$zd-OZ4eA`rn3'gT\{o$eJ?yy h}\krvO۲Xoa{GƺaUX&Y1#^^ uǫNfy-YhK{4p*ϹRwIV a>h'Hl+@+Cs!ztY[%aF%k 5IW+('q-)\FB޿=ޯ.Jax!hR&".\yUg(l)l V} PO9C(Šun iA'dwl$FQBbQeМL f @]^KTE Gt|"r`8בt~uPԫeNQ[F#J9ݝ˴bâԃwjF?@q@4v05wUF nsy{@ahtug4E5]T4f[\&%-?2B>5Jg3"w.Q J_Vܷ^t͒#ԧY^4uU=ÛՍWyi{y;1: (N(](M &aw^m1Wcмbζo1Y-R1*[F1Z8 cIBe̹d\+h[Cu(|hV+.+(CQ:zMM.:z^4vMC..`L+'-vF ^dVHŒX*s%S2Wuwhw\c}9Ic{+vyh5QJXYrQ|\BrƷ>11sɣ ۣ&=XkV}Bpj3LӁv?˻闇Lf:h<\#>_a]k%Fd%K3e^Y$$)|g93?c>[b&ߗk\D>hɋ<7ʱ"tyɼ.+,c2Tax^N^->bdq#uE,d;g'E})?0/G?rG*io"^F/_<7Nh|a'9k.08B ^6+u FbAaXvY{<ŃmY Q^ YGU9e .8mFVD}{nl f$EEi_{{e1/2n.7 rVW/' ]u/#`v.vW_~j{6r{uIV(J{kc;kqűW]3hqiW7ųlI".[@zqK'2Ȕ,F4 xNsfA?{ѳ,h-\9gˈ>yuXȵYsFW P{&f7%??N7Q;!N_+[{62ѳ>U+qoJ]k+Җ6!0S=} H/nt4C@ Ku`[-Z ͱ<ϮlaOCD gTőŌ ۣ*l5&5Z!۪*rձXC "#Bt}WڦEcdKVչVpɨ&DL(7CK71b7s6cܺ4f}B(lK{͆Vloh8֗fW1w1hfwo`35%yl〵!dQ{w`>5y2= ٖSbAqf5{͜I[sb5۠-־'dRm>(vq|o䜉??;?mS&7ZS#RΔE%39G2YIdBD*M,//xL cbAbp_E΄A R5. ,0āS,ľ'-b ̸Ve$8f3|g͂ooL|z9TO_˭erqeKk,MJyM#TV OzjRQ(8BXLz[%if ($-I hnQR{` -]G& fvGqrb{`d'#^%0k-ΞDO&PuuT- _ĚR4IYI&DMUVX 2eldr4>_l[XO<>{*d X,wq4)4z?z"^lN؂PO05$8ʊs{f8CMRTk1 }>ΩSzsLn}\}tX;NpѸѫ_z;{Bc_ =}:Dը?_^kv?\ d}M;oG߿/Co1<_ZW0z0 [j$G.)R&[Z`2o&~(ezRҩQꔖf~X_ 8=;R_ G.T+r6nRǷ?Thv.' ٨a3m)ćےouH._0>v߷?t5ྑNxv^ N.9.إ_~<jdZh#*\?5odWJ_ɅDFsM  i[]mk^.y(UTv;KZ=v.O긔=۸%eʴ<߼b5@lnA,{)a.W9.l&jioU/Հ+ N c}|Z>2ZwHr!nY|zRq;umrꌫi3Vۙ.Au%>չYr*?-N{X 'Qo{[ÿq+*_O,ɵTiӳ`-6Uޱ+ gMnNfq46m /l†/â ]\ {^˵tgdy/l"/XOtj{nkÂwsGx 'Nl8}g霽 2b!z 3:<ӔMU lQlR=me ѶQN{JZv<3˄ꉭL򿽕I3a3H<_^7IG*7tTwo[,1/b}ɦ-9:}FWIoGXU<7I?uJ}6߸6.6~Co@73'X}޻$Vdjl#wu/ֽfkhfc[{m{ p*"6cR?pGFZȰ;g[;TT"T"XkDi>9Jtckt#h}j1ZA^:[_rxwG;Ж3lc{ۄ2~ҙxOyS$S{r{dbAbuqzOxѱI-)N[RF),<9hua}7<ǵfkTA.o!7^~#}ukvnqKU}~ n".HvV1׫#e" !Ց)Z~ʗzԬ )7Uu`:IMMԄ1"&ZXf cWIx30fEEN;#M1E{) wgpGK8F}z'wZhcVB\G҆^6' 4IUxPIɜ; F {L.\QeuO }2ޡW"NGi?Ä{z] (yd~jg'MT*V8鑬cgdɘ볉5>(97'MUEzr7NTRZ<'JIIbE1\2':!x&} ̇%Jѻ)$6)hK 1¯PZ)DN^D6ZZ壝XrAVCI0Z4XL!ӛ"WKr5j&E `-iЮltF*M2U _.J]$E˳ǚڈ.tt%O@!ـ 5Z0+3\i!/ +ae2Ei^uE`!ThU%`}%EVMA[Et&J2؎~Bނn]%0ZuIBvp ԁ` (-f)c &˽e;VAa8(%ͱk?m,̚$fj2RAٕ`?Aj880"㬪p*y`* BȲ$@3A6 ?Ft26҉Bg s|Anw+it֞Ew64IxFDj3+j:M y/GѠJw a:J؀$`>OAK0A/lmRi6A =ZyaKP>]Ow؊PD4cҀ'7$3`Qg qRRIP(EcDMnƂGtvy\:\GJFdU8Fтۦi.,4R7k֪Tg $u20Kf*msOH:'/{i:?T赫MϩHVk LW F[ou$k 6VP8k* HFWP F̀zreR^HO30%7aFF,sBn7gSMh8]"/ ciVMZKE`$9D. Z&J#:ouޮhx蝩yd]pB; P GR/ZM4Ӌ=``䋐_"kpQDz-uZA^]C0 Xo iޥ9JTj\M8C7DzW@"d O] Dz{(J #+JX @G#Y J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@zJ V:0J XWY @ }z./OƳ8*-k>Hpx_9{cH0.zmI% qYmOQ_7X}-,ܢלfqq~v_<^ᤣ^6nM']IV׼V?᧍iqߨ7lt'@_,9/jjaY Y1Ȗ^el,ccX26el,ccX26el,ccX26el,ccX26el,ccX26el,ccX|elN Ik`?dl쓗9Z2xBTb%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V=_%Pc6t`@!:VjV=G%`@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X Jxi}իwKZjJ׷.zPN/XZՠHD9ˆ̍PK}io|Y p; (]`e+2יCIWdmO=]J'=teAPm[]!6G. 
lg(j|He͆Rmű;vbkQeDɗDg[8-*dzt,6+,[tE۞S,ZՓ+MD|WX  pik**l쎮NWzê=DTt`%j 0tJzdt+}S8m]ͨ=tR h9v (UGWOUqk*-t*yt(1B]=A &UK[CWrtP69)U̪~6Y5GY0A@dz?&W~ 4OգpxԷENO΢*a1x-i">ibϰ!860Z  s9+nbEe^t7I"Dz}v2_~5)KfFVM00G(~e_~\n[T;G2rrz􋓺1m~.Yb+$73g(<<}TZɟֲ'S ZƷɧ%:ױ쾇?FdX^e PO?)¹HkWw)W^bF;_Uv-4}mgfQ+<OX؆pQP&Xwq8L,:o`(w/Tf/fFAXgMhztL7<8F2 >#__; ,02d.ɿۚ :u%G(sh` ͇,/{":8Mni$~@^?XQ\ɖX2p 2~f ]4ki)³[(kP|!̷Q?oN"oӌ76]5&Gun) 'HR@jk8 nRg cr w5jC7cY3tD) [oIW;3,2"T{V pjg В^ag*$L=Af=܇^?c鎮 ] n]`Uk*-th >z (IGWO$~HW.mc[$J GWlϪWUvBGWBWZ0*u?}U[#P / (BWaE&UQ }WV-l Cvh>%C }K&";+q[*}h-;]JEQGWOU,tk*U-th)VNW]=A QZDWC/dm+@ˑ F#JSA=E4MaX) H c7)eUeIf"M0\EZށP.ۥV=!:+~] =fۡ0m+]Ꭾ[ U*V5tp9m=]ut`e AZCWj ]ZU@iGWO(fI]`v h9:v (e7|t%`큮a[W/oՄRa*KHtae 0%Yaװ"){6#!{=z^k|3o*Q055hn٠*h2-;OE0Ɍm6M*zw2[ߍBS0kRf?doN14K%aG4.kؗ.)'? sa`$fZ8z08v2gy=?A̓ M>{[^],eRPX t%K c6 KLR8&B a  4JwU{)QH% k]0nMh̘$Ei 5uؤhg78w+RT|LOKrc~33m>[SJy1|A_63(º#<|8.R[YAVt!>GUgӠVKtC2!t\^=gV,z>'KB2uhﭘ^A̍ak⛾]Ǵ͗Seym7g+79-Zܿ4ٟ]|@Wcen@J$Sk`F!jVKA56tJ(h,Ny}g1ekڨ!xiאېVy:=pl{rGy2?O=.wa(rIh.|H8)\|G$:]C)w՟.Un[/ὺOTW.£]_GaKGL QFg_%^1wC0OGZ$\Z$_WuYuP }{QfsgSXΏq~ДDRou|%G1 $ FwAѳ$/˚$0n>@.byCGU0+_7?/OƗ8[F f`̗~.z^ GZd\<:4٬p>{ZRFSb|yJW!+IKihgyQPCOgiA4WaVJT\ꪼR4ZyW?m#!a}>*wO 4 xc)&hMl*~Vlx \<zo.0Qߝ_9:_A V WSW%rѷ3+sFɗ{/la%X"9L"7r5; Q,sy,@wp+y +iHo4Ul>s]&}fyE/,]{ACO OT1c#MMvQ9iaH4tT>[rl n&ƾnoV:G }Nӧ=S驴(KC ׌X)PkO+ĚVWӤ=wdʄq8F#,;PJr1rBðI:APҦ1q 5+Lq1i"nj/PO;~k[lcmճ˕bv(:{Xqߩ_Zπ\0t얇fy(`m\x`gjq_Q _@t~۪ݻݝ3)OّTˉe9mKNT9rwj !taܐC[IzDYc Qqe*Idv]e_;# pʤD!bM+ZL2T4cL9]j:kgez4&oYM9#A1I^C\ )c0\k *D2 1hSa2Ygz]΁ɕPP|YN`)?/++웨 ]7ӯ~W::]غU l XN0'3tT1-!Ug!3z QZmpƌ.)k \v@UToMݞV i]}a ^)vvސEZe%ŇE>.M'Nj'#%8cbDbkmbI3%>(6 X- t,8շd7 ٣"HXܦ!+P.P)@df*]8=v: /^vQm0M'AnY\H";WQ*i^#HA;rA@ C ¢ ʾ5Y^#-:R,QgMpInq#ҏzDqG)LČ*6[ j;ڶVf(kuR>Myv`u:`j3u9 6Zsv߽d$~qqQBx:i]b ~1~q7i gP[öZbZs0J\P6*Gx~q_a7xy:Y807~SܘHُYc+bl^8j`ǷGe`8<fac==I!׾CM(WMii']\:g_CK"f;?\9/ׯmb#A=54 Pf6iﻅ7lRF36fzkRk_k^'Tgkgc@IKM5AsPAȨa9W;Re.5 Vw'XEQ[[C!+Ȕj"D"CpcLU5iW,NpcUvNƳ۵_~/wׁ ] 259i~:{DR mv+RmNV+'cH\vi2"jA\M(zZI[ 貈kb*fI`ޒUͪnkqYӹ5rbW apr銇Usg˱4_go:PHx]LTN+`HK6Ra); B9),%eCxJdF/(r,$1ik)sֹZ禍Ec$(*)ډ Z d+SɥfO@>:X8p練8T[}nS9o&$UtmPCv c݋:0 cUTwČxj0ogsv6lvץ魛S =@uSe3ZmVmF ޺}G^o}?HAW0X5*D.H>8d挵)uuJ2iH4Br,Ҋ 9}ulRqC毫յs(f/ÅCдPf*p9U.UWu Rcrq6;͠BO!ŮTs)XX.ZGƉ!YCE$lmğP׺=cnֱ)2W/ATgo.-t_l[(Uβ-#w}$ھg,^g[oXV:ԚU5vz 9nV]Qtu?f4D5]:ˮD\r 9%V\GsUc 6sI8] EZY^gR;i8英F'aQm?oo >)2ymY=ӻm:W9^V.D+(N&$3/I(@094 [DZyl.o)[y A6F AF l˜!&k p15 %һb7qn,%-e,9lU7MjE Er2TD'X+5ؔohD!ԋ m%/3+rSX)k3#!(*T  v~IygZ*ư0ec{*v:Ɯn0omh %{ hc{dcsmqm_Ӷ:g2xbи8ˁA 9Vnn PC̃s/j@1@"! A``XƲ؝.|-YdUeYATW(m)`t(r:#ݖlѷNDV` R"+M{܊{Ŗ /%SԻv/qn_ޠ=J?=ax d,# |k~vVj%m8GBG&hBSBS7C$Қ.d MM&!8ܕ9ٵGޤ\+pp@1!YCѕˍ\q\,I-sQHK, 6j.r'lR%Jm.d1gKήy[ c>E"'|л=Mo+BK1UQV Z*o+TՁ5cZ[X`kYYǖK.vc+2q9摖c*Ū8P #O% u޻,rL> QP T `PjzF3ViY񩦬]DcK rT"er)Vicl#qQչԂp٨IIn2n:cnƈ5V!٩En-śӗq<e*E/\#|$CTJ,_`3jY〕Pr{nx鱌 6a|m1Z;E%g2o\mQwB1jhRn{舍E΅u7obHk ߼}H E[3L$)bhkg9QTNhUɛ#4]^FJD Z'`&xո Q`('YE91V%V;K%z %ې]/wEݖw7f"8nxTVcoXkbɵ(m5lR3 3*_JxNg/z<5!թ 8?'#\H\Л*t8Bv긬,W3O%wI^bqFkfy#5@ɏf)[-dٿgr+?B?6sw1گv>{cdޒv':;9=mRGx> O~^iO^F|oƓ0y꽮9{V'?}-.YԲ/*/xr|\4l0]bx] }|{H3wmH_zn\_=s[*_<;;\Ye;Y63ӹ3D6]g{8W!;rCɞ8SBR}!)J֐Ԕ(m5驮ULRP2,FKX/#*/K_zcnvdr7R:   L O7[9MgO9ܮMsbfgr{&fIX(ZuLͽclKI p>{& ՁK מ Z+T-a]Y&fvM0eμS֝PDT}=D_QG XTV Zt,Y%(Y$!* M eU==EL//2\@?o2[-)i 6X4H\Z/I@{I$x jfUzC:if085R/\d7T/'ƒU8{C+R~l<' f$l03? 
LT%X1B f#cb0W \1+s`b0W \1_;FV W1+s`b0W bU1Q8)^1Y1+s`UՊ\1+s b0W P1+s`J\1+s`b0W \1+s`b0W \1+s`b0W \1+s`b0W \^`b0׌[2-cȾDlt@іQ GDjRIoaa)c­h3x3.yw֟/UMP.jKcd"DAhDA$F2QM2$وj D >*RTN[G(DJR0Hp:=1BdhaU9U0t'[ϔ >il1].zDܴxy%ydX5XRk(c$r ;VըS$k9;~ԕr8+EJi[5P愢m[F8{"yD]H fۮ&ơxY> /Lk$Q@Mr"/#a^""Atr.Y%Ƶ/.âefv9׾ƒOP Ra86V2ceeA2gDPcPF%'kBvW ¸0B{}VnpUwӸ-rWGKF/BƷ 'Tȵ Dj3ƃqWF SVk-rI@)cIz!bVˠ) B:]D!,PG"|4X =h ڨ<QJO5_+P 0nR0zbOxc>- &=YlZbx&G,=^ۭFvDCN׎.Fxu?Poլڪ`IWVGQ:SZ3i(h m 'i$(K,*pJ&fZуR!D"*m7u;c*Wy)#gG['_B]|zy>A/|ux[뉫vt)Mm>V_gV얮]Om[tk "7A|dsMlĨܺZL-Sw6H:n/zj\"5tZ0L6=n;dvs_\>l2YskOb/z\{k^`y<&Ú![͟6犮U/]H<ǝ t?CLc~Тx,ל?U}#^yf.NEƟ'ʴVd8E s1#O jvYaT{h)S#eyR^gq7Q /UB6u=}xhzZoet.aH>lXbǬRmԠOx9E-t<`qAD rp ~t;`aǫBơ;0 Yk8YDFHB̧6jIMbIl 0'2du8>伕=*2Kw.cJݧ6ZZ:odE`pב-t崽69+5<6$NaiD\]{s2?ՒĨ$qUdA ASSp^+%"'.*Bg_ ԛi6qss`ZKІ6PL[[] Z|P@#w7LaQRg89}\rpZA%ٷ˿3߼y=?;ܠ 7hݽfGaV8 G +FoW+U\{EA<]sw}{㛋»ubQfE̙<䴛[nvQ6 OޝjrG0~+iIG:CfY>@^72jҕ;d<\Njp(%2yC+WβgkÉ4sIdkzDN *uZ O{_݇c񻿽;u30pzR"4a%'b: ¢މQw[!?wZRCxˡ -K>bwז|q~sܬڬ g!u)eJ["qfb |6FrCIk5p4?1F "ɳTE,pMjԖj(=a)]vq>dtI j (er]!(P; rd4yNT'Y7u㝗Fz+ zUa*vabGXd!_-+ĽEOH7YXS չ:Fq4[iWj\Mۚ$j܍h4}f9smմ6iZlnwKf53?\Th :e# Mx"3jY_#)8a)nz_ZʧjQb;s Q4i$R)֊8zeKQMKCq@_^}סzu U18qk"V3ńp{BDTe*-DT H6L"8hqh%s*/ךP@E)r/T89&D!x؋==ZI }#,C;VJ.*s+xڤ\#04%Ж\GrܒVDWXT2*SF#n&&N&cEѤ$R)\B!UZ2#gtbgW²PVYWYx ~~5׷{v9 IO~נqh8p#qTDl=1x BA>Xm$.ӨҡǍŮ1< l칈Cx8b+P(Q]:mGV&,RD6kJ]b8"qǡHm]v`7r5YM]F(U:T,Hn"UGEBQch7R4:[" c]"~:|lY|Ÿ>r1rVX&'ZQ 87Aϵ D.X^Βh}CŸUw{a+a>cUnS\~|G|Bqw#X'.w7ryBqw ݍJ.PC8V K: }k/3OaU}ޚEk@PӠK4o<ZuX4Taں@Ԯf]hP*||~8/a^. yTJE,2 H%EX%ʬbN0p5@s;x_MȡxK=EvF,+P2mHW!%،/ ^XL،eR( 5%#Qns6OW[xU*SI%FM(:yK_I9'nAO7s/6/rhF\rF> kj~*`x}EƦlᅬ=pF`VUǐ7^'ig(Ҳ([Z7.M/@#C@˞@8Plv:drv䖾Owk-l^AzB7*rzuͯ7E(5CIV<1 K!L+ TMozW֍42PJ;mDӔXok/>PkX{CO ɲJ!EN9EՁUS]ZݙϷ;//esm_O<wLӿ?#ˁ"Oyd٢XeX|8i :m!e!x#ݿ*7 m1|%BBHP!R躦0DvM#%SPZXRu0"4Wi5>ŰMP>JmQ"`T&Gvm`Qb]b>`\N0sc;uzOq^~[7ʞݾX_~]nMf51ubFuUJ!ع:7+:ke@aUq^r!6>Ȣ,yF "n`td+C8mʫ?3 GJ N60Ju`uZ6QJ#yjzQRe}\ȗ<r݊4rf·tcݳ VguwrY5{m[o:Oez/S^Ɔ"Ũ (FWF 7hdhKWNѶ\MQPyIއPCʚҕ*ԑ5dԎ|ouW4fnw6}&P&FquEMpCԮތYue]]EO*b~V_HZqȗf 4%@SūƾAk7J4/exc*]cEذOӵ"尛&XWT1Pب#@֥kjMCEi^Hkt˜-[>i)3?ٹG&_x?`ݢE?zz 9" VsJxD>}H4Z XS?x\J%Ĉĕ6ZaN"lpr< W+V 0qlѝ;}996墝:| ~wUB b*0F =&?7|hI~)*u tֹo?ܠ0Ի\Nڂ'aͨGf</g d5>I}2~tL~AZՅ1,5 є\_{$s,TiCfrfz~N.WX7/N?ݳ#UYUZ;Eg&{<`9`>*j!yU?c+|FpXn>Τ>5<WRW+k Ve+H68JGpW  Ud`Ƨ+V! KЕWI#WQ݋pEg*W$?*qJoYX 88z f豫~rQ>jTv#Z9=,XXJp#W P{RXW,U."Rq*quhtv 5<5p @8޵= ,#L`BI6fCO1S :Y#Ӡ8V+bUu`x¬2ž|*l|q ܸwܰͶ]Zpֺ, x[ MEޏNx '=dAd`0ecSQ7.5H$;UӅ Zk|Tus'zWwVl1(%J!.$V!Nb7 6:o .\ZVʂCĕzc TpErƻbw*S7j/Bɻ"YrW<5<W2#F\WNG4vEW^J!1\Z+S1*w\UBWßGуKq/a 摒J+Db `C^Jg+>\\Pպq*quRJ )3 62\ܡS2u\J?zW+ Xz*:XW+uz`"婐;FIo坰a,\0jѧiR1Lk瓂-rIq/J>7^kr X0lZ;t^jJ~Uj?Zmi~j+W,E."R`bq\94?Fg˅l+V"u\J+䈫qElHzaUW+ʜv`wErƻb&+Vy/Wf˪7 URɅIS;1*TsqG#?^wU (fWeѤ?o2V*dUώ>|K͗[΃Kʌ8Kl8rυ9*9{Uơ W,]6"^\pjNWRW+me *j[*+V1=H\!g,Jx N:Ux#"پt/;[PۛUTVo6HO1(߿|q9.|U6wK/2$Ӽxq4~NRNɛPܺGn|]$1[.okў8HYr^ߛ|$얠N'׺ZN>*ѝ8ZCWwNtH"sj_WZ+:ceGTBmՖ<4Ø3ӚU/Nlг5O@i~蜇Nκg1 Ĝ⚂QSǓW촳ncn5]uC;tU3.ד}miNL"^%6tE fL z`Pc7ԸA*K}Ӌ>:3ʽԫ-M}5|F#S$؃fd 22jM+H!LuI F H. Xm~Ĉĕu f+lW$W֋WUJ#WmF"h Y.fR,SUU+tW$u>5ֹq*CĕG6v˪w+ Z:X%W_ Uoy+Ï]l>5\^jȻR%+WvC^R&#\`}6b&\Z%ةɅŁKR21\r#ZR}LW$xv+M.bF+Vi03 v"\rX^X#WZ)!#\`md6bV+V0u\J#W^NĶBYSA oڼ6mEپܠI`yl/~)>rث(LZd&Ji$OO P&|RUI-$qv>^թ:(TZfk9|R-FygrRH>`*JasRXM~e1J9(jW,T!(cRT!i:P$81` LձZU`^p R8`;x~rM66YU1b+RkJWRW+r$]`Wv Te^p +>#\`P2\\wjm D7qeܸaVm>?|E?[_Mڿ&cMVU|z]˳zqrˁdrݺwߑ6Ƴp>m^<偊_>\{['lqyyWB~T.]Io,V5aY틣?^xAMfr9 C)bt]e40rQ4??WwkȖŭ><;꭛O?~w=m?͟s 2 9MU'P$IߧדY׶r:uY%:]( Bny3}<}mrA`*NJG-j V}&!o{l U=W7\.ggi5VncV#oW;Wx(PW)D+Y?~ձkW״MCG]DTkk-tSEPVxEȇ:HRT(D4R i ,\a"? 
GE;Zob)#т9HRU |_YRRˋ+)ʘb3M/^Oggd^nqiwlj'',ZS/) %a90aԋH2 gg<:[ٻFn$W6Eh 8&K"l0sX<b%K3j[۶<3-jųrS &䲼R[u~xmœ ~|$&T%Ǹd3Z6 Gn6K~ts4kߔor@=+ %+֢^7P &ҋҸ6q_'.,)r|ne^դW#xaIG]2mU3tmG4Woo¼h~lԩ{}pFgmiq77lKa4[!ԅ9wq ߝ.V۶]3QWxY|ͩw_cm9яs=lDk6|ql;4cOj,XfxtlmqZs\l˵u,iZɴq9#ʃA6D`w$OHl8^\)uǿ~+?o\7C_)pR/5ߌY7o&#Jwpf-\ٰq#TjUH$0 Ͳ1$yY*koD%ke%_e[j7B8~D.\GYާ1@k;5&آBIڑk\AwtN]Ә[=jӝ@;z覲UmW~ÕgVzo֎zjߤ{ݱi|wpꂢqN"rĸYd@Ч୍a1@ȨU"Σo]-V[/Ӵ&8L.G|&e.i<}|0x OS#(H~Lp^:>O>[\)6{a; V7tay^1iώ7'zɅA@.zUx?m? mADf6J?_.?\$+ں_k$uZH喉fm0_aT|;gZMů,FUj|g,&Y@_5h*y=3e=ц!8!h- OroY#ew!:%wb^/I'>K*XMJiS$%(e2mJ} {hM2GwV:bx;6NvVwBoPF[u&6QK!H7LWX.i{uXQmH*ϊO19xHi!9ګa:ޕ}ՔhiV$͡,tO\YXg%],}=MвtMk_/Ý޽LO&\'>-Ķ\>ҪļqLzp/P`DrF mfq gFY33^';7SdhrDX9J#, I*"1gp&E͵t+p@kd8#cw\3,g싅c,T=>)JT6,x{U4[5h*7M>gY9F8ON$@`cX6r,!JGO:El25C{.R ) lJmAHt)LڎQYV2 ;FΊs?b8ICAzǡQgwx4qPQε(MJZ%8dx+OR 92šX" c*}1d;+xX3c: bWc_DĎ{DNx0 X'|p)pE$8\iiY `JgSDӆƔN $yD.8t#72a*!;FΊs?"~8M=j!:;%bYqŻz<3*1T|p*hL ,Sag6@*lf7y:ȝB2oQm5 7q'QmpSG87.E#Խ N2=v4ۣҠ @5^'2kQ@oKvMqN*X~8^[=ΆW㔹ݼ&~g任VRoƃUIT YQP2ZN=*|*eU|JUKq%"ĵQf+"ArJ;!I0g |v 1Oi RNTCԻP]߼hޟC3˨5!e2)FKIg.Dr(t`\f;<~oaء*ӗKeG &Z{ FHTtɬ/\.l(/T=USO]Wj6w[ mfBtqO3=iʡ뀃-Ռ[ϘJ2x/5W^q'9❎`B=zgukkwIdUb7Ft4r`]\wVGVd2m1W^zzk?ٮ))7! +eYCq8czs\હc Y$E_q*_+~h3>O!!́KHup4!!g2%.JǤəEM IdF)b.;Yf Pw 9J#TYhr,Y;p*iܵ'wWarˇD9~su0\Hkw 8 ge^(mNυ=7U]Xu7$0uu՗᫊s}O'Wjma^s$du1r$[ hHyDiuJA hWes\*2 =}voÈ #F.36Et(<Q EHLi˳1Ok!Z(k-lX!ksk8[AQ2eV >kfy̤Q(]JŰ hԡQ{㻍. dC`b˙=rbF EI 1DRe 8jNlXg0XyXo{j{"tZJ"Q75K-Qw-v}!*>re|Py@]<:'TErf+P%&a/`?>Xn6()HS4/ QFgyFtDH!XƵ֢ᛦ}'˵9iczbC9Ѯie;={!GX'z/xh U3aS* 1 oQG׬{;&x6bn#L2 Bl,G,sxBQl.No>KM-0Z<:,v#~7!tzw{g&#j*@@29m++*!V , j{o:7sVΔ,rI:ƠL}68#x=5M).T.HRdY)o6PA=F7]Uq爱D^GCbrXfĀynũ})a*u[MS̗BjB"mTZCRJՇhCiƴژTO)xf.ơĂ*TJ&5Vb(Q%RoBR16Av/kk'tCpBsF edYNetri=p1gr. ʚ;iMWO骞3,h^Q/a[U2ﯞqLdc(#!CcĻBdlGBZ:$Pc2pѢC ` >Y}g &]K:q-{؎L ]-W3b:+ɵ8;NPi)] ^gVNZGZ$5Z1ȸ_ Q5(e:%À&|<d:?=BO<&kmFEO@E2Iv6b&6i|Blɰ4dn]KrhKh"YSb1#Vx|!s!*ܡHB  JMCt?yf8i|֡Z{p8^=x x1%LtU:H+tVexL"ND|Jp:~a -zdvW1ph`hQEC{v@4l"KC(&Fw/VsdR쵭 "jH* E1IE^墐P|(E5W⓬:`l-z{x:?z_i<'hè1ףΏ~y=كi/]JMOB<~y^Fb夌fu-מ<mƗP3'iuG' |EKw:l:6O7yh#hIt\&wLвvJIڔ݄wcм뢹'Z 캸7 Y (4""D}A?q ΐ3V.yU+"!^D*ꀖ!3H~gILh{n/@<1:(Srً@d!d jM촶\|5UJo8079_\=R 7Bbn:=G@yĎpu .qC) re[^|>^Tx=ś(Lך4&RzjU ӡh? f'ܻj{{=TU‹2d},W4.F񡚜Re`H~CYPD^ZHE ()ja&VfLgdEɁ`l;[Qv-(dt%.3DW_Bw]nU;e-H@0baFVNYR.0Z:[RٸbUZeYeYnp)k>H/8pERTc3Axo H$.Tc KHٷ[h|\u11XMHZN2Ul1l2)a$M#;>oYk._jQ;Ap3=ٕl+cKV;}sQj-S!v89hkw=lɉR2^D*쌨NlLαƔ\KZAaK`L1jBI,dU')]ɕ\ 6I<&JG#ggrŞƶ[~2Yjת=/oOv}NWl~Sq8\-]/].>ǚ6N#OEF,6 =w 2?ٻNFEvRw6?:un׻kz?GCW=g;[}ݒwٴ20f[:.OgW>صQƿbWMw.=\~Y~C/ Nk^j氯{n|so/&F0s¾NMxJěZ'lL),$ϯZ1<8cL6![CVU-8xkƙHEAh췊{bҽG[OKJ?n6k#y` O}^o޷\Jo:xz?uzlG'{{5ʤΤ~=kLrݎ_ߧӞCMnJG̷ݲ彣sUޞ |d{b5_qP[e荝;M| /xܵEc8`^|lNxMnD)&5?"pPPS%Of>Yš5)*mD+t LruAPm,s6/ݖJYETkSK HWM^Y, DJ_y煮CW'ےgpBXW|GMϔ3AOƼ7&o^L_J 7_`H9@ŨFW/E]cbWE /I]^ R+b T{PW_C Zj›E qiw]ϢW14. R(}uAmMnCIآk5!. %`M+|vǓbvN[1݆v}6]a9x]O&&ELFxp_\H_a)h}no.vq{Օ-F3+VjӞRalضe=#"i0-=r0Ŧ}n6gRnYGܶѺl3J2T)GU;+,*IqZ$I;tYFmR+.ۓI?OTj]r-jd9"/'\kS5,DՕ`jxBZOڙmk#mhÊ dZ[=JA$֢%PƬE))VUazQ\}NRdK+~6Rk"ㆤv3ԒA2rR^VH))Zd-eoei" QCL4=ށP0cRN1.`ntʪ֒4djN1 wG%?-3.=CI !ScwP2&Z&UBE%SX*he[H(eBCVRjwY` = >8p! [f]@ y>I|sȂ<,U#Iey..^b &`ЙBV`=ua]Jv;@ðJD{-πVAKָ`)$"XZPT` :;9%`*>vNs? L$jI\,PxպJJTJC5w~WW$MIj%YrgUԯ76k0E%#`ѕJQCҼ7S򐃱s X|A \R>M;URkCIf0LtXFa8/ ӗTd%&[f@ m=㣻WES ?Ϥ8W!wVQ[ax%gN6ؽ7w6lbP?N|V*w!w-mXW`v|?4vvilc`dBdcIdsXzXN,DzHվ ĥyH^KS`TGnP۽,CUBpn! 
UpkcwL0 qtW\!u *Nz>Xty[k#-M4b\p)2#Sm`K,V+I}r["nns5oX`kvV+0,!p"$Ulݤ7et=ŧ "jTRpP"Fg'p]# WXfIρ$DIM:*r0SN,mi̭w׸tX ONq^s)I(I0um \7h!`3 0ݴ#X5Y:Z35DQZCIlb@p8&XhzPm A|no :.r93ɖ%LHkjҸOkY˷~\([5B%āUYkDڠ@)v {`)1#aв π As!֞)AXk0#JxSdB)8 8˒R.NI%@% "`~;Xf6[^E%nT]| [vXkfAޤu I a;u!Ē^q)@*yp^QtEBީ-  a^%ٸ\֓l#BGD^lU-` q#i/7o5WaMVpsU4]Ҍ Jf0h_Uf]=w)ag@sxwؕ@@)#%CT)@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJS{%%Q; Z^ ;LJ @"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H UY D QC\%ZvJ t@Q d@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJBxu@lyw@%æ;qsJ J K%34"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R) z+^T6>o&/?9"'l>[.\ u.¥baZ^faɇ'Xj Cq꣣B-^z1 ~dpujx2 du-Bo>v_1gy,.О*{~pQ^5̳T-LrVJmzg MsI3~iVӏXX|כj ,:9_>G%cURʺRʇTiIQ(e5ed0#ףڳ\5W}ՈDm(+Q(fXgmK. UVa ^%cRh}R0ѕ(أRyR`bTIn]9+p ]骠ԊҕUJZ!`Mg+t}&P2CW}2 |z싮++Vpaj&AjGJ^ =NI!`Yg+tUZvtUP:Gt<3{e:CW]c+as) ]I4wUU: a骠DWyA$wxLi[#`jov*pUs[%ET585;L姟nJUTۆmFA+ ch9Sdw \oBWyRsHW*ͻDW;Ks_G\P]=H1֥tǻ*p8=B|G]=@rY';DW ]w Z/3ڻzt]yxj/p}g h ?dɂR]qry9c`wǨ% ?Vn(5;.;'s`uzj'c ] +o]u Z yá+鵰x_"tiwiGp o[v;U;Kv J,MXZa29wt4٭򫿔^o٭1j#rSÞ~0Go8;6z#_nvu;3p>f\8A2?O%ZܸZ>kx 勓V ǵXG.nai+?28Oy˽l6`Wߣo3̭4-wWzy[Pޤg~a5pcLzy2DCƛRHU>&o\JA gyX.ik*'x^W}Vʅ̀:‰ZrB$E)Y;U\p5lR[.LM,`6Z{`8(ʸ,_  ۸jf0 4]p*n_wzYgxKg_OFs{ʅ_t]ΧGA2VB?]JO> y1vn)|߇ɖaRbc]zYMsN /}VfF5N:XWB5f5:ښJdSœK7l8Og`O7m\L7FZ-Y'wE΋&B6R`_Gn m煗]|d4:ɪ=Yoci71a1ȴwJۆ[j#ʱ,J !?,3A̿Dƴ4}6)7KDK#dPܹĂ0cr $"4DDj}|q6 K'|$MgLRd%(R1Μa1ok,3/-ۋp<< (&y5jǥ^_3OMޫQZl+^# 5JMzo7#M`ޟ0:w2!=iϣϷ\V[lIA g"WSx׽Oʞ 4@9SGyK&(tt<瑯E9(XV}Cd,!3ojm#._{gpGt0Y )Vdzvǒ$G#CN叽ɲг߯~[2`p2;;?ټT!L:Dw'Ѵou\]Ӭ@xi(sXBśKWe_iywoq{}vO~j?x<./9 u{ض6FӚmՈ,tm51(`ŲOƣ^9<nhlk\몑jVəZ9/j5E?lv~WZyeܖB l墰xq/Oazo>᧧oOo?} >yXu刅]|Bh#a\^2)_~x>|WըoUt6U}zᷩ[zk7B|93=/Jo7HWk@I|@=@Z.m6Z?)dD5DAYo9K )P;+vcc[Ngq},}׸wmη YtǃguNNJyj.GdD4*3fǥ5oj^vy6?^.m6&l6[o5HۅvYxg13oo_I2%7mTr*_mgAo\W=a_0Zd&hb[fo>L&WyB%" g1YKٗ2U/IT"i9Uwɰgs6-F|f z4zeR, | -[s}D!8Zضt<\f^0W [|4P{0Y<{ӮiyNhM>k>3 _|qrm)G1!!L׸>e붹#z9:[wbZ]zuܣ/'UtTF>3x#lb}2%3[8JvVWmS+>뼌uէ4o$|Zso47{Wƭd kv21PU^6{Vnr݇ĕj|IhR&)Ɋ}CJ"e(Q42qN]ր88JтڕVZhm wo/]S_''ŤєW_WǒxB]=]j",-o6OI0'& IoxsT~Zzy706-y٣]X[[cmVb; WBZN9#y *&LWf?T@YBm񑇦e<QP3kxq&z_zKs^ڗ@ٽin N Z7,-% X`y烵aycS2N(UU': `bq{ǕJ\d9fN"?c0tպ@&/zbBzmI3.蠂!H[q PyϸI7M]EMY'J47مsP5/bBxHʓXQe*+jlWtXrm43ixKF}W=먤]Q>Y.۝-\2{;=Jږ;=Jןd߾S}o]:Ȥ- V$VhǜKYzQ Īр1b嘱YrA'm΋)B mkTmX;-c=RV]ml ` ^< 7wwiyxE<,_6hp<^/bgqPq%+Db<)@d-A:xPiVbKY0~{!E1@Hp)L܎Q_rI9C,XlugI\}դ/VV{@JjԲ4YPt˨CI KZ:c#\e 뤸0kc1x"LȐ+bA&K&&H!ʱ>fY R,be{X;aK qL0"֒]-"T0X"VI 0%hYA*4xXb%aMvǪg-lVLYD.vG@%!VB*V"Vvxqh=yR>:I.vlv vqI<:1 )!h(.}:T"+jo 8:]sžj{xsGg0a׹KrH5Az %ZdF=> n~|GͤT&:]*5v_*zޮP+衙k%$*k_|ɽP`~(as{VpfnSZofCCV1 %(bJXJYr1a:b&s@%L"3bfg boqXѕx%Kw+AL/~=jqtT *:a /t9 i)٧H%]@5nKs z /7C/J@?yS G~͒]gzs,sgKkxÓSF ZEʄ@iƒ7UdiĞN}Tea_unYoOzm<[-ּ۬,'N/h$u+Whgw1d %]t֧l@KFLDgIsrLR!$#*} V]2Ir`Vug߉=Mܕ3}¿S,y,_Ylc:lMq"0|Mf@! du7\(@.+| i{N!u(3C9hc2 "\4'F%%j](ezʱzBq%u: YBGE09lPqZD8՝,r r^ט iyneS_|?qpRօ7ZKX}c0v,7FÀW_ jLYM꭪5c$bB3jQ21;M 7Լ)F5".j/Տ$]@p@˳̺&<Q Ep@Hdz6E8a/K BNOxc[՝ٗɑy?{e+9gW1 QˠVnNnU&Ĩl"Vc){@8NJI0THzQq Wv%F]s+T(sю5Lrm6BT'w߄.4WP{_/E]YZ02pWV0lb;Q^D="BRe@x9Zp$0fy!FA= R;Uv$ 'e%B0+cfR(19&'MΞ^mp22%2``:eye!F\s hw1’9ZDt.O('[O*ʲJK'R 9:ZYֆ(xS$WSUS KQ7~>?D]t8C뢻=38^1su{tRV^va<1t^%ܨ\wMd&"KDRnv]|.Xe^! 
&vЄ 0eT Oһb"Ιke7Z:Vǧ(K{ޞ_&]ck}e4-]Y,/hMߎFߦAnm}5˹ur2^L:0Dz{gn%qQdӣttI/8Rۄ~e׻zqNkx-kXq)*꾞;+^f{:M]%wm}]K4Ի!wחpy6f|{_L7kQ eD[Xz!} !]=)lOV?EP~.BP=8mA,YIwB6뢓K#IsDBM >LVx{ymo℻co{`8K\e7"3 L],esem) QJ% 9#无VOC:כ\Wm nl؍̶wJ)8WM̗9J ɘcAW`1*и 5*G`TQ‚~`1C!ymO2pM&`H\b>I`l&Zl<(}U(BJA:Leֹ*t,p,'5ZwcX[>'yw yS\/ߟL d$ >JS+iKe jNi JfCeFrM %z9CvlށMx['ium !R Q Eht42bp\:8B̙(- ِVِEOZc|wrIOW?{\vyJLd"Y%HK2:I\;ЅȂ&I8XgEo `^0y0:{T֨,-DU '*{` _Z*LvUO:ۋf#ڬ~`2ƽai^87!G%O:M@;fR DvGq0=g<_L HY3K8?xA\Lt!j¤X(,S+, C{Zo Z3o^r?a@8XLD[m t\3Ǎf%Y2,#LP CelnjZJQVY7E> mԣ@QmfwwD'+F(締 ͼTZ|l6Ys)2H 9LA)]@$VpDj`wɆUm6`<,֭=w'Oo1& o?෷z0i_lўJm)1&iGi~(B{ggqwJ:)N 7*ghhD1ͅIi>:YF?͊tw%d:.?~8hid=vofyG+z Zk1==-R bGlXβ~_'J&>GtGIq9l^~Y:XE-9Iy^ǟh~iq?ӬQi~[!6P,?',WZPÿO!}~4J2(c86ҠidL?*%xBWl6鑯gsDri߮Źtu]1r61e<lRF+\ (3q\7%z`- omImdNN1}<]rU P+bߌV>Il|%+>zVY=h/ithE9:[gU_Hf%:'ߕ"=>re!jҟG,B*MtI^ğb14ӜGG_*|5Wkww<HtRp~tzr *'IwHysj_wm nCU%A'X`ЧD"zxJk [fݿWk3`3cK𮙘 7PˮU*rٞcɖ6?s0jz@moPr^3AS{eKcmV-cRM$o6tT &L;͚s DB {aKO 8LIS]rVQ !,$i: Egi`?92$y&5t<׀`<]ejbOշZ eNv˵Qnt:ґDXWp ?Ĥ}Bkz͙$Rs'ğ:JWH!Y "RFa]F8Z F! u!٬2(:oޝVc&yR(nl5DҰdmn%o ѰNtLaKm*=&[[W} n )Cx&$V얦]xifִrJ<=Z#G& B?)."t^>`XnZkgQ@ֹ]nv>jypuKOnp0w f2_\/aop&bcF?'z=5.yuM{$Z\q/knnZJu^ɼ*K@A%ا7,U5/?/NJOp坙EGig&OCXN xffe,U~ه 4j"xqVgsz,9;h=s=&VwǼ\0s,G&[;kHE>ג)E  9oEޣ.h%-/L }j5wXQ筁Q>5]LT <m_o"WR>6 >݄A6璸ҤPĤfo$|PXw`i&5a_ =-}F*uL'lEl^op\{ԒX-DzK轓ٍ[iH@k4$7}0Yo@~2ih 5(O+Lws(Rk؋9qf q"rQKDKB(4:rvy$JqTQ&GEZ)@EH!Mg'i cɇ\ vH^%k:QsFJ>j7NyX+Eo1o;%-c7~'\w7hNY]`ukUZsVЊgW bՓS%E ip%{]v4zJL"O`WcWzLE*5*դ- CSUBI:v!)P-- JƏ]JT;vv%Rb"v+vpjtq*dcWo]ib_:\Cv@"r].{I'a眞# _/UlT9 R~)Q{q3 iͱi,h Npiklv 8z%8TcoM3^гi_eTlb?s$8 )Lrn?eAh08hq}h\'JUd)0vRDU|M>F#򪭧r c߾bЬg-Q {z,0C=ɖK/Npz$QTTUt:jKHiX;l.(0kr :|m.7 ]nMyshLRbTyimz#frFA鵞#F!! QzX u'>L|"o:f)kkWI,$@j K1acC E8BE&$٪HXG"\JlB Oc*5(K)µmނvȳL6gGiNGb&j"%$=P f]Vշw ҭMז5^fZڴIl,9^pd0,yhFrNI!-b(rA0O!뷬9I%Z &){ZI^G*#%TD5f`jƝVc +"XAPVdb{ףXudj+{ź dDH<:6 Z1.#YA5 .mJ(J ,á#W!6mM/5BE, (*°R"]d8aI"2Q^HJld y'J˼H/K` arR  ܽz/U^V'ߠC6s?{)FgvB$\2%ufO_Iad 1=dd\<`&?'M.TO EC@LIr:aX2S:nNdnmt2&x{:՟o0 (+:mqkrTa5\KUU~( e0_E#&r1e#S0j`f/ߪ Nt~~6-_A*Φɩ)n$qA( Bj!J&CvoLYx23¯λ7MurY8[F2sk>߻).?Φ  кÞ_nn0bkY&} 3f1Aɢmoapzz]7VQ[:I ^#cḈpM,Y)Ά #> zvz ǿ.}]}뻫o{wipB֚q2q4znQ!q7qC|w9[Y>,+u4𙯿#'pUk'F<{b~}ZV^{eR]CuT]M]MyMxom[_2~:pJecspF+JRb*N㽷֐ =hF(Zp &D[ml/rσ%yRpdEΰ NhPE %]m Ƞ0UtM;i`g3wS =c@ɿןؖR\3Ծ;USNԵ]DJ6Ŵ8U^`ɛi\0ѕE텫]]ik҄CIQhkcN&L"2+ iVo~\]ɮg [V~y$WTMys0(jID֤Oh-N(Y-&kJ_~Ɨa9 ,/gnsJ!a2ڜ:,s&$ϭ.Ofd' wJx8;8{G.JȢCEe?r9kLh@qJ$`"!oRRM7P MK~^"\\E+n:H$XgA80wnɨa.*̇txDC>>8nYIAw 0GUΖ^R'+q z;gjxqV?5IWTޝ d+*R6A'` '-tT1▂VŐ !CeБ4j#S2ftLVPkoIkTz#cF~\v ݄%jLJ•FEUj!32hɁS ݊sv,+1gjM\D65ɧ1ZVYR0iuN"Ruek89?AЌ; ~3LKەԐ$DU&vy-M)Yqp )'ׅ5gA$jTr,]*^KՅfhs7uLsO1*`~UY9$ѶM~]\?OZ #= u\C5 NӔ_w՗VXECp$̣l.6E.ӵj2BT;zջ<9V]Xe{fU^ mo|arSx;k]ZmpeXL6A-ZzT~ h(H~;>۾Y`{);p<(u%(A؝)|8qoЊ"edDKJ=иߊ*9jkk`Gf=vb۟o;BJR@Clh.+6d1Rvi2";5A\M(1bDSl!㲷$AU=[qvG<xY {#o9zMݮ+&eDVPkXdnΨ"׫8 D9("ٹ\ϖfmY@,$ki;Ac4*Y\&1_ڇ㫲9vh)Bo:HdK{<,;Ykl.$̇f!3ve[a"@dۤ0r[~?:[0"[qY`5*H>8_Z]2V6*[# SS>|%h yc!Q|iCy1lOi<7CдU4r^EUWu ff4XSH}-uj}fPDkn}k5ޑI:$ D@S6[qv9뚝.iս`nqSmnW!T>\1#.u~re"Qj4[I>vuJ+^ޭx˻Ƨ]nui[)ȖwXN^{pQ=ń3jNPm()CjL Z 5Od5{ =xx F9@m:Wq -bZ2G9FA 6s$ 1Aԙ\ r:L ;#EF1U['dzaW#gOք>3\˻~N;̠8Niؑ:ѕڭÇoSrT+:B !YCTMiQV9(, 2fҮGB1j.\OŤJV1Kmf2)gKή`/R1,/z'^9¼=^}Ehަ*"/mĤE6\(ĚD0oݰ"h6w [/x48WXs @7|Xq<үm!:a .M*fr\fG2B2>Քz xn="$SHه|E '~[紱Q \jn *VeN:"b&%at݊#r[2B-~E@޾{1ͼޟGj}as|L(3)LlŵLl}#[)L#BKCDjRm8額s3.=XK8e 4}FW#Ƭv)<Lƃ zheʈQۘ@r͡bҚlkem㲶jptB츛ͻmռ?k˶Ѳ-DL -ZdPs"VV r5V圼Ih۲p9l2TH <7'Gse.F`mluT=aPC-Ki>s<_ zBτ(%x*/ަܐ" /JtW;LSzpt:ӣVTI XHx[>_}p:UT?&XXekmHec=R/p1 =@٠#dHʲ俟!98ZMOOuwUW2_'Yq֛,6}9PVD'dFH5{Iz"yy3CoMWBO/Y4gW"&ajޜ6ޔ1SZjQT)v9hU.BɽJVpjsI2:;_Pl߃OLӬlvL|n2'6-}.9Hg 0>Wf$js.Imv{o 
v[d5)Hq|9^}ɼӭf'"N/z_^QoE @Ђ) u*pCVc5NB *qEuDً҉3|rdY?et | P'Î"&,FɃv8R[JgW6rPL Rm/]*2Ԓ(YuxO~[Г՝ƒsW#{tno{D3;ƅGNJ _ϗ3< rvLqMn\ Ƙct{&r^]oB7cs!]} l nBa>2˖߆I.7-}6àl%E|Vi>Y4)Y|2 <1t9CՠXP[RmKnJϐ Jꍦ&üq~d"'Jќdsk,Q?p2}P( Eo](}BYC(Cܤ2(gFLE\ ;yѸYϵa<not{q-x~8ۙ~I<ηtMV@mɰƚϚn|[Y6#wnN'>*퐬v͟6'LEXx|8p0nxڧ85S LOB6@QKE/UZ.\"ͼ8g֨\4 Oޯ8i7R7N+zlTMŖ* }+yFd,tP[m&{r|\]N/qʩpq~USxKW||)˓\f3LUƽ0F) ?M Mk_ iV\dBO?)v)$H Ų|U*T4ɛcYWb[/ fa0׶f=$Vۊgڮ]l|{ZS{נ>׬<.^:2l۠ tȆ@ #5LNvm\pFN,9C\NPc/1\bJ Vk߷``dONsS^3~)np09sH[G@-1beP/ͰWs;-c1k+k kʓ>u,P ~)|[swfrjR[ T/A_WmRoJߴxOg$hܯQ+唦}5#ɼڂЎ VmZzbK) J{~_²(ehǼˆ6M֬S. $ [ŰQ0壑fm^p8͝t\{f.Vn'Nc Dက߇)J'ottn*uWME:⎷OO{ͻʺsGYp)Ǟ2 SV/OuNmskŞgޱ6_/cǵh!g.ʜtk)] x3w1+ܽ͸G n~[ -`a:T՜/WJ$ЫĕIu\Y8v• +@̝]rπ(SpR!f8%|[Nу.U9wIYM0L 'nw8.|AU1 JHIH*v*@-Gؓ^crR/\Z?HIi%bRejyENj9dyN;NQl\A>@˅qN߬%Ko+l}lr+*U.pOuBi/"F=۩};,u|e[YYY;p^=Vv;_VA$'Jk|2) W!q*D-9zTkಆ_]%OH] SQWښHUƳguzԕSRWu~*+DǮ(Y]u%% OF]%rɨD-'Ǯ⌮^RINH]%8uȕ'Jjy TիTWJ~^ EuUٯldƓĖ "*~QYy4%˕-~\gǍ2ӠrU 6γڰzxZ2}*{VQ[\sE&uwyS~i3v' fr*Oj#ޡr Х2:eb %&vzȳ+tFNˊ-smzDE[E[DI尹̗#j*(G:$ 2/Rd E}9` n %ŴÖli­Rii +ThªqLW^)ǝ6KuFSX<ѳ(Z9'RE41Fh'47sjBJu: !%Xs누h9|Aޠ0-BB U-`ﲀr (2o %YuLo FtO-B$t&̬ݷcOFٻQ;֧b2쳻Q?T#yVېJ5{ eYY>C.lmB~җkip;> 0G5v{h{ )4`FXL(Jˬ59|Uˢz2UW:B@saŒU^9CgzvI8`06Jbf9S0+#0en[*2_q&_v`B A*cf}δRпJG5MMH~t&<"R ,!'c 2F`LdJVTbނH̑F6QGCJ9b@ws cʽ3>faX:CP8v @;#g;1ANލf6.=l8 ebrwݷϿal0o A>^Ԋ`uWNUyolL{)c.k6"a|)Q3\+r9XEz7c>ql'ߺ{@XQ))vNEƩDi^iq8Hljİ %ak؁KY e<A;܁)򌲪gU%˟MDdU%QYϪdUJ 75 2'.|;8HkT V0ry>!0SK^{<Ń{}l(R YCQlC"ƼZr'1v<4Mԩ3`k +;ĬW:ac#A2 1lwpI842l;t&h*1Uӓ-6Cүf} v~Or ޟ5;w$5$Hez `HZ-UVh4[ Q ښsVD؋DH\q/z2fVcsD "iP 68tTtg^] ؾTT]K>$;j#m^7ܠY P 0J3]S` GPS;aS2@.^t#xFEO%ϗ&y$"F "G|g B9`9=3 Y(I(5,:A{,00<41^"hIp|b @+b ՘#G $C0! l$,j-kvW !)CۈRԵe7 i@dGRB<-*E^V/\3kCV逼ƌ!ޡr SX3d"ux^rϕ bb: AaɻA8b ס^E! 2qYASg) ~{W;a=.Oی/jB sNgKi<4;IAQ}մx3Iyu`|aN2WU]] |7'F5:jsaD 8 Oߊ`x}5I+e 0iy3HWU~9uY, qDiz:ST"Kr$L23͌ 'so'JDTE37LP3,$ ٯ0k@"of10hr<(?fxCS(Q5Mc@ SL~$Ñž~5lJQHLRN[JC,TY/{U`0QB}^Ƌ=v  ^UL?C*2 p IZ [*M Pb/O=Ή']!I~tUl0˴8rV:[Õ_uh ^Hw%Dm̺x{f[(ϫDrl1+ʶ`g[&XHu]we.älܕ=vOPVB?/k XZ7K'(y y.}%L~ZfgT-SMygjM |#msP(ԹZf=m`<[^uܺu=EnE2@u=]$cgmI 9Zr;I=PC`Q3JÌBsr&:Ⱦ&:]V\AGn曋 !\wt}l n5|t,[6 r%7wobt]([uA 7n twY6:sΌw믇nslPKl.m+^z" *9D0&s>ލ\J6ݨir>ލJ+#CG|Ns>Ӵ7 ~сaN(Esfͭ$G~Zebe;{\FYC(CH {ӠiCQ(BGc#ϓ/zoJ/ik6v><=vt}܎7R{K #x}“˫!~9cj1)ֹ LK[ ZNs֞jNOvrN >J$8HqJ%T'SUH!"[ P`B<UX-к3֦;aײMFS)iK`g숾:_Bb:P+?utսYdEZK*= Q-u&fT]»W=O:0C5I!v$I@<2࿋n$4t>+5"[GVֱ]vުyQ>MK%7d4P_vwQ3+R7rTLqNKN6l\dXsu|^qZ-lNmf}i^j[!43 )'6 hc)=l;Knrcxɖ~\embBbJGb*81J JX:Ov^ PnԑrE{LI8FҎY颂p-a[^jFub茜] s>? /}@2ե؄D尵Vi5SYܼ Y?^us/eZVj4WΫ ԏXr`Y9ь0園BZPX)BSd@éJ/:<x9jjX^G*|JFǤ"9qW{"iH&qw]C сbSԋuP$xtNmRQ;ĵb\F,1j\ڔPX,^CO[cmM;AD ["" +!iE@S$r !< t۹!$ݕ\ a؃ځb "!@t_;p 0d#;W~,~o<.8Rl^ ~j!.|i6BhC $G}`ָPZ ظYv4o*SWq DבG;nUZv9!L9s)(FȔeGQ~vIP& -,I֓sr;|y6Wcc,N$DWԽ|>dcS z`di$?.OեYf?+wݯ7?V/^N5$ԜK~Pb6T/vQV< SwoBm--͖`ຖ.ۚ!`ZYf!!JFf i'@6aZ[%hsM'Zm+Hye:Ł}~M\:b4 [>,+uT7_gw e*ZF71ɜ)5:tI 3SM]w&`ƊM5lCR%}V|ɉCOȜ@&恚~"׳샇zI&%YJVV_ޡJǤ7  F$gЖYipf5>3u#~"Fc$P90fAO Pѥ93)j\R8WXxfl2 w^2{0՚'d8LڠqsdR#*T-8$]ʜG8+r4 wVvk>$\+_셭ۈ(qM ^޸ldȐTx(aͮ^ 6]JoFޅP)2tiEݝ]Ms(~=jq%& Z /TY* OQ$aǷ#Ɂk&lr,査8^8ȭ9 (! `<.E\Rݵ" =. 1͝} pUV .js(pUEpERJ9'\XZX^bUb,qle #:V?h864>wT%_ia-\[6ʛ8 Zm Ks!AbI~E1ef~Ҡ @3=F?nܗe 0'>fKdνR y:9;f^̀ٸxk+Wz_h7Y$рdFA:k)u@ (r~(H+S"%bq%671DHHpdͿ(z$^QwR}@2O+v/eЋ/{ ̣M,8`H>AptUz.ˀ.RFEE"އL_KyhYλ78j@ߌw.|wԺ;n9+E:[e,co}󲭋XC݋㫗9G|.W=ˬ#eЂ".¡s5;Z i\E%ޥtv)=ZʷMJ]mƹIoCҡ=3t MBi>'~E֤Y1dn}O'GhysʫO[6 o eJ$D 2d-x7,M(Y^ @vEU쀡o *&hgEs 0B*%Z +nu:ё\qa2 dC`RfRPXn1 ()rƝ@AQs>nGΊZt)bLL>|zVBO׽ȺqkbvA46Z2)dA"LNǽGe8#ΊSoq? 
Aew+qARLShP؛AÂ$ I2o[JGRfT}&ѵK?~ q|+52?ޯB7m˿ni_:5yjwח /{'hrl/bc2:㥞/s{w:o$ʤjQv@_H_~HgI>̘ZC_y=z@U:pOrխlUV&_b΋;;QNfTv$n/ $N1?K=,O ./wkA6qbIV!"2鿕5V$H$z}ڿUPׂȍB,(Y4.ϴ.OvOl';OT\n̓%cffr8ffRWfk^m;+bn泆;iVo^$jڬR]^Ļ 0 NK?3Xu^޺S%(Bt^m<'79x^$xN"j,dR%HƎF`Y۬yTq,O'Loɣ_$9 t^4 %ތ%>T^wBkiXQMbFAiҒAHe!e*k1탤1ܥą`Qң :Ŋ1 e QC_FxEEFR\"EUƾ8KʭUM_cL>1}F*)%6̠UjYN&( GF׍=cc~_k_Lc][Ŷp3s>;Tw_eBdcCo\(S 7!6L5>k*Eu6n;jkCd>& ,kY~rk$!Z`ppx@(y(AAPBYiXlRue_qc*2d>dm1bh\s @$֡: \Ai Q鬵X! UqXm6UZ- }H37nn-s+']דx%׈iz1 Ƚyҙ?Ǹ(ZdŰEXKe +*AΑ.ZN=}OxJD:6܆-U<Ⱦtc+3. t_}6>7%%0zȃ 4pG)W,S8 n_͋Sk> WXAAzr-lJ{HBJki%I燼. q8ҧ 'љNY zDD9 ?W~)lSWB8*(e('+ڕ,cIWUojb+Ei/j*|=?JxP7!M5jȇ翺0cG|3r 9.ӶdC9wVB}Dh E[3B 1p"X"@ltQV`9ݍ_oRfE_k͵w+^Y'  Puͭ_H7wݍvJҨX+{=3UE7fL|2f_Ͽ(6Ze-՟O=cwfϖ ζyІ .$bbͥ78|&qPgbpx\HAjQHRA)IZٺYG4N!h  D!nHB[~;0 >Őwe&>t_T*2D^ZH9o}6 cB_i2jFAN gY Y)e3 )`-Jt D5ZMaF'3 jwflfS͍~7*ySګ,yleH*{ҼC;jF (Adc0x7d4JJ ՞7 HŢ'GT\PdKAO9iKR%cϤVixeg<ЛX֊\3F'" @hSw."ņLC*U΄z$6F֤R*1[]l6ꐞf~ՁdwuHfTX!\t\bV: Q\ (umfj@X5GRBP"P>*H.rqrPa38yxsG= Ein c7_&vSJR+qnݨ5{ ׶}Dvt# m}@*յ T#=42]\?^ﷺ׾b0_\3*@2b"siCʾ0v 3MQeo!Kz_{tճMA7K_ Uo7ElK*g09e TP:J%XpX&M-~I)hi27.6@۫0/O*jZ~-w¡YžWzGM-.g6SIS`CVFJ*>0\'PR,1Sv=62 'ɹu9L &>g(CLʹ%v&D ሤY*% !"o :vZIHdINfԳhLb[묉L-~ <[ўR ͅ9ox-_滺.A8/I^ʂa+SC6FK@QtL<ǐ_uëI#PYW1LTDJD֨#=ֺ::K0Gdό 1㌠pO(nM^bV|if݅@K.:no-. X!"I`auS,(sO/BQ췦ϵ5Hoމȁ J Ixe]1#k"Z!Mgƌ6"1DIn֘)824.2dB kD_.L,VW_$ѷO/׿X#prw:\G#z \߮x8LnԌr~zك+uqv7aj2٨ NNk fm ELj3S?Z-0C!6F7 6#nfX2#<d`]źGBOn<;?ny8)Q>!7YA WN?4/y|>W_ePxHs'rorS4fgA<8erz7ϟ|oIe߽7=:Xmx.)j 3M,~tQ$De̋X7ՕV>AZ^O>7ڟMqq c0}Y4VCxˡ.COX]u-yø;n6qyi@`]zLުHUGzNFzvR|@XI)3U$#JP1EltjB(}nRt3ml^OR-&QD,8VE(XQfOJuM%K;v8v&yzuq(shu@'^IOu|Aw_ރ˗ ֈx)LW'y}LNh`\ Сws>N |aLV?b,'' "UOiX4Hj\!~v\CN,B&T){#l1kA%lxt& s cvX]}?Ml_ LƇ<Nh0wgh0}5/L w7M~n~?6 :ے-s?=ή_~Lm8l\`V큆kO7iҪViuJ;?gquC?Kd b 0 ] pբzJIHP\=c0CNFEtdAe%d!m9ڒ! a,/THfۊ{~ȹD^g5Lx=~CGw6{y-YSe۬V X:BC1TEGG)ƍ4D-leh&hkM$aSBE$rHyP!'Q(}TtfZ `kmp7-EO^5~]׉mz[g髻-BBLBy٘AM g&Xih4TTPdI)M)? +,)ue,ˣOP91-Z% ]QN,* &QCF֧rd:&M_ X*0ݱ`xp39/$hFg趘gz E2Nü ]4h_-z~tVQ$7' t5Db0 \W]TlȒ/*SzQm5lMLu3cIK2t)Rcd i48g Jx~Jq_@{gۈ8yTYwJbw{2POIb}/OSq;/#LY+K 0^*Chѝ Md=x6k OZ=u4X ~gz8_!tc  161ػ&WM2$+߷z8I(C43S=SUUu]8O:XhGGPDHA4AFpH_sOyϑãqCCiדp\WN`>T|z:sM:쬒CT&M`sȽsy,K'Oշˆ衭nyßn[f by}O)<[gKlI-ɳ%yْ<[$ϖْ<[gKlI-ɳ%y8똓QVg6sI-ɳ%y$ϖْ<[g#ӔBI-%y$ϖْ<[gKlI-ɳ%y$ϖbYݲ.hń,h%dlVJ6ZF;Nʉ{ Yof<[DِLwVż2D%D(@v@dmh(ɼ#di hXs=HZnQ&%-e-{!y/dz~!㞟TbX]y~b X4&wICCB+In!0)^-824(hU+bPD3ę8Bd,ĭ]>AX5sLz'6ʠ-7Uu1yТ̶D]Yfi7[zO+TT fT+58O!YONtBC@*M` T˭t\TYJ#Q*"Sc$$75Q{ɔAA@%+-2vD2v5xXCnTf 72q݋{ڽrST:i+)Oo`HDV.eԜvK؝@{F$ B*RKZ:W5F΋seT!0!KxyIg<Ά-FV;S 0I4W˫{ϭm)_lqt Bj}?k_!=k [Z1yRdP]ׇ~҆H58x@̀ qp5D ^Nw(T(a/!rE1pAh aMQxI/ fEcU#)oQCްGW4ٰѝv'0A$Ojxoi84[(³UVx"8XT&4kDP'!pjQzeN:EK(vp<= ElT=E<6ͣ &I4|B%4,Re6QS(Bxr5.OHB5V5=ؔ)h$۫z2vy.ޠ79fWp;ri?ڙF&8n]\kx:~ߧ~'=|ވ~z&xCoTu̩x=$qN^{i󏎯oyw~ ;_Ѻ"P[!zVyV`xLFf0Y 7<}^p:NynoA6 T1h]\ FM-ABxO#:[1?p|y/?vE>ojRϢ_yvNӾ{QzHqt#upӫoO<5?ūO:h8v !ɜ̟a4!ayދKrW#:gc{>꩎%E1q~a8:ݰDM=j6+3Za Gۛ+,gY5LȤ<)tD񒗣"mM1jl' tDnk \MlL>ZZTtg}~7 B=ИPԘz-\`4}XUilfIBV`^q jFmgE-71f1 E65[ܚwuɬ82mxA^o݁5YPF P,fDƎYoH̙hxkZqO,H?袖Zs>{2m0xc&@`kD( *7' I E鵢2tRM|h-o.2Í(;욶 ^W=k.I%~հ^킬gO{~(O?R*^TFz\$UEV NBe`'Vl] CMҭSmgLwɳm+!g;a6`=*"s|}{&y? 
N~|<#klY_rڋO?6ogijWnK%i2/\3(pE~m fjKǕetKiTRla':ϑ=Z30ÓF#*PgP箒f:8D O[ܳU1^ K\cIS8a'ՈC8X#)e\isȀȴܓ'\,IҤH} Af[ O['zQ =Svu1=5&z,=Yj,:>^JjۙApO iZxq5"6% =-,Zm}2n3P, !8Jypv "y!Mu-&I4wQB98:\3N) oS,+S -"Ig$s^0#5S+o(/˝$Rq*#4 >6τ;(#*J⍺K fsj# Y4*1FiSi 8"ѨuRփ@5\$A2* riPB-CЮGhcT21 VNP|&ʎs^Y@d6ǯ dq2괪&ml=#Yf+m0][%f:cqvcuDQJ {C,Zb9lWUcͣg_\5ϜE;k#|4 .<Nk:y8qFwWӻ[n,nh~Wvl~䛛ۈ?y·r<~y!ou2z!'NEoSr|?.7֙,7&&C moɄ1Huf.F8k5 xR% dUHJTYM>8)*NʮiBuUܮXVۚmMV;pԋv'$:>jRHp.&m ۭbtRY>g.K `]୷R蔅rZ=c Z~; mvGY!$C4 G~/œ_ MiО]p+Ɨd?lsȻZD30F:hFGS#L*6U6 q!d-z}`F'mB=y0X7ͼ4̠1Hev)gA΂rMs%y/ON\[FIk[w,㤖Y2Y+[9EP{@ E˘?9R i|zXB}POwwZH'#"*H8EP%.ju>= <?' , \Giz T$_ ƿ5Ηd쇊M&>8`^Qˣ̥Ҹ]bO3yMώUk)a]>2Ld:7Qi8 9_60z@''_>>P2aA^8ᇿͷҟ~x?ra?㛏}pr\+Ź[@h,a\^3)֧oӷILȟ9Oh:34W::O4 lw3 +F?z`|X _7ox[Mc{󦥅MVmz6vt]RmƳPEb7EJJwMlNOg'qd &K >)fu1(92lċ G NNz9TGloJ)#Vmڡ%bQ[3YHoF45:TdbR_h׷;7T^ssZջu vGl?D:R+Ho~ k?,v};貰|˛miIrV+9|~$ÚTgaL`*4Lڨ*$+: L$ayebj& K 7N1`mf9*)A`[óu!*M LmfZA]9^6 =4Đwe>po"0Dɼ )3yB`\݅'șk\YVϊO){yp!suP)llygB'MuTKb]8aoMNJzzңmhVYUfx&AX!;c.$ť7L :k ~ZeFmMQ2kkS-@FL]dIs-= |$63g=3X)%ؕ M\z.|R.4է7L48l z>lz PLFlK>veD2#ڞ{F|rc<!%%xZkVұ,C%)];\21mpM@rWO3րgB BFmhI 0dUB*L-3bkp3o: /Ntٚ‹"̋ŞxfTf: |6*KG«lh J1IF̈́{ǒ=/>/v[.'*g\G%7eg>`~C aǗf J`g^4~1K~7䝟q(_[62? d}8ꉥa 8f)qL6eS=ggߋ዗3'@ H yC %VێnxA^ y ^JkRRݧ+T1t),-ҕr^]`+BNҕ&Xy,]!`D1tEpM1ƺNWRp$ b(b7 FOWӕp NW[VOr_v(vt{u98kK+n nYGutq⺧+ "JCWWRu"׮Bc$"CWW3] ]ZNWRf/.gD0G=jV:xf-q]OSJ39O ~jzk̾*+Ą* u :vc]_ GӪ&3DP]]^-ۻrn#57[\2lj$ + r0yxqrEm<[_ioI?~.cU-@T9R| -uu l\1:yP$:DC ,XAtE-n9thS{?J+,CWW:Ъο'C+c0%XB9*]+BiyOWHWHiJ>Bˡ+ky?h`]+BٵȹЁP+ɋ+ Ŝ]ZycPzl8Y},ؾ}YjoقLOWN=JY]!`Y1tEpˡ+BdPꞮ5j+R˷{;] YT +̹*T՝+BJWN,X( McΏ?n+CQ67O?u(4S",_]J)ܯţz ރi'(qμ3d4;Dl"mX`DUn+W6dR*dS=Wd<*GU\u~'S') Aq5ETW9y̺%G+r1ˊ #+ϫ̡ 2W\U ED0>@bly1ULbZoB:SXvBnC+lP.9p0&]ItyUGpu1w"tuteԮ$MK1 eײt,tV%]!`M1tEp+<]J-{jn8qsT{6B+Pm-tsxAtE^ }+vߵзC +BYOWHWdt T"'Z'NWX]"]I,)G" JyPv-_^OWCWTRv2*+3w/VFloڒh;]VIpLQ4 Xefw4]rԓƢ4IXOjsU0 k0]^Kp +e%u}EB·jɘ-~]\WihPH`cXr ֹJ%{:@vVQtpc Z)NWRpJҮ-cxZy#Bi{zm8@߭ ȹvh՞P-tԓWBQl-ХWP!ҕЕ'Y9tpU9th<])8tutmQV-|/bG){{^~Fcd]kKᅿ?O?4 Pގ \b^-jz5wd5Sޛ }_޵L:f^_~4G(t"^\/e~jعK?&WKōFS-7Pk[4p i_8uAV '߰JibI/qb]4{y&S\)q_Q3|_=KJ4TXvh ${aY)8HI ~q<4_Nf2@wĻ; ;̑=~?)FGIR8^T6}_S/>.{K)3;yœ\ZyGjm=ב{ xZC-OM?~Ax $ʇId82SE?8Qjx9\ZԞGurm ÂJdtP3%+'UZ,MN6K\0(%&PƇDLRoLjÒqq]5ٻ޸$W<2닀,0=/ i HA<6٤Ѵb?wZ-UQE?)ua#O wc0k=FKdwŕid1Ƅh9Che TK&dD7,F: 8-nˬ%SɾX]؋S -v+H}4WG!N/ [OJʏs:,a{E( ౙӹ`QИӷ17TW(:ܸlbA^{TE1sfw%BG3d# rumhHQ"'uTqZ' cdȟjXh*Oϙly1`7w&)呒9*G@̬Ј&Wո}sYC6)Ҭk0ق \Q,k!ʹAL|cXFys/+hF\/93JZm&df m-ؖ:NÖݒÐd\l%[I{mߡ̵sT m.fL&VCK vGKFS%#|օ#D@Ϧ^O6X:j_lԚcIaٙi0dG)Y&Ş5عȄ5.ά5v:hPeXX@wN l)_hbʴlaOtL+z3bgwPQy@m>1h-Aa mhW;ΆQ Ar:CPC/K 0l J{kʡ}`l S 3<ֺGvc>k-L#5+& tܚ 膽Z< {CE h%Det((.qt @pŵ,S<&؉yokmNաR- 7 c l@{ d d>JAq ~[ V2I:$LX^pWL WJ+;͐jPo]i?<1l ao}c% ʄ5Hd&3VtVAx{bςGQ7 {Ǥ< !.A -i<8I0<'!D[(vDy!܂!7`S1E6GSF 3&3 2}Y[ ()WoUh1<>(i E"QY;T`HH/W.!EkmD(X(r[ eɅŷG6K#FZ|LcBEєB>3gʠ-{XQ,'f3Ac,yDDa;Y #N燎yP;_a2.܎k.K$| >I7ziLy.^2X ; b|PT8xi?>C`"JDkk% ZW!԰˓]@OG ]Q!0@J$jDOvC$N(ty?XX T5髮@&[-5x4; cxS`Q,ԀGwfFFby3EvMl4FNG2d緛K|}af>ن)>lJJM'^2p&9qwP61XG90hT&%@_14\nV~@R@"`:L'оl:P_|vr岽L{ji^oC^`B[lzu')n\-4&V./ݐPby_?DÍ6b7xno\|u{!O;^Ǎ_l~&_쾴usQ!]Kq;֫w L!W8HҪم$y5I @@yjOy$7IAߧ?BW@$&4 I MiH@$&4 I MiH@$&4 I MiH@$&4 I MiH@$&4 I MiH@$&4 I?lPZQH$n^ZI A&1 䂋I@$&4 I MiH@$&4 I MiH@$&4 I MiH@$&4 I MiH@$&4 I MiH@$&4 IMX5%Hlz@عVZO> ;%I3JՄI MiH@$&4 I MiH@$&4 I MiH@$&4 I MiH@$&4 I MiH@$&4 I MiH@$&op%ʨ׏ݛo06F~Ӯ]>"]ytp a]f5% \d4tYz WCWׯk6S+AkHWl aw?/;I|'羔&yGn!b/. 6ooƲC񋵋3c.,_8zM&!|ܙFݾl}sSze\sȝ`aŷ~|oTlvOZ?b鰪3s~~w}5=Ƴs_= F/~?ܳ%cr7^5!I|ϟ67^yrt?a&2bf!paɥEr:ejF |s}9n}+lc~|TF-ǕpC(,mB:i Bsh1˙˺`؆`_{Ԗ]#= v3wZܙPfRwv' MZ ] \ϾeHJWgHW!EL9jξӕ Y *fc.5BWN%U38] ] ܰ+A{KӕܙmU]#]\ ++ 'NWOeԣ? ]'Sᦧ=j? 
-2x J:-uE{f Е~-tuӕtutE&r++j pЕuJPXV+6p m0¸"' ~_(a[D@:K JvҔνOX||6/o\J/Q(۷lIܣe&ӗ=/-%W򔟯- k9XMޭ x829 EZ]y\cZ ] \BWOe0JWgHW(=֤eMnЕ -+]!]%ѕy5tlWCWNUPW:C䲍++jJU&2NϮ8tepsCA}xjuugW OCiѕ=.BZ]p~}: k+ADKp S3JW߄$xU Еv-t%hc>uv#7+RX]<7q=fPFZ ] 'P v2DlptA|:-YnoqSJa58l`y'}*j9hy|k>nʧxVѕ=[<~X;`o.0IЦ?stgS޹J]]cGó~{!_| _WwϻOߵ> gEyzT)z{m76M1n>-_ kzxfz S|\`<^Wqcsu sh )?>"k/%(8%rgq5,xGI4fY1Fn04f o j% yw[T*-w"͛lcCC(۳B4w 'ԇO4T}\3kCV逼ƌ!ޡr SX3d"ux^rϕ x2Lφ#jާ[q6p6UQuw,dTLgf#O[aVwW/JhJfT?P#}gQ7!0 U]dC|}0MGkH!b\$@¯ٰdϧI ;j|W:jQ믯T? 2J#JU0F,ABJb>Ei)a^g<%>5Z쇿gh&W2j^zy1doߵDU͒jѡl #GҋŒڲ7N__dd< 4BEK?F~2yQ$=^d^, _P(Yش@tLfjR")j)I$wםL*.>3@vLv }~PxCIrˑ͑pIetq.[&hk zJO$w]oз|Sj 1Kq ? }RFoV+f8[sW,k܇|cFP5oJ.˛DW7"ʠmڙ55@1Q&)zsr_o8S^Z?ߛIjpV>2Mi[&Hx$4nj:uޡ٨zi|U͛L GNLPj7{/wtm턷1نfUV]$70[^Ժ .;dV> (}<̵vS7:|j@ЂkL1x/9R*qEGG'њ^eɯ/Ol= |lEq[V5ly{8. qՄ(yN'V`K,F ig"E.=?dՐڣ2fe0v1N@CG٤D!Z2z mVEyQ:jo`rAìv}<#kN Ca5,6zo61槾7X06+"5}W btxQ' s2]a ܃ |Z?uT%H"vRJ>z: 8"5!-` 7FQء=<䐤X #5(rCP q`:-RLbROpP7rvC[d\ ~QrO "wD6$ݍd;OUoVhR91 J WXhQ <ᵉROx {گ[uť&LIX5FҎYɋ>%@C$)@Eb<_2Fu6*y1EΩ8՜(+A:NĶ]z'9s=ip)0떵 u :'WYlez 2'., X̵Q*g er+F$px}b \>y,#=+P K)d} EA ``+D16I2z l Cz  荜[cqM!SK/-kijG]5l9= S'Nxt.ixzڢo,ϧ=9_s`hn)1H-hiocJc\(!upQ˓L"K!L8Љ_@QɭmB9/SϞӔ'tQ՛Dd+ŜWF{ʈC*RE, Ft4(;RmW4&OM+]r6ۃL ?*Ie <c0h0BV)Yn<9# Hobw)lRZYO)sDhZaE2"E4GlU 餈HٞqcB)7j<uӑaYqGkZ@H!#Fn:_=["en{jɩrv2~;ky| 3ALB0\0P̱|6}α6$Qs-9˝R 3BJGVҝy)^w0Ǿpz)TnpWiXF+nCVie3hlGY4Lr붼kua4ulNo:<#*ش3|0(z*d Z`B֭ѵXVSzTB<=?A N@ix 'ߓ?ZT>* D^)``2(P&ʙt}{9 Ts6` 0 @AX TA& m֓QŸOuQeu烦椘 fͧVxAaiu0x"aqQܘ]_>^f-wWXzzdlc8=p sZD)3ln%9 ,!{)ԩSR8PIgQKLKB(4:rvy|%$z/Z+sH[ޜH?B*3o6O<5%Cy_-?>sIm6gZJ@=;U Z{z9܊CnŞV!>J$8HqJ%T'S(FPH#LG e邯IDk 5zl5ͭH9"=#ȹ n*Fe8TgQL.wý$ײ]|ٵ[tJ>GDT}&Dؘ]7V]'ZSʌkDn%1QOLB&# w1a7kyݺkUnwGl޸cܮvmzolBm+-t<_[oQݑ+`c莎 gǻ[ie[Yf;m>N{uzHN{F;$/}s; b_ D٦Gg؇dLٜ:,s&$ϭ.J3'.7#W.{~_Zw%Yb2h@qJ$6Ww[X1}ӕ9T$c]LTu\@/tHEFyT;%apA.s+hvWOt_2Wpo» B}!OV@mq"ҖPH JG>XKȉe4)P.#Iչt`O1g&C;PQo^j#L~l?SR`CVET( jv!yJw' YvsͶ_xVA=^xzq {Ë-CE@kKY8p= ]#;&g OF泟9^y-]/8Ԡu˪&EC*k51=םD5ZGrVJ=B6X4O k bŽȓ\#[Ɗ6}VXV#EbNBFmR9EIgcI.G?O4}iCIe dPS(K+#W\A.7EZ6Cb# $ΐvLA&#ACKHR8#gQ}F?V~0Σi8?kӰrE#Gr[ ʪF7Uw.|4}KΌ$*zR5ٕս?|հ>s)li!?O<胎cjVqG֬)H mŭq{b'9xQ(OU|)oԧF~4o;Tѕ_jxf]Z]oDϞ__o0xv:\_[y㭢\?nxxY_ܺR A=-OkqM\}|9G#ڿ 䧛bK?L8OE!vD? #59sme 1tQ߯'zv;7'9]WfYXkkVMc͌NG??4ʶC}mjڼqk7N qCnzL;Im#}nZI'Γ`3hBdB%M!?$EAxҗpXK?\֤T b(]dT4be!+6; .2T[:יUW0ZӉuAA1@|ȇw>s[5&tI[X ۛU\eJlO14VY0Sj(0$G^٢ŚrnK"L"e^ ΓŻP" >R 2n8Œh<.k}'9W )/ht< ~Kv=uj2};Ay I~{\|" \>OtZhdu|D)i阾l: ]o~,{y 0^M ኱$$(88\_yin7A*pĜIA2 slJJRǤ!jA#B;lhwXĔ"Lu"އِ@BN,Q(}T-qk|ڈ7^ʫ ZǖOY.w-{'bF&ov􋝼ieg6ݧ.v[vֻeW-w= wBǩG0[]ZLٕs6RMHS4 ou~LЏu$˓LbmnBv[eri|>(;S(!4?,Qj72 )QDVO=D" f'TH&"hD/̩$8! 
HY@{KEm '[6y 4'\F[c!N)RL{SJ1m HBk g: r caFIihs0FY6?rE)6{YNqpZqjE ZSe3|%9aḚ1w9i,飅Φ]%_Z]4NQSs^tʪkd`)Jf:*Ȓ*c}o;ԇ^ ZM.Cgnf0 '!P”٦K $$@,4}GvI_ p nY}Czs+1ëiێc\z ɴ%[*ncS@W]R >^*X̥A贃;tƎ^ YڏƢLC$M#y{猠< f E8D÷zz{|^lç~ mSo/~߯_Ϳ\pF{kJMS}bP ^ Z0#.O&Kˤ#I%dʕ쥵JhQ(|ܚ8{ylst{|?X6N+fy"fT5@P,k^?_<9> iDbuR\_?G ~Q~xǠRwBQn$ g>`Kɰ)Oa))I/b-_ ~%\tZ?1I͂Hv:y MNi6!Df\Zv@[gK\y`ڦA[A1ϭ4,}i7V_mÍwkm'fWOCJĦ5VݛU/:UbJ%6 %6`Mgȸ> l NXZ'TJ)WOuY?pUE \Ui:\U)i(`ѫ3QSAO]=J`p8q<-\=JZtz1J=W>zR}+X \FUtእ4S+崕Gp;!zWU/-JLእ$ \=AGhW,7piWUZ]*v \}} 67Rf?4x6zXlp$3 Gѯ!rʛxqb0}ɉ~RSfXAQ,y~0oGI>R "FlWS2tK>Yu9KoEP-/Xцh?ת YWj==z_PZjDjjlsL >Nbo|*,>FR >S14iEc>X`CW,%lpt X`27pUu++8vUWOK+2XAdo`VU d)I&+Gp`WmU?{W,-J:\U);\΄tGϕ'ɷ'.գ%u"qR<`COzWBoઊ]_JDJI8+}+Tઊ{]Uip[=\=ENSyWBH \UiWUJ3S+(jll Lp-~}pP }YwUT4| ^%RgzJ;U#Pתn$3A.H^֋E_caV%Hx<6`K!ϯUէ/n'n{~`/K4hnt(sZ5%W]Qr4k}()Zs/+sV'帴]oY IW+kmFQ1^Ax?->~j"Ӡ8 7aƾܳdZ#R)#=0)=\ UȕJ(7(.ru I^ăi@Lq-rئ'YaJp=q˕P:^J"f3\=\="'ɕ_rc%"W#WH FaJpE۹˕P-rur唋4ZL8 ˚ E|mf3-n r9L K1HF=JDcxΨJY #Wy>/4r%zIQY8\87Q :.WB(W24\ pȕaV?J:Bɏ| Ƒ+<\ }t%djTQtI:ke?\M\ rzlk6Ƈ 4\Mȕ2]rnۨ,r,re  Kȕк˕PRXrTv$z\cG+=Ԟ(iQBt=Љg*Z&F1cL*!y~Ǚ) Lo=>dic{ٜ826mxVZX2AvdWC5Pv`yT'̅3OYb&[9i>9ZpEAUL ^d(eS-㢓uq[ˢqhxv:|-Z4C72%dGȕaFp@w~\ Y劍 #WIA5]-1UH ȕa+u/>,rUرé"'{uj99XhƮQm0A"WmzMg0> 8qJp"WB{sQ2-rure(EɕF7Q V˕PYʲ66 $WhUjZ.WBd) ;KGY|h)tʖI~ яsLq̺\L˺` cnCp=h-?%Jc 314,."WBvr%:Biϱo_r%vВQs+z#+~R#= 8@vɠPZ*Ʈl9orZ:RiacƮLF#WjcWBi_\>荏Iϡ-WpD5hP{ Wq6`$`{INQJh\ ܶQYY*$W0r%xOv$~㑆VPc~ſ_g/_V豾?g;B '?|3|A۫֎X9 =_WMyuw!_|y=姳E{ y5}{_WI_*z.7^)ۮ>tɺ˭7_,MC!hz-?oWPt _Ȅ_v5N9O_oJWew֠C8q>>w'}Y+hͽ =-IzS|R3'6UY;7w3BE8zDl4|D|^\5_g@c}D#CFA75'w˿~䃟Ai}{YVܩZ[^ǛWx:˱%B%Yk<2HMu5YSMgKNX|ws?C2xY؇_^:s?nUz#?w~~tQ*N5"툃|prLޚ\uR:QNW9{Fs=f&YUcf*ԩXRUm=)՚nqaծca[eko, |e?44nuUa*MԱ2Xء'ʚՂ2' 1[i]Ddh}rk5X *ckds-h kڠ5r׶WzNѦ3q6UlM5d@}ZD3CHcw/FC(d9! > v0T !Gp appLqq)Q2)3 YD~=XgkB@$+" hȽ] i*3{QA.")k U )NOA`.#!ec: W̬VªQF]J D[d,z^n!G%weBHԯ.[ נpBYO؄^ 18Y hBc =Ɓ(h@yf 3(2~ TUE1.9Y|1&HTq*ڡ;iDOPrH v_xaU.޼i$tu.Ӟ\>*jAGgo{Y]PA#!BГ7! .m%c#J߯;@EVL263B,T*#"6,qLEΎx;g$_nH3!=h5'8\d§Uk z.dZa2Ӑ%Ȝ՗MBt6 E* #7[x,©^ߨ"lΫ C` d-!/kHlҷÚanM'.O琋=`d>bE\ n,z K)G 2‹9HQ).w!|q̡\Fm޹qdiIKu,6_6`PW[kԈbgo5)PĶN8 >]S]OWE^Kn&#)۽#ƞ >(#m6d F;k1Y6aLpP rrI'#`U3mA=4j,h֨4r|!h"&e#s4hͱK;X6MgPAqX(&ˀ !;H-2r#Xd jÜ>F4^8YT2 !Ht",Upk6֜ÿ `y!dQy,F%4 3Ko^5RHYUۣ t/zdV֝Zzotf&=G ]$/q5$oC:Іu0qw Mp+³0Ǵl4״$>~ݳE\atBáf=0h6:7k܁1̢q׺XCYkp1d`P9 7zP%]Aj>spS"XRch|(=dsW;HSbKS_Hu )0HH*P#Fi SփF7גa$ U0[&+bŸ"0r\B\,\1;94A'yMh` ;b{B!DKr EiQ-IUk1QhB-P ͏!!s6,ѳdD).m=FʄHkQrpqQ3byPfҧ D2TSEc?Ae .Y#9a\U8/ڔ?C=xYkʾw(U֍fA-˴;XCa,gfC: YV2#С /j@ =bO p|g, AsORt(d0mJEQJ@n%nLZ_zd;TM=2:->"9} 7"=v n^ft{fd\^an/A%,NֺӒN-_q6د_<=+ 崭_qouVUHqq n^z~f%-'?Vmq2O4Y i[8[owU9XScǹ6Mr=};9&?Kry8:<0͏y9=_p4?_!ۿx?½Qӏe99cc_*zJPQB/0 G)G mo:ztѹ_'ϦynrT;@1^Kc/(ܒ\ ֿd0 ӃRo"aze إ!)>}պLg!* iB5j,} pWj{4.o/ voy8? n=Dş&߯VoB?듋v ]兮BCBkXݘ ;> zkQ K|KîU}[,!DŽ̌FZBE~Ei g}rgvL۝6>CU lE\s#_7rp9c,E]}5rUbL*sս7_?fVI,ru(5{%W}\%zHsW#+\p\hw3.H^\5U ȕP11j~ixjQny@#WR'̈lh䪅XE\@R-k5ͯښuS6{4nkY\$nuNjͲ7&Liy;g&|:44pl@<8o$ɟ-OaCNm {6h 5}O*|C}ub c>of8-/>KoKz-Pt1[L=7Kt]nݝd|Yjjc,.zAc|gjkgz_}|[8CNoowfc^xjQ:GrJX`>ïGZEZ/W-JI^\Ivnr㑫K jJrբTʕj ^( 0v$ 9`+/-w;Q{wcFQޟM7<=N.HGpg:y t9Kmt䢝 wZٙd*Rj4H( zEzXD7Oōi_|ԭZXtT&}&3o! 
t_7e}e2yk'$YO 5e]j{wک|u]m%-d[yM_)[0w_X حvקN7˺n.[_ez{q7 xppq!m>xbqC7(b%;|y~Z#zۻ4/Oܷ_?E^lzr?p0=]'< q guV[zC73iiN8.^hۃ^$m%)7?B|OP:=GѰZ8/s,bF?s2Gcl[T5YΡ ?e"n<~>9w1.T3_Xφ,sR:"Btׂٺg}U͂ Fz܂,VcU hMzϥc*}i޷~ ˷\67t_t8_=6gճ-bsNq0/יqrۣ7N}5/JHn^b;G7tuukv{o  W}۾>zasGW*O0Gj%cE)Bg8Ud=99blRz쒒=׶$eIKĶ7cQњsAJrVTynjll*-monV%!`|Aۖo7kx L1?a#ΫIOiz(O';ouʊhDe9 >K%3Ɲr[x K+鎵yk$4铆ۛ!ww1'Kp+Y-gogL6ŒfmN Wi5ݷڜYxcl#O]zI,.:;UBVS1YuFGtey,Ytͽg-I>wf⻽nXzXb6 kL\{ǝ4br{PWB{gʕ"]H&k 6jSQm%!IJfUt,Ƚs.U#(MnNi )-ʇ'i]r:Q|󏓷~D񛋛1܋n_ٮ.;~> ҫ-9շ },ߙ+Y22Ɉ5Jo1n1_Luٓ!fSeqruUqCj6ޙ-I/[$&/Nf0 G?tZNf.R]ѧOkrTÛkS (Ѓ9$<>*\C(AZ.]aP"l.)aT]%TJʅ7"(d.3 2?,OƐOZ?__qdyq`O^,T۽~O{gZ e@"sEjv%[-IR4"e2)Q֖͘A ;yTE%aJUUFq&$VrT>T&p>eۢb&?\X:L~dmߴa.3D~goi֦Ôaa=031uٜwD3ksج?67_$l>k !Vnцڎs-aφq&f\u0P#5YR d Ђt]BkqX J<|puV`O'Jf /#γ$"8Aj>J&>v25ђIgD,34(OA$eBR*P"2_v|WgMvJ'>3x{ H.N|XGb}PUu|gҏ\7~<|iWcq}\ 6ߜ" ~bE=A&8~1v@:*O"Wt]=׽\׸eog 8gdsؠY[3c/z-;ٱ9B-1_rt3w9J,#TF b,Lp=ɋyRWi&`23JST??rOrȶnzu'GX<RtՏZkچL>"RLז\RVxd9S}𥏒td>QH|]:+{ruWXD4QkklF9*\PV9‡Ȭ *2?[n5c5FКrG7CVN%TKQD2 X h=$VJDND@S^ҰZ6L6T8 w8;@ Z%v?Y#8~;\? F!w=#JExFh1w1ЩBۯ!:'7'7߿>̜{|7x W`!_vB%itÜU3<.zM?~׮M4 %,+|}ԏElWbL6 ..׆O-h[Cx܈]mz໌kۜqX76qyHAn@;k8!fYfez9=D $)u(w +XЊ$BPS=~Jj q)><:9$<$HRPAYDtT,(^ Az`i0&ti^gwYQ>;GŤUs27,Lw`cTFEWbhsEPXm`S <6OL &ƂL^bWa4Yӫf [SG-6E[)\sr>Ns]T="EygV7=Vd:kD?βwN"! LS 7%rsP_ݬzί^jc8o 0)G0^h֤sƗYUZS3yxl܈H>]u8sl،vg7;8Lv&r>]Psۈl%4Xa|.Ҳ2P]$l$2@R!fAr(:*U ^ҴJT3A#@)Bq>&I@s-N+>2a~ܕ")9PE(ZX|p9M"\8);n 0m R%Uh %DiS Ԧ:X`)}0Pn %i&t*E@6o  U{&>:[Ò|B[@tq-(Cc sJkÐ) \4#DG-"F%.>]zv=\wkz$,H5g deL΄1pAZB2iLIiI j,er9R w[=AO}ܜ?a2ڑr81jFu7/;f'ӾH,)$2A`X o dcIN`Mg9=W71$TV3닅Z3Ԭ>ZkȻ`uo X'SΦa]ԇ^O-=(yNJA ( 5T7(jL۞tOW=Eq4ukG:eakx ̧g*h/†m ,|z_e۠* IFU7IWk۔sKŁؔ9eb"eȅuVUH*g zS4;1zǽd}m>n}f;F)'>-~߰1uqPmݣ~z;z^:x9", =`\cL*-N .jF;d4(`D0NiPbv!2 ੄8O$huz =f(h *mYoa1܇糓˕_j" Vv2Q"\,Zݙ9Af+ـ!ԚXrRQb4ETl* W f bkRlb1ƢbŤ&g+UM2yXY@gnEv}mok}9'p厘%jwfٙbx:|=Ă>L~B:b b@3Hͪ)6(8~@B(M)LC6A_̎1d>1-B!*`eRW]d׆eeYŐZ@1@ Y- swv_<"~i9 0\/l.O4ebй!#am  "#kJ-oVA4evw1[~9gRKef%Ŭ|^p}ןIZ~^&)g@M/'bv)MYၲw 3}xt!Wjy(JDk(@Y R& #~nraCUY-=yJ0F5kNL,#UAF؎XC7ZI]oϰ!_;?/&EoQ "IlIRFU5-h!ek oht~`)ո3ReVԤ}x-+: Jfj=*qo<רbF`P+PkF=^{~ɥ5>k}ͶvtնWN3g훻vO|"OMzʡO$$,P*bf|Y=p8|Y~9 PQj|iqD1G kTX eNXdU(Ei- &"%rі<}d1wo,2hRR\G dPJ"Vsƌl92YAz =a1u"ʲ eaF+6^ T)9Y9+2ʩ> 9S|xېTiV/x}Uak6ZK/8C碻=3:v#FQ\$5iz\rmώ9po' +$+5nj" \J\AsEbiU]5ijRh^ҏ|ݛ' vl&.nӤv&s4)i`dP?\\=b{d/2WM\\5i= \5)I +#Kx3W,097檉/H'wjFsHysR3u<]:ʢ+}Z$.d~p |/H_~`6],ߚy5>cRGXľW{c }f3/fujfIihߢnջ7YgeauN 6NN/&=/2 L'e|&$t*VI:p@,fg [/c~kno.QeZ7|w0?=9?V^WKJYmW{l~O?J1`d--!-3Gg- ڛ쾜MZC?k֌g/κ6~D7E$b)Qy-''!eء|l$u!0Ӕ//*-#]`}*PU񾺐Xբʚ6_5MJ`RdD5ꛖk=^My6_/l|0R߽ei t«ݻ"߼vߢ V%_|VTZF kgDsl96hM4&c_DsTDsl96hM4&cͱziM4#s=ا/ut2}~zW/eB5%qOQ~:[ mYTI62}q<Ǻh7T7M,.&݉ƍ͆+cw=gv{K<ϛ ŁMʁCT12:]PMF 5DM& jBp/qYХY|,gv ?`k|&A?͋>엇G|6NN'?&0Gxw8}^ئ/\$~7O_|&F&>-~BlŭgA[z +.lށǟ#S΃} 5u5TYXLg_C85V U /#Z4ѝ.\b8bD- ;lm~m),Ys@.m7$RT<Q1CA[0Vi6^o<?!y7x><\ǹZ[1e.vGI bA&r23H(,f*A%x(M)yr6A_̎)D !GV 5POT2hn%pI1CuUEmlJ]jY3gܛ8]CvcȋWu4G^~YwϺoeTwČzGZr"7ة* _I({8GJrj.Pɦdd!&Z)NaƬJu%JšLF'BFjґ #ZBC9t{ ݞt[>Cf'i.6aug?Νg/E2/$i)V'I̫Zp-h!ek ohtGUj\|aV~jR>іhgfG %D5 IĽ_9#GBu@vazrtQQ\%|A2^ۥmEYUe-#wwu#ltS0~e67SV6$TQkt|Qk *zG fB P(nR(yC+^5b31,LORB.YVw9Rf's:OHdͥ J/gj)U9}0JBӳ4N!ւl0ٝc&:b喢QK1w9AL l]JӐ2C/U7f{;drżƻMŷ+1y  )4z`#T[1t]cC%*v{{[T-*7}2ۙg2h[ufQ@z+c I8 RY婢$CdzoZE) 9 VQ&A"D"$^ A#ex$ƖE 68Yt\K:q-Xo`$wJZ'57S{Pyc vς9ąLPiGf^8bD QrO7g8\i=IQBȑ,]; !?Ys ;h8a|27)5+Fgt9|`&C7?×arzFAY셾/fP7*ٌN.5FG#~$:9t:|W@ykǟ]U徘Hsu^/8~~el";N{>{,<693M#qp &ItVaKOχy':-Ew'_@|J=\1 UVlsc4vOW[~6kbӽ ^JU^F3S8ONק QG}'K1 <,}rg^4>=TǧYo0''!! 
{FNz=EzbNW6C!m:VV+;/﫱IMZ%CIbӧ6>[RyjV1A9mCalPxCQ?ݔ#p=I egjM)0^?EGNMʺAO 8Ihs>9@.zlB϶jiWV*_kʼuMԕ5M>1#Ꞓn]Ϯ]K^!A&]NP|\mqCLިunC9ӒVib=`!bDKTJj.Ʈ"1|8%tt>6>M$<:41AsEJbxBy[6^.y'*V6G4śp9ds$O@Wܹ4&hzG|R~_fR}zWTN%7d*J"YOGPj70ѕHype&( >NT ؔ.]c &9~ULqJd6Y#F-o˼t]7 䞞I-^XKS kInN'If_7 wЃ]LZKzW;|l^8ͮk-+wq[iyz~kkJ7l~ 闖g}jf]WZu2elbfM/_XUc#w.#4C .ڊbڕ5y7MZ:.(MV93&J è r: r0=d\K)lQZ<΢>άo㮖h|!|΁Gw{7 >ZYK0X`"GM,+&9l[]jIy9h&elT"&ݫ0-љwq0hYA*8]594"2KJ:Tĭgv @~н^O({=*{}TT{(#XD- 3ô"E`IJ <61dpHYI*ݻUG%Gs:J"cX(9c f>"h)rXHX`' p't@I(㶺 ~qW>(]tQ9)]5'<\Ҽ$aypo\ _xR|RRgOR4LrB Ϡh?܍`EvN/zś .8܍\Mq~_ݝ% *vBmcga뢤L ^2r93ߥZRٞ<5xP@(`JeVh]uYkj[mHPjg9;ulhf:~~VnYI]?= Քf3E7fN=+䈹b2C![qO>3ΉFF5Ԧٟ]"S"XhhFFG`JnKaMz%Qk셥 ǰ1NhaRR0PR%@s !^#:I;#gKL~vDI⃴؈LWwƣ3mn\ZZ4StfMn򉤤Xr`YTThFrNRG;cA0O79؊^yʶ HOTD2'T S3ʬVD0<{AMcPa*)({ź dDX<:]LiRQ;ĵb\F,1j\ڔPX,^Áwؚkk*}E, (*°R"]0 0РSI谐:^H d4 )I=K0ϚJItrr</>A*'mU l_54I@o3850E0y-UiY~J~( N. bjF̥y~[s` VT,/`60xji^)[e[7Xk72ta)EpLt1wٛƨW ׶R4Z$Ь1pRW1>ٰμ&ǠQLКFoUcAU^}|7߿#d7#%:$%OkTHʱVϐE vHtuOS'ygopfg5,Eh"_^3sAO/F`s勋m{7. ÕA B_ՃiX;0r!y@j[45lҴߤ]e״vklןB49q(X[O0w$J'')E\yO9оd4$9"FGH 3pW/}\byV"94@@u`$d^\F"VEMj2Je_}^vk'^2[U2 'FG~z.O.xL"5YRل>*d_Gɣi}JAR΂H]t턾c8Y}ɶxZi);V!lȧM?uQ0T^:UTv:8ӴxYO`@`cJUN1RurNT1@Q畒Fፓ*FipNk:5ݗKCސ#=d5]y5DT>03p<'V8E(fQ2 *c0M٭F ʰLD%Z@2>9,h)D<b\`8T:RD 3q6nB4Nsa3=r}+b}ۊuݗBST{oSptR1͸W,Z%p6ĸ8%r(',7SdZЍ"ZXtH2Bd 6X[J؄ 52v&؝vb Eb.,R{qbӋ `:; K9E1)bhE7@A'*tڂ )<Cl&ceRy ݥvee`"|L+uĹxVB̾vgV{Vmnu>2OƢfFJb0XZ3p"S( fq9 \eiY\Yo-pt*al* '97K(-pS+EL*ؼתD7#x|V^Gsی r.Z*oX~x奝٬4 % 9\+&AVkBH'M\ZCAkCi!/d,B_ Y|9.q)B#ݷB_ Y|!/d,B_ Y|!/J,B_ Y|!/d,v!/d,D,d,,B_ Y|!6vݱ]e?NM8Kvud>VޣUSjl8n*s z;t1A$SB nnL,h5T"=&b}Hg]P` eufohy~]>O@ =F;AZR`<$2bFXp\ \zV.f$su>Hr$͡-!#0\ BaW,5^xrxmd0["3DG隷W T8g]rLJow~zri[Yև]IEө5V9lR3*Q.(fY0V(JF _.cLRJ.̙dqB4ڗVU,qp^=u@`8pg7? ozޟ D8֑*:6ITF}`h5qT@`vb8;?Yg:eg̟MvCwY7{M.A#]I=?b!/Ly(K̊j$FW@AWDE BMd?T[$,f)M28K#qnI p V,N<"B31RHp;m<8v(Cd(0844CBRaDӯ.:̒ A|]wg oՔ& HukD{򑛤(3X A b)M4ר=hjFPkBIRdž͂ПC^/-h@?2Hƣc -(*Gx)n^PcBl~tOj/V`X?rahboQ'+)N""I4_q7ecY~g]jX\i\_h/Iox[PN9x$4t xz_[ۍP7ZOHt6f-7\ I2zw(r~J0!VgBu!^Tg(>_6^DGGӳՏj pz7 \<)q㥢L8Wϧ''T[x@i~wޞf‹ǫ HҖj94{ߎ-7c@6|֤97z%7tJ`m3 1Ѹ(h*%a2ȳaQ+Ӷ`Z#| i?n4ЮMs 4-|7it5~w,>v=Wy>pO7q8플'')E\yO9оd4$9"FGH 3pW/}\byV si$HDɼ'@#uq\{ 5MNɾNw:ӉOzwg3yuzsխ*;#t; lm@88lލgGY;Řv5j0Z:rY]QKo?em%K%n#<ƈ~Op ZF'-& 5`褍έ 6Y wr&%L:ztVL1.5.`"+`$SלI]){.q?<˷煼$^(GX_xm NU=U^~ϲӪKIo!>knw³Lfv

ǖ䩚!%VA#; #?RX}ͪe]Ui˪7l|[A]в#gR!gto \'JsCuŵ0UpSaonͥ3f qD$Vŭ r{<1zzۏWOa To >ԭt7X/,9,eo FS A$aԀ5xIC#P1~n< @V17kOw?-rhc~|2hձf|S晭hW}]#Fo,)x:Ӈ>8;1L%pcF?qדz6rXz cgw`q#uNNkŘ&2ӝ&>MxܮIQ{d8x%˗w0AqyɁ{a Ɍ~oаl[k?g:n1YE/ym]29wOl^ZJЌ?8Z|EjWG#~Dz!_7n[@!87 Y&rlfT ~RsQ%kmQaPaܨ[)xmTvha>؁K̐KطqU|Rx|k?}w|?~ ZGf?vu_jA6w!7?_1z!<ȶ<ꬾAkPI[w;hUSZ|5]-]2 NиCR3EHK4mL 4/[nsΎX9\ĭ"zbd'}Hi+==j)u +k`Qb8ͥiI;tOp^ujU;!'47AiEIZXb&rryiREV<˃yEets%=zRѽF&iֽuwR[s*;"H [b2R`Dz>XF1TbEGj Nh4A˧88a2^33É/6fN76gx#ՂRZ4ZyHmBc p89 jY&i2JJ- {P¨MaBiwꜬZ1 4Jb&x{zQ2UkH qq-~|+Itr"Ow~ 3- 7޵m,Зޞ|I➢An.5zE,%˔J|HqQkgg^/?`+"rV ēMoa!"b? 'x8i޶fwęRLNA *p'EG}$ գQt8I~p?MlrXRN(vm:K]R:ۺeXn_Z%zm(ei'ECc$xhzݏ|2셥sLvMJ==W9*@wupsk Fuo}̇)e 8i>U\(v$ p y ,o ² V"HE*AZAx Q ~{8ZĂr(`Tx6 ^&*!m-VWA;1 `=T;̀y䳃}gW9E qt>s%hLvRvlyXaqlE6'9CBݎc`:j-DuekUrFdE7BrR>W/]a] ~0K$+CĿҥ\: ^18Fb&( YP 8"똭,+hVjE`t geot㰄0zb<1]#V U?^ ojK7C&v bdZb(7索5닁BVAkŖCD`\îhW \sZȏ#kNbi=qH2NKB=L TJXa,0[fVl/ݏ7_G|gYůa<<]A\\ksru:ZbıqDSD4(Y⺯IRqPG_n6s_.1,;0K}H]4 byMM``~7  ɇkJZ™洸^|]7NyI^΅WVǎ[zk &|RufDzu3eB A44I.7Gb54"b!Sچ1= TlavFpOx"8OpMISNQJFHkɘYebf' .h{b}'^v6 7fx?̈́0:W@ѵ'$_'uo5v]Q3; )Ӂl00ݝ{|;_r/cwJL3*_MyCKL8zr/9u_,ڃ97{>g7eQ[HMQKujbyf֪0Ee2 ?(jh%*łB5Қ}F]V^|{_t, ?==1 0p 쒆[,RNL&Qב@\CĒSܓ ׅ~35 P>P-9Ø'Ron)l f7tulV6I?FR&jcoCAcd+ ##ˋejMD#{%z`gol4~1>˩L/~Nx6Mf%ь~:<}yU^Ceɚyw~ fkzSxfZ!fgL3\7f ЭxqWа 3*rkʬv\;|q u8CL'0Ԭ@֫Y9wcEo[[\X ئ)%!Qn֢1\M&D\Q8WXm~mTfāek.Q_|VQjcXSNJbqj$I!"QDp)AFatV2\& ٤I#y⇭<)B.8Q6XXֻAa/}=gKϮPԏABr9?$ópqwOU' S*1JմZnurj]YZcfVAb{=v-HO%ЬUgq|ް. HEEhl&;jC};O^>O*$unNYalcR,K4EJZ9*}۟;Vի-, `X?|1Ov'OꒂثHLH*%Wpfї.+EBD׌Ѡ5Z]\+K.xf.$MTww ]uI^u^".1,c[PA'm¹0ώc"v{01^܌Y;v|6ܣћٯoɭs; ʯQ3T0cg~ks!wwۙ&pnvˮr^cޭAE,$O\bvg/w_;n[UJ7EP;U8Y)^s,BU*z7rey01GdQuUcdsWu͚1N nVV q7rltMN<$cx53ŵ;Mq̴O*Qd+DSj+h м4˶OGHHC8*EFZ%^6 XmpyJJǘ@BMgWaO0ַ>)L>` ԋE2k8Ip5)y@K6)"uRv#ꎸBf/`ݒٟj.%|5-ͼ Y.`/cfXo\fH歷iVsQ}Oh1_Ldd.lPd(NVI:CsvH#ӝHvh%OňJF(t #m5cS./A|;2LQq %$B$EcL06HHX25.I&V$ ԝLBf' s:&؁ **v[˃BqUwܥ56! Q`sS}WƱ#Ttwi0Zjh HJ@+jm6bY?>3|c$ ӻTO8eRqy פwYQE6س 5|6[ZXƈז7_ ސjJ66i%YӣH0N^t(,Zh<2L{_i)cXݾ}.:*v{lmw؛+qtWtjHT*,aL$=$8l <' ~VN88?aD$,I1ǘFV3"#fTX"J'1'*֙U>KBj!n>}efN,+ DG5GOZeH Uw7!&e(#WpgåfJ C$tEWv@qR#ƒ##5c<F1cMۤ>8Da):h #kn1gȞ`6~82e4]!!r[*fՇߙL%_{d){> Zj%ͦ_BϨ?ual8^]qa4g7-{ 9QdtH~:xӀʋoO4ni`X+3V4ZE`XlߘMHd^^]\ƙq])÷iHgvԊ =~N=o܁w^._kLbݙ7cT`1병_Wζ+w|E7kJVv#GG81Q coC,x{@{¦e*lc6Esxm,݅4?@̹;6 ԓ_G"w_U;A]J8T *|RO8ZO E!h19+Jt"#:HApj=g[BuDvjf:8r$$!vsKqkz[p˽V/~s[&sN%L[n R){IYy!.k,Y3S>6YgZlB>a۔~_R~ ph JF( \EFhct́@57v1X88i2` EiDSRN8bpw):Q:gJ<'dkeP{ݐ4. 6ɸQɬ20 ӄ9IGRBDR<1z&uAxZ(.hQ"# iȳH+;pĐdI͒JPH0N*G f2Yt%)!*{KM&),2r:ѓ|ETਯE 7k Y:w~;ΉRV}?__>CCfH|/ U߉wo>~t7NɑuW4 =abJsFÛ]Oop;+}'GMo8\H^\jMs(z>|_8<&L%/_ (keGL2TpB"{.iQgP2 n& rܲ4':$5;dvYx)|/Y}Ӭk',))s&wd@ T)Q rZ"'XF$4 M,mJs\])/]R&6렻Y(JZɃ.?8'4{4Mcmj<"/{&#.ph2$3O@! E#?Av#mn2 `'=Seg>GLe1=ӑNtQFjfj);Id j6jZXiv4m9r!vcC' EVВx^" :F22l'iHì-|PGd*Ix)lO%ibUUUޚql|Jq؞+aĎֶ )b߿v6MMrQݬh>IoAw5*AFnhnT݂(6 2t HҘ-+n<Q̣3܉HVc*kI*5C 'qeFgFL酑ݎH} ^1/D C& j<0p跮Zhx*kkˆ]oraCsޱǐ[9=XvՁ;eyΠ6|:wzb{?fnWCK>h-#;̡$VApp&8GL|aZ0%]U&^vD/#Bg >‰Yhc5Y$b*dd) [Д m- ܐ玣p~Q6SGfk*~ȩobB&Ful ݱkzY^aʝ;h t?qh3b-mqxXYMamHV5 N-HX4xI4!щzt <) + GdGZFdہctj ;Lol B:$Ց>Oqߥrk!Weq W2D)C(yB(dh~P);{uhxGQ"Y#v _旫r.&ޱD1&x߻>Q=d IpOY5ҵr.C X.8JJL6g /r> bڛ::\Vd,qHgg OD!k E^o)2)Ϋ"J}hK>/Y,g0XwsJ bfǫ>([ db"ZBޒau^]Ł<UZR`LR.iT:KRĖI1oHe0VGEb)p؉݃; _6~l֦a]m}Czbꏛ !`sZ$kX^8wקLny'_״y:oۊ|c"+k:=tqNuKi ^ITbe]oѣ|`Kz0-=cE豣4VlNƘY'BӑP;]: W;g5uڍ8#J@ZHѲ;Gv%N@َղpȨ"nGKi Q-ǎj~UDQ\P%n9ۍ>v /lIfk[}JXkyU{cWM=h_"~J!ˁi[L?)62KOL.O4VZf?iyHhPM#B@B"fb6)Y峵7ܕ+xz̪HD" f<]~yrzr{00! 
3 cGjs{&C]!4ʗp0)V|/-:Hf;LF3)Em#dg^N'Ag583QYg?YoK{9Ҏ܈'+jp5?kz0hi , TpCK޼5W=֔yWg TqvbeNĆ%@ 6d{d>B]LC ]i2VX k{/dLp IDf)DG)Mԛ XpO/(+>ȡbqJ6{]<7Dj{JW"RE\ 岂Ɨ6m+PkC5:z8XCWgJkY=z sAM%Rkÿӻ\JL6X=Z„2+o "y?Ux 0_; Q]n\F'^;[K*WR`p\f^H>hwkA q?k{{k2aЫ9 _6RU/sF뵿nl75jxzoT%99xǏCZ!UCK- r#O!kZ:~2#+1y&c]dgy>l:k-X52 P-U]2=*] "J2X2߬=7C!O7kyUMzT1>U)8Rt7yA)&:tɑ& ca)ł&(U!Ðag*Qe@Qޜ9Syk2D*T1hH=؜߮т`ܟ(ywZ"{@O MP"}8P bsbׁ%>#+3 b$!Br^Pj[@gv~3iPl-X F)Ån )*RB_^Ni4s$R΁lC΁y8&ExN1!GZpS0'P1EF DiI% $IRDTHe\&BV1EC %" QhN#]/{t6jhԩUmv1nuTg͵0U^=95Zia׌@l{֫:d(xиb~[}GjX2X&r39Ib+e%wYg9\5s׫ sMJ(4zyshԛqxFA`]:t^G/?2^^~q@289sch+n v]gloko }O -5/Z |_&w/Q~ v?tprDFE8v*`5iq|9ITb𩉜9e1 ^M \D6.q=ɰl'gqѓ%Q2ЮH(.qS(4yC p&/:=޷ 8,Ie %a< (4f1 0Q ,<{fS]O~YFRޫ f^iͩ<)0ҘS":u9 u$ۦwJAnW>~*FT1ט{UmI{e-#NL!hK$t9 ǫmѽ\g+2{0ٚl)t)/ATNE nn4\^J^1XVZ\EKq׆;SĦ DfHj}-S;}aʐndZ/n鈡Φ= :<0jpx08)&ⱀ N3n w Ǟbt"'(k5"T)TdAg0f!NߩpX`gԠqAG΄Y%Qc3эwYYZmA`֝.|ͿlS;zdǾ}#־+ҋ^`vVV ]H) Lmz_K <} )}SJqN fa$zM!{[rM{޶Hޞ LyH.%|^.gIs9WumTR/ ߵB !߮LlS\tIqXiD6!ssZ*"K ԉƿٸÿ|JY]W[͝84 )oxf{%4jԌ zoVz$M θV6*D84LTJJB7xh)SrnxȀ)H!lW;2 XG^}K;unb*AdK~H-2:kDwyD9-!i :g8街 {ap0yK)/tpZ9OvҲ4z,(Snsb{Y@ So nU }#@8kY|&[1R!cL|qCܗYWqݳv0KsG%I~ZΪS6J8\9JyhCтM&"+ .^כ3j1Mօ@̗ AP=CrN9 Q2@4{`CB0h\MXJ`T.(9bb*RH3m047E |<ZT.;ߧ45b}#!b &1oN*ib?6()!*QȁUNw`WGi,޸vQRX2h שXoymMد3 Do:\'(sTjM)Aһ~WԂMFlG&=3@"#ۛXCBeOYRFiK 7]^mP#Opļ< =#1,\;7[bW(y"i<1^|e[3m 7 4rCv/VQF2jܲXѬiNTeX+R)r)%IJn'kM;"a $-mxFcp"pΗrFQH́J/ ޯ>@ȃ 63W}d: s,`gCv\ℵT86Fr%̵g_p@0W~H41_:d }5H+Qۑ}UobC:t!}Bv8=Ke*q3"W@#= -Zul qjEz^R-9?~CHeuūSe_?~/x9Zz%tА${$J  /$HLQD48\DȘ0F((bC23}HK 姗$}D `BN [:]nGL\A 0Lϳ>?>4CsO yQcs r8Zڃ-cqLhANkC/r"VlD/m I~MYd1Ub7Z*\ID->d"ԭV*;qqn+"%B/YW LTΣ6['D:r=7BBS…m%ZN!;hftВ)eO/Y0(ˣx%g}L}+p`Nmya,bWPp&ܒI"7\D$>uCǩKpdpEmbs%jqE؀mDt΃idH>JExѼJEf BH;0$Jc,XUx"N&cʄoj"CHKN1+)vk~}F?W"ߝDjXHo Jwlȩ+q՜mZiUx#bѤW30i)Q(@<;:Mjs1r+sߝFY!/q=)_v&p^XD`0tȧG&F8>'tگzftqIh~o A7Tw.\Ś"+%R-9ʖ3;:o{la:W0}ߟӷQVy;YICg&CYzCi-քwDև t5zcas6]m^Z- 8/fmwхƒ5ZE|ix10$۝vαҤ Y[ȢRinJ Vm׮J!(v=nxH6$YLk[>ֺdL2SfO5O(cĒ_D/y=Lɤn?gL©6+d8N^-q֣yFӧ]_!WbP0b$ܚ|+okcr.i~׉%Tb4-qP4![;Z"w𯾩Z=@T%#|+QL0r,i6(o1b1}"T\Np5mȊŃj8M! 
|tvQUۛ!2F8sМyg{x Ӗzu ]0oa P%U q:iG/EВ Y\8K%Cܳ@\!pB(L q*ȡ gBkw9Eԛ\[p!4NPB :0B=Ef-]sXJgW!'gMZ B-Ur)lp?^N͓Web'j&au}ge4X2dG[^닙VuDɵν?:zZXܭY%.p4~w/t^AG%A=XO18T& ʜzVwx ty݋$#/|Ub<\z[an*n+[Hgp_z9gt圗 d K~L;bbBY c $AHqRE n.V05y\OgnPjqœ޽SH3 cXF<{0$#Oş _ nsQ0?@L2$&$`4<_ %$D xGEfLeV/'M$-6tY[P?@<+ ԎN4YP CU}#r^u Ɯl +]Ès {[v5W2CVvb''yFFqr4Bh4hNzPmN7ܰwA<#+tgFݝ&Ri~4Ҁwx σˌ6 }xNR04lS 'A4P?[!^yۆΊ ZПqFT?{q #] Xm 8k'!u'ZJ&A&uR4̐JW".]U]]e>/ Q=u ۧ~J_;4pӢ_K?%799m|?Gz#$[E _wŖiv7U:֑q0xwva4T,Q;+p֞FڮZ?w^ Xo&_sOaT߯>qNXjX9e Ww/7{YoB\m|MvPzoZ61@2ݛƾqܻ!/6gMPuA^l_k67؛kK=xAxǤ /L Ԛ*`K܋_i6ziؾz1x9 B셥Y' V$iQjG{J߷8wy;7H/͆t}iX/u Aԓ{[ohzP|>l ^nB\<ʼl4,xvXҭsv4k-J|k Y Ivb1^Ph~ P%7_KE;[Z{9,(A/+qf8ݱ^2ĆczyUF5@wjܿo#$WfvkJ/WZcSl_E`\x=%sZ3q3 m-3RݓxV.z?~N͞1jZ]uv]x|!mxMSz+R [4A㓐ת[ګ@iѸ.{Ю\B.x=rͼ鎽NTVFqÈQfdkvnwQ&z ю ]q[]٢es>|e??/?痓s]N_TtFDzKm 'o8ftF^+e* MHE!k\Z377# ɾ8I{:K~ڴ?7G9g׸>z=P1]fMՆW:Y+w)6Ӱ\ς8CwN2{O`r § "0W91/4,F2x-SywUeʕ2+Ţ|йrTA&%+̜ .Eɉekȵ1^JyYtd rȝ' x\ۻA9$9s"こ(rg %K2t:(N0ˋq1}:nߝ V mMDB.H t ꈌq؆jU^kez\GQx:F[D8iJ^Ƿ;R QZJsUs)6Ҭ(WBЈr$}\-d( aEAn%"u4r69?Z(c TQ彎@X FkgZ5&04@|4X6nC&#3Ʊ yOڐ'9펼uP$K ),Wp^{;:^ %cCK[tl^99;9KJW Ub A@崋&:˔MD.(baF"FyW$IeƢ32EtSchѲBl:ĒCPuu}ku52_ʙR&K)A$%A%ۄHF'J1-E&9&pVKR3+8Y6|ZqK;4wt^ʒ1XC$jE݈F3`(O$x;d̂2¹z'0)Ƶ8ݍaO Up&C}ik*A\Q 9G'lY0z1$IaG)j3&*+X.p8 ߜ%7&O.>.:~u &s.auM$I g&hY`o?ӌb[>&T$/\Oy#6o-bє(_+ w*RZ/0WK9 c#ס:G7`|;9a<~{4x;P8n37:gӁbj0IIYnta6`1nhq0od*10K lq&RM6TNyU)bLe5L<9ҪT;[|-*ƭVcmg„rwu`Hn%/GA1b \rP%!l\~%H( 9G') %FsŴmdmAqkҙW5ɱKrR%("$BmYGځ:(AQH.hAvҗ vpP .juB ܣpL&-EB:(~#JuY%\\\*:fy Yϔbp'cJđ 2zdN$\?i%"w٧oR&GO3bFqN1g~i4R.^mxLhJmT\N1`# ΁eQJU(/@ЅLv$A( d2>9阵[jT[BĭֵrϹ.Xp>b})"B^J*Lm*k&~5ߏQg_F(c JZi/V DeKsM 4綤q#\0L(t'F+Zz > tEY]`ɨ:?:i'9O{ 3##D +13JvudQhLJO{Qmod!C3xgV2HM֌ ʘ I8h%2T1Rh1(ї26 O;N Lo sFXB.0]vvW_lvjd׍&Iޠ$ɍ8j';v JjgpMHJ(t'JQG%k?zMx Q/72=u;e~&!O 6Fp4խl!+n}H^`C$ {ƨn3!!@鵝{X9 {4;^}2AuN}LhH&()4ȗgr9q=+mtepdd 8\׾nyM 0ˊ+:kbehectSEyMj B5x@ѕFVFR^W6@2kYm:lsnl$ZQɎGsZA[?&]%tN8n4mQ4DRgIJd]")Q6%9^fvg/??鞫| -~{ ld/^5)8Io ADS=_eG^c`e--F㔮SDFb8Ukر(!c) 7 74]2GNĶfdW. ı]&jBkIk۠MXO0tWV=@b"QפoJXήx (֐*n8oV2` i{ 5iUڙNgVUR͊wzΰK]DaHO(&nn{~! uȽ 2I c&s)505"ɪ:C=*ae w3Jl:0F]niIԺ袲<ϋj*0h-< yOJ3wL8z>\5\&5 guY`tVkEY>eN%΁ĝ(u;Q̒&xycf^S>zWc;SJ8:M}" YrL>!+[.07:⥏1Owm?ꄥOr1866q~QlRbGOt%6Y-Tɇ55[,x ;(T@unAb$ Qɇ<n}= bd=ͪeESBfj Zb ]qmV*t*f6iQܽRwKm0$2~ϑ EE pg>j>7 |)^´@OBw\֕Ie9?[ r%+Y!(Q)̨gx1iO}Dʁ6 o4yNTzSTfuЁ`x^3yrŏydIhq#Dב$ ƎBKAW&aO~YvO-=s1LAϻDp\'ws?0pG$yJ.qۻ H"OVГNG{b{72}e6ɲZSNf]{L'мoAi~png/8 a:M`ףd}МF6̊5?rr?zQnşJ_]|)/YZ~f`)ց]҉ /a2LL_2Bz4 FER.HFx^Y®1S c0 Mxܶ1']A6kXF+U^F_l-XBџVYUP2UXI.ق &kg<|0tkIrlnYn)mh[bB nӿ-VJGLJ N& 18:eP w*s:8M“p/*1XN ^zޝ]:|=8w^rۛ|wpi{8yI{ONzONߜ}Pi[>?_~})rKr+m/"+Dˇ!", X%;}|=Nɖ yc+øhoόQDh[pyy˘D,2V F\g6_k u.+YL5右 jڥ~ʣ1cf/_3UҘcfٛZVP$t :yLlUuRjD'*6]%xE.md:j1Q 9Jx7'lxnu> ouRv*EvddxR[z]럚H/gu%lN&]I ]d^]lL(Alq:n1Ql]ς16a',lA85aԈO%[ZĖzM]0>ܘU+Tf$z=o&riԶ5n\7C;c]7in0Rk#k| Щ7IiaI{~H}OUiX:`US~m1 ~Lxߧ<_JW|Eo]G5f|=ٶ8Tw$Y]֓ausra^lj"sgC (G +QkI]HwތLk8XV3Vm+ĕ m5t*p|otsW/oT 2v\qUGȬ% MUߊMD숦yUz[Rt["C͹Th]kʉ@;7QR"\%it{CYqbAZ[D y\*Ů[LqYFf*yTH)gfTK"L MV؟XeĄ](\)byg ڕ#r7&{HMW(cODBL2cPƝ^8$YgX7 c]^¡xqv'Br'/؈}L`WĿ2Uo6gw<;X73l̗bc0>('6e9cdCAN5i@>Þ!O@C864BL7$/}rRUSH/=C1w &/I%_ugHZѤ nD9hXKO g.# QO8@5c@&=AKX xM0B^xPqĕ. 
Eb]F4Wed*䔖@7tC"fL6 ƾM]]a#R{ԲtQJ<,D[<$6L:%<&2'"1%-Li~vX[yK}Ƨ`,Y+g~JHn'Bf"u/!2_:M`5bp`1d5(rM2euDPD31|?g R.2( }7z曜t@"tW[ʪg:ѯ_ݮuzXCaŋe(T1;/pdUAía!4""&)\qxdvrLk曊ljO\S9)%KO|i}|{ZC|kz,\W<մnb1YX)-ֻf88.w%4',fyt802Il*dGtW˔.1貀k 4K=q^._Eyy˒ BRR2﭅Z%0ef.J#gOff;d@%ABS4{4â Li`$7q*Ipu\2NDy3](A&*D1Q^:a|g1WP0Ws AY:4:3*9`2U4%FI=F:_njw*]bV%%cIAvHtZ hy;bnb0MG pE?YP5+˹l#^4ߏ.p*f,Ox(;s] ŮR>!X'~2)b}6fxc-Z}u 2$J|rZ)ZogʽY)wp{iNW^/n>571UVџnpesl^1<{D*PƎoB&r92[ 34"g<#/=;R@[55#Spܥ^qXVL_O]Az*yߔyߔyߔyߴiΔ1~ǬEN9n,̙$ěuj%c5%Ck_$~^.^ ۿ~}7aq_.XNxK|T9~Z&5ќ'nf|W)oVN{9IOwաCsy?>_\e[c[ 56>[v*@VZSƎ{&j0\(jh5w3oj"u>BARO8_nn 7&UsECDƵ5FK*.?Ve{{\aid.~\؅ OJ)[W7N֏.fP?=Uvi^~X^~^^ |yQ,18=ٜOɍ4̢adLI7^ӆrC|{qc)rzTJ!Dq(GdJj Q G[f pFA ӎMLS眔pA Wm i ?z ɋ=XWdXKfY)ݔf3ciĵi ΄ c `Un2Xc5rͺmNJndNҞb[=pqsj鋂qy;cQ8<09iPc'+L lݎ*k`-F4B:;/ٷEr"bd&967%Xd;=faƔe D2 V3h*`<{fu6N6ɿC\ꌏtSHٺ3ep[pxHjťӬa\iL#F6*+Zkm-@bªѤRނ{㦆I('ʊ՚x#%2:Ҡw}h 3D "W)J}I׮oWqFd(HJҙcL1*G.ՙ@ *woʑ ?:|wֺ4$KrOʵ;Cpr2T&PyCu׊Z"U&]X9UeZ*% l$A1HRog R2\$'^7;t9aGG5)N<@Sn!wVv~P V=ª/Gd52Wꓻgo/czê huXZ8B \*yyw6k8>bu;;.>Ǹ6U6pI@b1L/Op8uͣ\3:Iz+x ىL *k\\FAQw駳q?N;GX #8#&b5&͉ f=Y"6' ^F=[oO#U,ۄm̻U;_dSuNPjfQ| ڒƶ1;)`w P㻗Wxђ 붑(v+ݵt<*2ZYM5+5~sԺTݏ*Ԯs.'O'wE'ÌVg-__%4/oTiMHjsw-oU댾͓?ۖ(ׇvM!q)hىcɜ!__߿|b9\vVVVCTc>u`Y2Vx hNIfHs"|#S$:>>Yk+˚eM`&hqqbcd_{7"Xj2\}]%H_7.Dnn͓~jSWxih#`?|J^x¥nn>^33a@tT է`7a,9a_(]@Dюe,>/ƍ~[fϳVdhܗ׿e;d(qY*{]$kٽJZZ]*ѭC{L|U5J-ִ ګ,e䁴|VˇT1!um;6ߞp} 4Xvi!R-֝dAC\iflfx4{#idW/vȶYQP4з%t4^O]Yuĥ/z<_ 槊SI>ǘۮT 6 o0o t!6 inLZ?,'=5R1g>Ò^?hbxC?,'=5dkcSW/.a,0kC-r-J2^n]FpR#,RRҮ/+_ݮͦ ź3c95[ hxIflι;ي y**N2jA 1 swA2y#KZNBvekwe[ұ"A4ˉdӫx}q K(ג0?{5U`W*)4 hXx[LG_,}TFUq}هE 0zh2Tr`/~>IC^___\}6A^-_}Nz\-]AHy `B@".,BZՈyB-WT4_VmSH{L>ܙ!lڋLxHk6G}{3ͯ2] rJϕ.2f{7őF00W$21MJRaf,XejO(Ay* s ldQ)f1EYJRp&D&9"| ?oT a*`DE؇Z `TMvp:JGMYf'/TT8O)D0yFVw8` H$%KJ(M! Q+0i f^'EO AlTn2N0hB:X$PUU|*b,}`RBa 5V9`Yyцc TJ /RYgY GX rgbs8`NZUc)܍// ~2-?OJ{};[Lg-KwFK~a|"pm Aqݻ+Z\jέR]]PyXQO8؞6KϺ&=tN` q!=Ñs`p)zpP ǚuQң. d-Ѵ3nwIѰPBG.VըhAI$-#Z }.V"P4ʇՙՈa0ӋSe`.n ]\ A7 HGfnJ7 F7sߦf\eu(VW?*qlR[_II-4̛!ѩ`%T 2٧S ˉN*;TLwfK<2SgNWL`UTnb Ʊ2i {(o*ZOإKaW̘ϣ%?ĀoU{I_e+ "SI&pОzaݤ_(50 ʥlc6鋿e08AU﷓uG/ aĬ}\R7G9q "ѭ/dNiԥβEGuCkZ: hsR1;Źnmm4B-`-?nBYuhLyS km^aa.Db# rZl|^%AE ,? ק(irf;å>kE'Yw[)k%ThսsL AIՑS,x7N:)#Lv˧鯟OV@.CpSY73 c4 SdzojM].>9b:gHhBjl BDA\]&.^:|xghh%4>x#"jrlkҜK&a1~Qh{13%$We'$,Fd@(g7YH NUL4*0iI969X.''uxʼnTXv[5.ɻs.\_C{H-DΌrJދ?esnU²¸M<:Lv>}䟎22Z4됬gW@aXj7'v$u]sެ~MOk7tnQNSs g +$j4@xHfC^587 ׌t?͓F 1 I5/ Be-ZYMnL"(*wFU8G 1Iǰ_xw1gF7cВtc3j($H?/>R!]ckLϼkLϲk`%}AjXH$s0ŧt7Dڥ8判qJ;8X{WZ2K?j6h19p( "BH[äRـ4Hcq"UB%[(U 9d J)2ʃN•28HKU gL5ꤣ~'2GY;IWv $0Ul9%Q80_]jQКtJ A瞲c\)e2|?^2i&r^6IׯѸ畍\EV|M?~:7?ۍ?ܙ!lQsaQ@'7O.޶ ` ^IWwnqr }X޾N7t0s򷰢T-S+Vw1͘b%+rYl2\?Ɗũ-#~ǩ|kXƇRi|hy [X"kUTyu,k8UUUOu`J?J?V l g3vG6rU}3>K5VPDܽ|έ@-5nB}8zB{<)P2B/͓{q2lȋGR/e@Ҹܧ0q(W"iPmD0=< NIDlX@5JG5+TJ T1>*qB =W p_J,LfPq1ITs>\TM[5AlGg2ezR)%MlD5~8\KHtF*ȆeLӹ*)ūA0C7o?b9A?{Ƒ NZŀA8¨Ą4$eǻH]Zb,)ن!RU[:ucWfz>GZ8{vQ$5Qc2^C0ůbz=0_l2{Ycv<JHQr!9cBD[~ڱ]>/GXYxǘC{v<1t@oHCΦ]d1X8P f,8(&̬7%jrRIM1 pñ7Yf75=g;Sdgt Ɏ)_띲nj/_-OnY?ki>jEK"&4+{b1+wЬ%ڟ(eWO,'angXdiuREɭEgڍ|7apxOZ9*J!vdM78FT7UDEA XIRv7^CtKՃ*my' IwFjA-CV&DVeov:r*[[34%VYH8SKPP|UCdI5[E궔+e3\[6z.b*=vp \IdKD>r5IL "b- $ Mh ^dDr_Jܷ %B/}a,F<iց$KdYo-:KyKZy__I% b}P|*Si IM5ϸhSXb)Q,&ڷoRC뎒,5.'ޚ޷^B[kP9JIu4e;AZ}lƃٺj@efb hMێz+]+IUF5e׻g [SE}q~(8R%;@:%󰺫A#xvܺ W/{:6OX[)~GTɈ̈ϺB~#MޒѭGEgM!7=8 D$ ܯ*0mTj.l|9 UÀW@Ȃ}M^L5WA`J}_Xǟ`5YIsMC"bjidR) ʧ4yP36Ύ<(ށ@Qڭqg~?~DkK;޲~<`lTw Ά NX[D8~hvoƃm~_f[SK=ѸMFRg9qp+y>R`1k[ی? 
5,ሷD$AbLűZb!e㊿_ dM}TpN0#/П"evACgCFTR7}hsBz*fjk %WЖr-B6NhYE%)Aq8I'HXC?ϵHuᣗ:F8ЀZ>eگi p,k[\p!:?rX: +ڐ'!9^(}׿P2A\"{C?vsgttH'VCnB97z2dgbձ?AMp Y `QsJ$[lS;Q 5&&n$=8klҥpcÛEG"[SQkAN/ x):ZgiV=IK ǥY yFIDZII]zAd :4e-pI`5EaugHLKu r'Sp?SA)E' mUlaAS8#內L6|2/ Wf|bx; f<"L\(|v<HuDOes(t|}XOR'_2I>p`tO? 0G{ #р7ꇃ0^ 7.}s=}5p> .&NL>79f.2 e`&OKcT܀2fhr=|o>h|bP9^SFS aRTג nR_!a8L~_|i~?,ͯ98S f< 뚘rNvJ/@/7'mAq{r@dGWrKby]W#zS"z3My+S}"H5jTX8REVT*y|3;KB\b1]#}!o_o`E0fX6?FE>akVjp-vH$q#kwCF$G8_pʸ(wrʉ,psYE|P2y2>]!O \!IP?G1Y2_@€cF~MZ`'}9)S Q:M 6QJٸ7?f*i_ߣfHko~\WXZo&s_nt08XǕ`0 XLL=]p`LZ"ƑܥӌB6 5S;קGON<  3`5 G.Xರł,,<zz!zsxT0 f J0L0!*zz1񩦏A߽Wu(w* #B}N+*l)5z~?~"RQ Rܽz >zv:Mw 3i!.@g $H*=l;~.G=C|^~#G Q6%V֘?ln [0SFJ7&lMC\ `+-:)&?flqHVeb (Q拨¸9y-O9nstDy4ц4\zIu95@9qiHFKƑA%L304RK-?Bq46[ɨ?. DÌt bD8[h""Fp Di0@ w),J%&vv;B(%c |qm=߸`Az#Sja7:NBBh) ?_=eZkCkR3rKuOmfO7˹A˿BvF6{h2!Csmcr cp G l ߂I;**+S[ǽ nbZᙙwYwwg&珽{r4Yh+,?nƷpez Am_W'ZM>,:Uh=f- 6DQ 9d["B;kòd0IAmMoBژ̯G]Յ/j[GD&T2K=5/~`&0&|1w9g0 J41SDpir(My0SĨcج."$$kI^}+k,;\[HRFcT =bpFѨ@Qtk3 {)I}Zfؠ˟Jyy9[(•iiܦ`bҲ)Mţ>Y _\nkMňpy YӰd .vzI^T%MqlƊjBXN91rC "J4mȢ?磋zp-}.w ?4 }vU}aZUuv܈eHb>.:f !Rzp=Z|݅TscWfz>G*9{0[j_,>=LGk{ceYL['Hq$_v{1> ]v{Lcjz>ՠV-{$)$J(@](2^fdDddQ&$T*Q %<;6E!<宥[4"Zvn`N?L"'Dִ?Je&1yeK6i4D5T}HnLpf} N2JadRu ,Xc؞dw|DV(rR۷@ĈK-/ejH3N r}\޹iyBh~v$:vn} >ͣP0.B)LŚh3U[dӁM%Xn*yrڕzsDw_K(WN Y>8Iռ>Ԕ ^i9*)GoBy!oQ{WA۴T^Z>~!^t#WO__/ߟElr{?/s..0:'Xs,;M tnudw;Jٿ|a} FD&p.$ u;ԮPר_c5ґQ>_U]o L8o,$hҕ^)PHڡtt+jL[(ٲ;(eCaYwRO %L%Н)Ge D`[=6oќOǴyXv=,HP~T)>UXtdž[]ΧJpj.9PϕUp!=ϴcys%GRi"vwM/6#b^8o]tv3uY㻩ތ?)^ `ǪIKԚ[!+hýȕK1A‚# B˘p[=QlcyB=~%jDbxӝBxI+ߛ`hŅ T2R;H1:)%],/.gY,c?*mb,tR׼RWVh]jJ)+/ivPbdղ+w?]VF"S8kf҈Lsf-ǖW(٦uKDu5kȦ⤧ hRUCZjEaIIJL$Hpʇ5ks 0X6|wpax}Ħ;d.:EIpxo(w${u >fPJdR[&4XTh4jϝrUn`I!JG"^Tl\|l__n.*\eLc3B pSP8xVKg5Q9p~Y ``/uJPD4315MJRq(r0QS1pDyJrp1E;%+ 96έk"5ri(0N٦R)=(BpN1$!@Q!pFsű5 -F(wrWX8~)E{R(a,p.AIĉuqd9%yфbɋq\\Ğ0Ey .WbIJ@5ε\Xd1-*s `N_,O"xrﮨ)0ٹ:Wc%wgxn%g`J hZ9(e+\J B GGKn*@RʛX B|57.z:@Y?Jx6$ٍl) 5jٍ>xvq팢G!9jk0Y|oۖ@ɧfӛ~:[OU{rY\a0hBy6 FX-y^ הݧ/ :M6-ч,Y8D鿧+_Z+C@ TK~vʧw (0neFcStFw0zg ҍ`ED9m50*3/zP_/3O8 H#Ci1t*;65$= &s?xj.9!Ca)BɨTL^4K͹ 3)/%r*4 +2V6rX,5r|gd ,Ov+WHb1uV> W"9 )sEN(TI$T4DNEs'w5AUXj>> a<0f y#zbJ (Sx+XR9փ >"a[pఱH% RKݞ(y:Dw;g`80ޞ,iv9/5р5Zi0Tj:b䳱|y`m$ ~0tP/WK+q;Nyg's˼̊zKt(8G<Ls N}&]\uL6N`#VJ'V,Z9UB!V*!GdGP۰mbWDE֏j5u F; bT+͉̀lQDzMs,rDN S9EB1bXvb@D" $sɣt/g~;0yNyej#8C8pX}^~މ˴o%̽zXƐe&y:ɫ_XKR)W9M(Ƙt:Vvog0jOuu j<:It?at~> b\g~{ _r&l{X RZ,-`y 3kh{> )w/Ь?:⊿u 9z^ߘXǞew-IKSG"f~ۅjw=uu/ >2fy˜lHZO'Ǚ%.ggI 36neP3N(Tgf$e'n;ބXJb0ExšՈ!eUPbD⩥7];F[ Di!n_R̈́=ՂQv ="6snQ2ڷ6cU7WΩr"B.s-2& 8 r`'H )>y/]E$hN_&0zx X{Udt%EgGbuueH%bDG؏1:mQS4Z3^R{7TlCH(h1ue/ ͵i>+kN‚{U$Ivݘ V<ƼZ[QGʺp??ݕ񯇿"eG &(Dab܎|oNPZ/O>ZY%hiyЌ=K;1F qͤl]5{(#ujBb2cˡlpQ, cbE]Vfr]TGI fg"8o6, $ yoeJq'vw?frTq>r:{?Lw\Õ30If XIes{]#o}y%iNs@czѱz:v6ݺՃXL,O_gS8Ds8v}-őQ#-#Odz|Qx޵#>ۋ vnÇmѕH*WVf7'e9Nh,G!3 GmӬ0> cҽu߇Kӣ80}{~Yhs.i:Ka;3$ mc7ݽ'̈́O?j]|3eM * .rk\䖜}H˘E9hI"4] (]_~wfdɤN!viCf1!QFAqdXal,U( N堛rPʀ.NʍˍrcK)D1Fkc1bı #BLYJ&CkUT =]RArǦ$i q Own(m>e׀f>'-y{4-YA'& 2+L!SUkxL€<="X4AD",x(D̞;4(]ip1+ Xmdl 'őTHG7 fH9$q$d!ZACBB5BKQ_e@<|ZA.+c ]H8E%Lp,1c"eօ)(!2a P񃧦&YS+0\ "jC'5!XLA'!H ;X*T$ )N)0"^N [#XL&T^2s71/Oߜ.IUL,xӟ^2uI2&0nWT 3#qOLyu/aU3VlQut \S 㫩SI% Z(d2REkJƈaHūS v(z!HļDdT3)X.+7aN.A;7zi6b&D*Të0=l'CDQ۫.2EMҵ~@G_ e኱LċbN(Ԡ\k40.,?̓S~`[ [ý5a|k0j L MO} V/^(Cz[ 25dWn5n,>эeyR /hc!ս_I,֟6/xGTE;s??frSuz)]RoY)Äs$+oox\rs| 8u$+Ln e ۆŇW4h&$\c鸋x?늻XZoӘ*ZwOݍ>"6<.myH;U7{h4 9[uDŽGWOg8+*E UD16#Y=?HE-05+ D&PZz/-Bn x\=zJ%T^b ~9ضLקǑ2#l }o(2YĹ2a,gJ*]_#-\5 u lpkFED2,ed86 I!]CD.L/CaFSbc[p%9h aqjk3.Bޚu8?&BaDZ=lrkƵ#̩ ڕl0WExx I7ZôrI"IX!l ̰,FpMp2Ds1L2,B Y뵂9B `y9w2=|P'b1'9AwG2"! 
S E8&u)"(6<:&OtwՠN8g;8P |}XY'C"Z:h;4]ЪcF1h`]Dy' %e}ޤVpE\9砖$$2FPy3 E"pal1 aGb. VQ/EH7I`ds GHHa*=rP\(e&gG4kR 4)](TG0ʝR?RSz{1aF%KXzRU3íN0#L{mgF`݄ HoW")iiE~ иww*2&jw]TY BI.5e!V XfVFA<}]siB{8}Fnaᥳ Η.޼}ɢ"1JN_ߛ8 zt4CDDH0^wc F,yG{p5M?#3Xn{M(8SJӍ?0J%zt- w_ʘB5UH__FGQzUMBx~ϻŜAG7 ,يPX[/{E~lCݲ)e] E7 XlI`w) "rk⦿cm_|(̧Zb_Vrų<)ޖUT J6Xs4iWKT;#s6g-1 5P{rJ}{7I̗t'I^<УKӣ'?iG{y unKa]fr䔍l6:6~ڌB}aEamp{:=W~v/bp_y irr*C*9:6bt^LO#lsq~0aq߻e7_w|UE0oG3=PӠϱf\w0:놃 ]f=B6?>!{Q>Y4F$4 c$G?qqt8?NC)z\^0pi2G`]4@i'e 6x?;{볿í~d^$dp}@Aܶj{ȥz~ k_{,% eK?'˗I?co+( ӛџ{'oO?_kfT-i %[<[֬%e[.x+nEPɸ ce5~*xp? U쏢A5I{?xLd䈾/Jyd͏ks7_ȯ/\q:_]ڼ3*"ey͞"r2OIYoˊ< }>k:w AV͛ &@3'0xvޖ7Bȕ_afw_A+}m_7s@Qė(vQ?\G)@(*~C;wf_qC,w Hiu5s/wi_go)+جfk%v^(4{u3VO@7PotQvsƯ+7;9Srg *o? /];6k}73@W怾km2M5.$W)Lo)8@U kM;smڮ?AӺ# ͧ:s|҃2;ux}}h nN~yeT髛c Ыy<K0~&{jO9v0N?`>aUQK{ ?cvg8n{_Mf&l8lgx+Pyu&@}ɦi6Ny "agm*K`MMq饳͂ۮqwo \8k|/=.#y3x:9\y)UrݕfL*6ȫh*~^\ ޝDm͔|vP6YX%`ݷXݶbcY36]+`ݬ9FP@m>)?,¸ 5޷\U!POo7GQJya ?%w>(73wzOW)8լֈfks:bIG>-4ri:c*߾yZ4vrn6/mcb^>ml697FNf*U#.e^st~Χ 94&0L݌X.4pq (ȘʴǓ3k0Ii5~>"}~^<=r X1wzG32rP8ph߽ytn?ږ: |{Zbo?|Os ͗S>e&gLx9Ae (8y|@ kk%2U2CNm4:Cd Dt(kƹ _g4%11! !U02.rMR>~Y%˙3y\zͥb-O$0iq{ztԕ9'?? ?,#-,W~_l˄M0#p=R }Z?S0)K'* @eeHs8s2 rK S|(zǵáא[K0=T&S\ s1@F@ FbȘƽ0z>J> & xKNINŽC+7P: 'S^hE+{YI[iC)XG顔nϭpW|t6@  B%%F|rK(QzK>oӋjPUS -)/+GO #c$#L%*Vc kN YR`B!%]薺_'S&h#ɽ?~ǿK& q[>*+_e|w܂缎sBzYN{;dɣBh0soz&oy:G$խXIl + QZLO)¶G4v z/ϛ(» yllh n=&zaҌ^%#^>>ÌO/?\MϦM/ ٣X!fO?3ʩ󭁆縸60 z,kqjl&c,&ާJ+c2ڷZ׷˔n66.Jp c8 O0)X&p\`ͭ^& b-8fqΡb]Vv ;ko}{ݦ|bmTL4Fhҽ1L(9]_ei&똣hErq5zr61}h8H챓3#S<]v[Yw5/$ SOӭCItvu;@482%PiQ|32fbpQ=F?5Tȣ1Fߨ=2%7*VNh[ mI$>$,+}Y9VX!e֋T:)^E.鴖8# \"ŭQU`."ȭŒDLg[`՝zWO5,SkI2+ԑ{zӅ歑a}}GO30H&$/bq0gUͽyY> V 4T %$r*cCJ%$`LzfaÙ{Ym$iD]7=}:ٓdSaj[K2-rxK?Hh>VղlfݐRZV |(_Zᾍgd&H7' N2/+'_&Ji Q@X1p JW8+!(P'&Pn(:CX%/cbXEErsmjb)xCeDQHh* zB㽶dfaqRY|EjyGZqR ɺ3\s{4'F!P5]{ov6>_~RCPiǃQJJ 2(xUpg9\\ƀl|~-> iNDg--=z5V!=(1SU~j.us+T4IQ X2 * HQ1D&MH\$+gղGMMzUIeIlb]!VhH?UzTJ!R"qT`5.?}b-$ d{TAaq?wې##ln,X# (6X_t :^AjUI6>1F"X?b}w'&iP-U<޿% 5mT̶t6"6~p ye$}`-'ul@63^,&3 Oϋ| 7&'fkņZБVv`E,NJ fdXXы  e}6q8mf?C7G_HgrPjAyo,7,qlmn[mn|?znXu|9e~7Dݹ/2sV]_^.}΀?{$Ww,,E FlqK)vKl,k>~ˡ9}a <^^fǔR07 pETIV>'ْ-g'!#P"Y[ ^OxT{'7GE?"7z{B:FY֓vF~ Y5#~[V=O|gط?\[\n2#PY[88YSE*<-|,~ Ud-g椏h#wor9$&ZeT-87EIhDɂY휸9hR*c +VYйQIz-8-=upA("\jXH4Gܷë`O8䪒{Y} v[& &OsYY%"w`zkz"VfS psj#+#hnJNaVȖRHjE@jB;k{x Z1d]7 ,$hd0ςb!A mc% bHs&=Ƙ F0ܐA\Uj.)$JƈɧvK]3kp!EFLG8*D+Ҡ"ڵÌŸ©\%kGN5CLЧѬr$ pFBm! z{_;#K2LEVիhޠ+d?9uX+vwo2B`#FYiFNLouߖO9FuF ǘ0ѷ~ߨRikAVpn%7d~ߤ{f-});P'miq0emiK`մ%X]"`st][_lVjH_V3Av" ٸ?;%VANnT[;a}/#źt6hwFU$5>c &R@n0}\j"H|U"@ADBABO34fL>0,*P}A5/tHO=M(JY1pOpM> 0S r0wl| p}mLn[:4-3xFlfM,޶j˳g\wKqw,΁}ju/bf@nc b /Å YGPYB2RGDžuey(bIdLc_%E0 RhP[ rZ3AƐ.DABܺ7+W@ v"h=Z.dW2OxS(l?}WA̳r4n1/6d,KBi\#hIF4'eۉT+ 8*n|Vwd5&u{ƒIS8S?7 P b/t韃ftwuNJiA<O"~3*fN>wn$BR~gt8=He'gwW Mb0ZkC8>PrK&5Ei}z, u!2ւNӅh@yc)'Z5.2XZA)$%Ԭ񃩆4[wf4OCO-D WZ)#۔/lAF%q1H'HA1[P'~xR(x?')>n6ڽDZֈvqNNK$zZo:E37<20zyXv~gH-N HD3,ˮԳ{1',Wg;q؄LF[;<* +`SPW~kAfu,1;攧H0}>&y)lOGnK^~2Szccf)F̼umoxX96$!߹֒BSn7Q*Gnu1(#:uQEє~jvkBBs=XVxo3ȒNcm`ޚ 8LU8hZC*A99RJ25t ]׿6K&)I&`|c21B q' H@3K=̨HVr"QQEck=;}@vSp.ei|'.1ֽ~-YE1^%(J Flb; |@ԑ4Z*@~#VjI, ihxCbÞ =<ip$Vak`uYZYz8H,mLa9/߮2eLaKF uօR_ `XĕYJOՆI4J7wcv5 c0d0ǣa&4V:X2IBO*k!c,͂ L=@#AyxM?6LdA9>,ӭ`y.d/U2~`nrwqy61-18@8iNuBBN1H>ؓpÁp){@ͻ=^ypI Fۜ;Xle]m6qIaf*tf4MefeHbZ-]ՐPNV[l(pG B-m`<:Lm%#K&P6=S."<:tJ=MSD] QUf qm8W};zxN8Ɣ <Ⴛ4ɔ^A!ȈHDU[cmk!$*!B!SVXJak{EZp6[^~6eц sS;Vu!K)/(’eF ll*C>Ts̋W۬ZljqkW@S;֮ :ruҘ*Hx0il|,& VXkV8Q;|Rh\sVq3sG .5f(V;`81$)"Bƌu"fbD |]76w,8rhg.0"#% X$Gо]wH#}͌:B41YP;-Ѡm"G"a"Q3ǰM" eg-{uu+)/ *@wSMA_7ש)(X2duQ%x{!)%t1%\Dú:y]Fڭ.eD;h# ;n]%[20N@Ip2lȇ2*}3Ѷ^#MxkHew~7I r@? 
SFZ; ̙̂z ͊(TLh)6n J^m\i\mkDr%I-~O IgYI?iy[>[c -M.Z1`\6rLNݹ~F*qT"QVf{z,{xq#Z8@O4FooO1 %P ,\7vp#pYȨ$T!]Cq?:D3IưL Zh,S/\H&ݻrZD!0ysA[!CF-PJcE"8316: X`mE[Kh IHADq|cfh`ZX $C3$4)Չ($!J Y^$!HQ!Ye896BFL;B Si$z/?mNeEfEBDA]`A(M”UBGk Qpd,il t8 4`5&+ELnȌ7#˿pmKiC* IYK4 @&T*7JAJ- q?z.eiR0="~2ܦ9qde .1VyQx-wԚG9HVqxw5?楪06N??G >ǘf@h? ៍ 'pTEZ,Oyn4Ns$cĈ)KRE]˛W?tlR+'#~ ABI?sJ+ALyhРK4iD,`Z ބX]gqLm 6%&?4"u ޓx#+-Í-8-ݥf: ޕ>q#e_6[pzO[(U.;T#"$eǻ})ɡM1_7nZ%W1`+t*p^nRy7ߧ+9߷~;oq2uKDc[E1QG(k#SCsUHo@ipc"!xɋ{e>{1!g0p~>^?3@^bNҋvxy%߇ɚ>8z[fBRKj_[[Z^dW^PND.87TpرrfW걷 @1U9•Q4v q%)yU3A|<R'>%'$2U$\JG%@?D@b eP!$=A,/=R iB|c2p5O Õ})i--vB\ʯE >g_bӟ!<ˀSt&c,B]DcԡLP:-e˭[-cO3^ƻXpƻyE~jnNF]pTĮٗ+{!X+{ +zU"]{|k*]J/iLN^JdZ p ^{VAp;Isͽ!@t[5$lw$,B )6K9 +)[C=&! WBs(Pn578Ĥ+nvX U6AP* 8@K6Bp]jز=Tx6N]wi);r GXtPUpPUnPtOɠY"2DhM:Rp)B{3: 䆍ȃ yo2pH㎙ Lx'U^ƃ[gn2ڿ"і]\5Zpn4ۃ0 h2~^{ #L%ó{rO. {Pų+ ^ Kn/R ç?QuU&dPqb5e(5N^b3:vϴ jnbC)IJXb q.I}^bi?r:=3~]>1z~^ߣi}~pÌ>~V?b<˹gŹv1u4[sۜpsGv2Y |f[_1Un c8"bL@ϼ>e!YOpoFH-E 5DHU* D`w7C{)2 Y'lcKpJd̪@BE)e aZ3P IJtEre&!_1y.GTo. TK)s3CvdD9Y"a$J{HB eB)LnEH)0@"Rh>3r *Cg0N r8`fpv E3E\QF,7PLt5a'|-X%B>I.!)vH臹kRNH6(•KuE3-vjZTu\~U Ukv#WvTtR9'rh̐Cs!( ?(6-vPeFUPG1g*G!+K/EKC;#eYà %I2b2 #jnG#ȥaRMkkC)@)۴ D`Z_pm &J,)YպH{\(ăC \<(HOY9gM`Ӎ#==M i8{1ȺENڿMuf4| vi`hLΏ}?Ӳ>cNЕ܌?k\\,fxnf|.Z1{@cUBOˢf\X9_=h%%4 `ku(quAT}Fv.1nՋTnuH7.2Cd _KEe*7|Qi.;TATooT w[< SA)olJ~,߈6|06SOoA%=A|`u|^ fT$|z 5CKPf!p0x Dh{YsN)Mfia{'^` (@+"=fcWD-^wU;AͷTDWWL7gtr`m'7y .V'P0@W+!e* -M hY7(߲&nH}Iaמ^GB" sCJ K [dTJG1„.M&߱q&&r-QAE5Rr )wcC11iKm jӪ$8Zղ746v @KFwlI@=q3ZT/ծm190JN0&׌00L Ì@51BA~]o3ogowcuD0{}emz&~aiԉ*i+zw1[;ldv2G fipbJy)Jot]E@[fTy;*i&쵟Rtdm$PRJ\Bŀ $In'GM!=\.]e\P3C| L3{A+>/#&}*ɘl*WY~|\u qrDށ4u4FlNJz㤐!|{!X >/)#p*1V(u%wO&18A_k!& IĔ $BSSR@jA9 `1 jO)a$8I ai,B15 4vu[nFPFǻ}$@zNyƉY*Wj^:79PvW'#ݪ<}ٷL^TPD.ђ4nRCn6QIk]&a:2Os :&پo->zE?,L.n0q|C1ܟ藟ozQڟLgŻ?oa@Mgߌ]{O2ݩ!$=^ "/0$!Lq/ʮ_߆wqWB0ʡ"2;9GXC2ą4"Ô pHjvz4! 
aK$E*rgH$2U.O .,ut\JĹ1 ͕QJSGEE؉e1,aS(\ k̤R1Sn RCb,F\$q)UR.05m42`;%[ZK75 ?}rͳKJL#wZ>]=|4z8pm"nS_6H2!J `h=PR,g"6yz5`U4΂eC 3?ZQth^ʿuXwΖ 3E9"/7>anJ/zcgWIпLD[(&X%'PI/zfG՞=u~8؅\pwo^tݫk]o`7?uwϯ^}몣^]:B+7o߿0͏ODV[wnuޭV͙ ȗ~jZ2!Oށ}Q/נQOvۨН%vߨ ") V8g%tN:~|:W%kDvKH:KH-Es7g [unW#)[7pT!no7z@puy 2.x)n)p)Zޤlr9pbP`JHkIX `$2L'\rl 4V-sD(sx٣"fe |Y#*!FJZ`RnG-S56U )"LtֆeO%!msixX\q޺u&QT/7tvC!/%%i,($X^rSkyZ-aBԞ90Ȩu0s=>x¬LT9&]moIr+$2`w0p=x7wALhIT<,{(rșÖ9au6Ug˷l|0΀>h5uhH Y$KLߴz寯"A9aO;rfD!1ݑRhNr;uD4j{-RE|4.DOK?y͵ Z s%_+)1̜l9@d!GV=~fӸ7 >mc`p{`O;$3zš+L8 DjEd&|Ukl|k%Q',()ݮ$lGA/NkǪ#΃]LXOKI #*/ܱ C]ؗ/=}iia]ÚcG/GubSZ@LG'}M &;)w7aI݉ Iԩݍ, ؛Uxn}omkat^w >'&׵x`@⺒ =ZC`#爫]`;Z|w>vHfP8%zv, нuguJH;r[`%NTO֝L6^WZ^^'&Fc.M.ܬ<$Ɋ6U*ovXQtX12 }&4 |UE(^G둲mu4 Ywjge۽tuy&GI*NP3(ȢV A~@ =P6%@bQ逩(*It*%̬dP@]X.R~.bt/oŴr8ٖSD̟RNiL \D̵-aGQ# !IoɌʙTd[˭^D0Tu~w!_S}kXx&%W R&[du^#\7DDG*n22 h8fcÑ׍8:<݈wqrMJ]^;dQdo]K'Z W7~9mSW;.:V]-:ntՙSPqՙu鯲Bw*9:W}t)~CWAF9{u@(=$q[G__ͷvխN'/mw^YwGb^ͰY:_,k+Z̓_I,W/6Jm`P^><]qXNܜtMYQ'dыuGkY_\]V0kA\?;r:8&1BAo%m; Ct/LږԬݤ;/@gEQʴU޷v?Q!{`a_O2JA<(5ك@<#Z ujʇ|x?h%Ic}G 8To$zñR{J/^K}IdMSȣnF|?~)H[}BӎjSO)nZvpG]W\N&)U.B%ѤLTT ,dXi1{g¡zuYHQIOe )(ڐšIdaYQ齸r}5wݥzx7VsMl;Z7<~j j}V?iGk\ΕR|quubj3fҿ{i&BNm~'@2\,j6t'o7ɵedm{mWr"qS ܘ=i|syzvSd1c LYu zkMi(OӋ 2 "S#ʺ($2KlP [KHIadxu(R??{/?ū Li5{Fg-^+ }u_O)V7H%4N;xgm~*a_, T%D𱰛|^dRѱZqΨw%&t1\<&%dJ쒊bgOtL%DHUAyv+"Xd-3jFmNGa$-ڤ \~s'!IN 㝜Hơc//Ff8Nj-VuYMRGIfo9B^`(W'A!< `V`YC9Roj_K.Qd߁,ZI60iEEJV:?(d}85lH9=h61 A(XXEd B)B0< `*45lHI1j?+ETB$kjY ($ >(g,$ 1]V"߄uV2ǂ Q%5iA)%\"'f0)VP ΢%~ ɇ/i7N Uװu+Gtk,;?oeT`׳?\Bj-S몞uu[]V7iWY kVWӺf |Dp߯{v=[GJܓPDbU @6]/%hKzY\T1"X !\M'9Oʃь`YCTb F S:)ǽiyPl<{裸YcY/d=0śDuA @](\˰2Nʃn9*=jd Rt@ 3THDe`SV7U}r܋=7"pfV6(D)8V +Lp/SLߢ[].>梅£i<򤓍3``KkHۋI3)T#R3ÓwM=m} ,ЕmёT"liC/'ڳk5&6NA]W_Mj|5դ_jo՚-k[-b.Z47[ubҷͶcKh}ٕfUtqqY, #4m4S{~vm2r6],V_|rWrWm-/GU^Cy#cP77u6+dzwfo_u?0>ޠB}rh!Bir6 Yהe$*X zb1 PkckrEA-n愌6Io>I*gdQ_O ޳4Nل'_pT %j([U߂Ƴ,+p1IeFNeY0;Qr6J dMA.$OɅ#Doj* dm Di $Î /e {\|[ddQ8YAXa(P8 xJc҈pXl6P>mea wԢ3$!|.80#8Gj}*Qpɑ-a׮d(X\$G ]"]V2(&)aPEOj/7쮎=dNJ#Qu\+~C6)gAObpƑ9I FWX GiIT*^@]<^;~^'Tm#y7K9R(‹l"{ň&g3,5ة]>*oxc#۪rS NBeQ0.H̚"8H&iFFf(}jLS4uLS43뚫Y:ِ;6䒈3(e2NS1vg ($P{,{˫U. 
5M?SdVϖo?)ږMRŴ\Ln9NkM?3` ^ {̡NNzy}oQ-E7xNV)6<8d; @J]YsG+lzP=3#%{Fud䀠,djTw)vX62+LP6ZxY5-gϢ0Ov6Z|]1!UZ)QMyA)`LJi'·cMIm.ԒIjEҀ.!xAL`NA-BL8}q.(cJd`4fK5JZVm^j`b{h!c `.Ń3q՗T *īۉU:z`yp5CNh_S 7Sؘk oocV烥=l |!-ijq_YHs[Cz%u{'}YTj5s$anՍwN[Ŏ1Vy㹐WzP5 {XS7C\E; -/3ΒBO_y=Z4 {{ n޷m}V1ܢ1 EKs^Tg{$$ypd7!IjZwYῶWkjCurȑ旭^cuS}k4Kl>6Dja:dc$wgYL׋ ky3}uykrX:~v構^{Y♸@3 =<v9pf|~4Y|cg5X#“yΞzQcdT1ޒQI?¨Ggܺɭ':G&qkCb Qq~sU-u5LZ4}~aGl MV1u߄S}msa;;TUk8^莣%stti;(ו8zNecb sHp:Ё;Εf!CYxؖM:_)n J=>(ÇGٵd rיlUvw2L:Lj>̓GطACN"e=GOIZ i^#K;,C ]k5a QZA#ULXt{*"5:oNܭ"R?\t*ZOfd G1ecKlG;0᰽c>WqLaEL`rݦQkNc5.7(-ϫ Nm# \ RT 6A FԇȭkhˎG[|I402LCЎo2̜fAFpX̙x"`#'F+A-˲جY.ݪDLDL6Iy*IˌaKiKVs\,t@a+(%) # 3,t)%3T2XY}w%cH'# qzs163?_/fSnU&x]'2|znn~ϥ^~5&J3#RqEf/_@?v_޽<f\Xonߐ!`$j rcs3iYd-@aJ74fw'} eSM:.)#T"vh:wn{iʩ9rU1@7dZcۗ>5WvLU_9Pyu;@iy񿻔0xYJkgdqptshrs(ԚÜ2AS9v7+EUX 즦S+n돯^$%_L#'+^X=Lan\w`L67#^)2<3͞Բ 9cܾeTàjnՏmY: hq֧s0 W'enxq_ƃŹ~÷sA^]ʘYSwjVr$m*7S AF؇S ڟZY(|s^'?X ZKZ8DSx/o+[]&bX)jgEsM`i״K&aoCS/k6 퍚EZ2!^qH|w;DaIo)d[Д>>]#S;7#!D=j+#loIƆRIwCp[+ꈮ@-{ʻQ{krwGte[螂M^@>=Bx[Ny+p//k0yâaTOzH.mMTO C|2iAVXAMe5mʾ-iZ۞"hK$:fL$U\Ҏ/!ȒHV+8?08{LGϢ$0z,clɧv>67욼J؜EtR!Ŋ 2A4Bn4ՖjRX6Pİ,'%цsWZ aC8I _:&"BV%Hi* !J̑#<>q)q4-hDdt0k%ֳ-VNk A |QqpEe.u'3\YXR*̾3֎AwہN'[0KĴ,}p~m<uݛ ĝuؑv{-DRYˁICK nný{Լytz!8-`)V gV!pLjNgUϐݷ^`e\.N|];ei]_FFLn1ƷD7QظFTd!1P8f"vi0iT2fP%Խ5I$RVq[T$bRL]9[8=P LWGiUmpHVUڎ?Fx1xqiEو־DpTAYRcaJa^c 1e[k5q^z&q0-V'0_)/♢w>iumUL_J7h(F!a-B/]Z$"GN$Ee?ЧozP,SUrELN 19ᢚPP%8=AOձ S̴̀w0ʂR( h (mr`f>wLX٤$Nzb{tK#;FCwnǽ`CӤmuVkhVJ[GkxJbtK[JPZJ &/Pɽvw^\\<6V;GrjNsa㛐BUew A~}ɂcA 7${:zPrUMjBӁQ0 t7NrЦG}=yfP=ѩʽf(Ø#CgsT huFKa9,2Qpm3gDKsR:a,ZCϊlRNVt `4˭ #lMl7+? ?-HK&R:oiwyAc;C oYdmy CaGPYꌁ+@' vI0FwЪ}8o g &m%lAwP:g ɛݜ( 7?W~ W= 2v-0v>\nHgqs~ro&9=n8=:d`0.Vˋl6\0e?~ ;Tx!u+/F*bX^إJMBV}}@_S7g?N#.=c[I+DmqInN+Ԃ"'rp-x$ArT5WE; akQJǞDta*r "C*J^z.9y@>BhfGTܚ R&LmJѴTx5:gYYקgCCnj49^Se6/bǴbP7Iΐd+}s͖qn9f@RtC0nM2[x옱ua\u|A(+1bI_E5!|7˷!, ,ݼÍUHxf'燯ݾ|r6|T1ų'/s~6Ww09{愰ӆW|t ,'1>.G C6>q4͟eYBc\^l6=+lz#2GdRvq,-Y3|K8k .|1l(и݃+ۍ@^޳fuM7ܜx͑-h>"ҒJY &%#ec^5<%re9c_vv;F ƅ D4*G%ֲD ,GkLM,1|6 *z>>4;>:O1Fro~Mf.y,RiG 4/X}svZ{cQ»,\8CSwZ{cQY!1*FX)V(Bisji.`T:p3YB<L?*T YhK۞9?,/QOFj9qi4! 
*&wdTyk$B.D:yvIhTj%Ϩ*`ĝR'>X1% NLLX`[6y,gt F.e.(yʚU*Rf(uv' b\Npr *jR,5'Td[Z tZWo29]2T(.st Z]}<65 kI`r]p:QAlG۷ov&Ҹۛ_\d=0í o'k?U.~b|svm3NC89گ5&t_E~?x V S)am(H,b^9% \Ʀ%tsK],җX}IRD fssR@0n!-(so[)yo(Tt+"[Qr%ō`$V8RBN &]~|s2w~f_w_Y\+x$cYlϓn5m ~#lu;gΉև,6P,Ӛs#v9.ξޞH籑(%s wFڑ.I2%^jT bD'upOM%O4V!!/\Ddnp8[ 470=B'ឱ'<7iΜ4+#(wa@5!Xxi)r\\Wo'L8 4e)cL|svf//ZxԺzGQ}xݸ"99ASS7e!ST8).w9 \y .$DŽb='JFqiQTS3jj?JU*eƠTNSQ3P7΢@;Т(^$%G|1KO`u0ң~j^, h3tl3 %Onjc<2{*Jev*n~|nI+ѕS,\ZqPca 1Xw٩.nӎ~)Ҁ*k> 1,\)L!ԔBU?ôf@g pKܯ %=Ývؼ"HźNMySq`O ˍ"J.b܌mjC9jP ¶9ƅL#<|AN4iWf))~VkN[5["Yx捎pDS,0ץmmt *W%1,7O:Y_t; 9CqnEJu7J zqJq(i;NxG,^37&{yϑ.wl^ o+bgd+ q:|?v|QϾlϙ\hQ({0*;xQ*9R[2ZWJaY|Ss Ms*edhYBcRA]T9*j& eTHH~\ YCLCu7b>׽{s>, 10ft:}Cy]"LeIw0{j V;1xmօVU{L[[H9>Cb0)@:L)j]ܾ>B6"nq6BuL|irvsztXMǏ/aF[~ qHJ;r4pkLb`LW# (0jT0Ŷ'Țft,'Y^3 >L)YaQB #\y&P,5 rJ#:M8z&W9=$ 1GA&m,7?mNj&<7``ޜ]|1ˋYv}EsS&/-O>cyB+Zؼ}jNy[I#GL'wb*nq9ъ&[S;Ks@4cUXtǨN]\>%YQyH'ʢ㩨BfU-p3+bq' ݲl1?1"wٞG!BJCԡteLkЦ&үt 1FOH?Ō)fOx ˩R"JdquH"^ŹE OgKoj%=XJ1YrqFrN_bZ)5ugE3kO-8Ţ-Ck@ZH29jq7ڜQ5A\{+s|t3͎҅,wΉYG0"d=@%7[Q))e²\k8.|YyID_2 t *c"6}b<_ޟMlXȏOg-Ϫ c犯=|v:JnOt}pl>XP4ީ'7pxD ɍ>⸭Qr\0b4j1ĐBpe5FئX Ӧ<⸭QHa-:Z40:>^ivin$3 G;>T(%\eV!}wnslV!0wx!P/q.hqp҂l8unRdhqt ǭZw:h9(tvUGC^6QR{4ObP8ujGYMTx%YIgm!3^em0cϕ(kq-4hDDŽ𜰋DzR KnM0 d4/|Η Vv:nlYIh ֠y0T3XC2?S!񌇵6jioP5WGS vƈ4{-od7]1,xudfb٧O$A\kwJ#*ѓ%Տ[li9u1b0Z#VOhUKR+viЊ/ꗬUFڲhFo\[O>c c<|b̨zp-n+F&e@4M \cNe)D 4d8qih}ε[WW ֟hFs6uRi8 U A.j7msA:(}E-%vk_>Ӿ }= D{u٨רW4pPY!k˂ P f6 , `sGIl"$!j vDH͢5H"S/#2/Ia]RS?"3n F07: f*]a'}c*F-ʖjPMUTQwq,H^dkc˦dm1fJ[L˩4j<{TTVZ[)pq t zLp= :foEϣq: VPNH5*c}sq'ZZAʽUM6򂴃ɻA̖M/SvW-rzd.9  px[+bde Y=jKb~kYs)OJJӧ!]a֛&*iKЎ5r84ۏIFk r.GfIjף4dt !ƃ2dZI@ڒxeuIrpBpzjCj|GJzƵt^-oXZJ`wQd\8z& F#6X a0P?zKDYf4xg9/x_h#5K׶. cNPٕXz-&:cjX`&qcN(Jr 2P2ͩˋ asD_8m-t h~J96TT[g_)*16SJRFsnս@Q86BʋB/[Puf,M|yGI}]M ~:Z͹ذ4#]{Y.vHR-~Lc9Oz_[3K^Z #>E8^fF P<Wd;[s~)Mou~݌0ZIEnJ h6`%joxj̉Iho]-ZH J~"h70`ϡgӼnqo4fl;6MU^7~rⷙP ujVӞ=q](e_ӄ&q4mb5w_'+΋yXT6=:0ɿ-Al>;Ǝkb03Oަhh+fA=az 1%("g=WK8mn!jh93/5Z,\#\ٿOi\9,R?"ؤyFaS͊0_,b~^FyeZ.d!(odBhtȥYyTJ\io}~~|70U3zI\1/N zg7ikQ=s kL}뇯هH4[Afkcș` %h E. y.ti XɄވ[WHAntFhfO -!4”:/Sr r">eYjB@i|.C !/ chղ9Np-r&] ~b3g$4(0Q fwVg^A>JAYE!͙sHH!p) :[;^Hyb5  SZ14ίV>-a_w>\@-[/Xس9 }F\ 1q 9ԸdJM7u=ogqRl# 6jioT!nFYy1 |Lޓ@ݺ;6Jlѝ~ɨ߄Ug[?fQ6;fHח{/.g_X8-\UXK󮪉7QeB.ʪ56+uR c%R_^=nBM4 |Ž+C#P/dQHlv9ЋYcH, ƙ lYdguVBR^Hgg'l=և[%?4ݼ]tٷow\ڿDlْ>A]!{I=$=<Ϸ8EX}'Az0hx.] NE,vo}P\^T\Z<xiJ3p$HmԊaeiUЅ(# XI#$<y EJ 4JsWn5BMH_ELsjKATQA7GL Tܷгf#NSkN@eeu\.fjb1)c1$WWO2bLJ;.9(MwߟJǘWmezCU~SYF&y|Ba=А& wOn.Bs,4"|`w{ Z0 v.#LS[-_l[{8F{\p>[}w[";D0TnRN,MKy4 ^^s4+V26W8,'_sK!Q,\cN #K{R>zdig-'mw%4XKdJ&,jO+!->ˠJGhⳐ []4ZqehT C^U6V^ ڰ8Q$v]npd Y3a]SgQ.YELɢUP! 2mX$B."`l֐5kA XC-Mvl.JCB=,LBmA8y0vF-oW B?'eZ.rqӒѺf%OaN82a1,^:YA9J%($NJA񩶐ƽ˫t\ :9=VQNC՘42 B< 󽬥@g㣃J3Rm(]Iiv UKw:1菖}q=uz0Ͷnj2PX2=, ZcV317]gs$|Q~9F@MSc-)`v_z lA>X˛VxVqR?%iWkEŕt.*GwEH[jrI ECV !A>sy়]9iT+92'L<:%n F dS=wDC? svcElf/ة~ezP'i/*̌KIkhKY.F̝*۸ppmH0 SQ/闓] P[&'68[l-6#L KCo2:ytWpu[|/wbׇ4;_~mlAi:O|z-^(Ձ :ӯF֒XlPj3N/_5.LbKC0&yX`pN:)gy;aE=ZԚ)gi-u-Sڂ2|a]jdUCK<^?*ɄI%GBh0hS4=+<صC҄yꃶXJ_]l߅y rx<t\d>ePXeMG]]<?\ͷݖS>| * YWnml1z8i&l8QwBr=ئSG;@NefMwmuHz LX"q^v ٝ ^m-l+{KR\$򰯖ǎ>_*>=q@&-!FqMh$SwZN\פ A&TEamޚEPEZ'"=5Q YOASql~;4vũ+ݰ"s:XXFFQoGq0{\J9ʍP* DzE ҼKTj,Z9#'hi5}zT#Qa p}Xk䪾_O?> j0`uWC7ZS)xt[&pjV"*D˕тX1쯻~NZcvHQ d$`J#3PV4=v/Y%Z"mluX.6#lzwYX=XbEJzEЂ5klZX"9 ȎV҉(ԉ8!UtrVUC SR͆IZ!IRH*`8'Rx©_ğG[?]*eѠ| X *NJWYDg_|8zq MBɤO*ӔAh?*wrݠ:5PyjxHRhyqp}V$$ $V>Z$\.xAcЈxT>WS̡bydT cT; \;ƒ!+uZD|}0/[*B"@a7d8' |sh.DdS`|xt~sQfq4'_|6/ޟ7~LQ7>>Hk:e\,G] `^#CBZ 2(hzL w(-E=7O?FKNP5O6FUTl I@k.%;;:>6042$8F%4Em]U:TN03bՍ7Fy+f|Iuo%7]MY4P5_ S}evݑ-5GhCs("! 
S2HA\ Z?}8{uU[zJՙr$ۚH}DTgj6y˺pukŠd>ƺMB˺i(n}'>D7Ɣ.Sr*LE[!'r9iW__NN&zOф|!$ӇV>ZYǓ~[H\.JH優h_һ$׮>}ܩc<^Y'W"> 臼cu)o_]3=m1w혟ۜݱ>Yq bK}87j엟]J!%;`(Qѹ ŨWg 3+CwslA1 ZbD:{7˂Ekx^j%;s1 萃AQ;רG #t܀ bm(􃁠@v2yح";k"yP'q)\ctN $rpح8Mw[NV (a&ohe+hE ʢ9_YD*mPu:u@EVH# n / 4*4PI[ ͠zL)톭R3vӹFϕZy84:o4hͭzqqSJ#Q[ 5iDUYsAjMY)k0ͻOi>p}Z?)R4{hj%jhI.%5i]B1I%SQXPTH10n4`|01FIHI C0*V f'$54WݭXQL$ et\?XhUs7;2U2"Tk #xs&F94)拾CJQm3[Qt%*K+Bp=gfvG|bOe},jHk ȑ{ioIo%oi_:{w_D]%9}p9go"7E1si h,=و+=niTףi]5sxj>O>8J:s$xz F+wg tF3\Fm]sq.Ʀ-JR>ym0*c檗GNʇS7'?Zփ闛,fJ8\CM8nu$q%2KA74 1W >UUbZ)˦\^mSiH&pPj8@H-b!e-MRjz_rM^^I:)K0.hjkYLQhc=( .ڴDB,Z#25h!jYh1 LZb̮b*ӰvN . 9| -^?2(4ftn44 -g\ $AnFHt2BK%NQ'%CU Y(#&K\ӜѼ(BӒgrE[dZbT?c\`ѠFfHV+ː޼Ui>R!M^#8]wHjyz.Eo/ G1[dWOw/^4/^iYX ݲ6>݈췚ݨ6;c}*Ž*=5 kdӧf/[Hfmw\)d˺./۟Y8BF)1=`'2a$eI$qLN@u?qYZNQJUFYRZe 'i5R`+`&i%p;]S ˷0QJt9cE yƌ!ׄQPj,gjBw[P-Fbp1ee sKţhHhF#.;X]_=wqYFAt{T`Ț4zgMіtfꁦlL CSVT*ٿEvE2eGӅ#ip+9c0\M7u:( F&^}{!{g5*G0.QrMB`AAGhP"(k3$@G.(gPNQO,f&=l;O @N^)*kpr㠧Z^N7ʵM.ўL7_alI'ޕ57r#ήHP zn{e 4"cERRQEvK,Y_&DT߉MP5,Z r/*EkԠZq; ou>]tI! KN*G혝ThڽNUJ,TPZ*xOط+GL]Cn_)rJag#;鼶P!/Nv ^"ݢpwXЖ=*-]nk|ybvz&=e.;x}g%nrum:1#Tq=絾Vd_\d_ޅJQ:eQ@C!J1šmo VC+h( G9FӗZ*ICW^,v{EtD#j u[?wG2+D+V[ |P,d,H\]'wYflS}Ŭ鋽v^QҚ`VNVƦ%ȎU/dJ/wٻ][C!sMʚ{^uCJ0wñrjW;qj-d^<5p}6KOvY"k?*N9C> \^ln'I_<~nrˢfVfiXs]$(`ڐ5+. K$p9'3ܨhJbf R&Y$i)7Ka$~ ;'.pFI\?o' H9cUcZ,^ֿ>UDh!ØVѲ68ļ$b Sb]M` I\Nc&aS|VzPEnE,=< wERJ_O|s~xhWۗԧ_:|nۅ`s$ҝ;< ]ƻ$RVr6MdL=/k<>&?Rm&f/:(x|a4+! 9p%SR>UeI1Y7ޡvAԝv]}Hgڭ~mDք֒mmC)ݡvAԝv]pNJn˙'j&$ELQvC'Ӻ$N9*iHx|ڭvTրn/Sҵh)]hCK>+P bVudZu+pPtC z%[ˋ>U Q8X{KrG+x))-L]U(Ra5VdšN>-~"qoԌhڴ `(]&<9LÄ_nԜ`̓5^?; 㬡hҠJeVIx,m%R9Jvq% SԠR)moφ ZNn]"V463#*.2'TYUXm% rjF ߢT+*p|$W-Zݥ84Mga;FTH}2qrRwO urfS+4)l/e%^[Izq+P.aj)7{?T9KւPu !1wTb{T1WXC nTViOUU圵6۠)eD[zrmȃy<"(H;#GYf2eѵx@BK!'RK$"(*D".js/#54|K(u= (sT Qg8m.b78z2T[BZ4-zB 2_2cb>_a[T(*tE?{zGd MǧM&x|Xz`4=pۛ{|Pׯ7Azg,?lg@O;v4jtJ?hS>|+`/4Fў1j p\9TVzQDa'( l7N!rRQ\ HȷHQJ E&'\|B͝(\\ -ZP1sMb)A-*/jؔy^F r_2zQ7.nNIjf}4J2s)ә̱kBin`M6rA7YLiJz&!bĚI;9g*ϕ+9`ԸZPvΫoOzQd6N}ψR ]kMaԍYAd@Hf4޻<{I7ڲ&aڢ/XXoW@Z۸4:t2Ӓ5=w-bYSP?%o'ϣ?3Zkwm62PՋ]yi d(蒝!.MGKkѷ]M;[rJDVWqpr0iB\aEگGs[!B&MkF m?*4ߛmSEa;Em\ǗElh_~bYoBHsHߨJ վՃ [>z&H x *Rm/ +ix;yD聫̑]V+uS}k`hոtq؊4~H?H#^Nj qתEy?8o`m܎"7F=(4P RHե}3tt"rE̻lcd_hբT YۿB6Y:}q*5mO p8\摈`E*cЌG&2k4\$[3NJGk,m)n** SevZ Oj/w sZo$άM\af $.j+Kj7s@ZoM-X|t~ 'psug 0]HEx4HCaʜ5ӠpE$6ծk#xt1XBj`v&גk'4n^:QjFqEx!RΘ+ps!.*)/*kb2\哽qDo_<"(W0EgIqF×D%Պi"WscR;:y i;+T_\[p ]pFj@HЫZ{mٲ}*KOmګek')kCu-x:KiWèvHn@E=!G.S3 *Z|Q낖$x.dt-TU3yRjSLka`o~19<_[af1b8 ëO~|lJPa?>>|> !L>-m{s6ʌ]+ ~xƇ!~ πɟwh H ooU"7ZCۃp@h1ugW)R/Jy `("W\"僕AXZ y6X8ȃPZK\9o58̽%NT`Uwtf^>(H+ռzN%Xj"1ĭXKÉߞSub,J&R(OhcJ[@3 ,s|#<r AƤA+jC:BW8 X\ha"9̈́AZiSHO +4`g] Ib߫93^a\UH|$/ٚ/\pjNݝ{W[^U!JٷTO@DN<'A]msF+,}U(K!޺u\\ҭDjIIj 8@0V*eK4<虞~mg#e3l.8 "`,tsK #*װ͵LB` Th4u _^TP;| 'h9m g*^VEa55*XyC2) Y,ɕTe%B8מ2Iac-HN`FpX` Q4-4LTP1kLS` N..!_d 1.++pC f+oxLj03G 8Z `V `ȸ6(#&9ђZ,H% 8I(DF,͆0S6AjRjLYB ,7Y\%G,Tϭ׆`g X#͛w]y)1blۮ)N1LLl]k ja˖5a8 ef_[ZooEkP%Uq۵b oM9( &=OfњM b>,`Zk0 :Xna\`mІqd 6qBHcaI%m6ɽFZ37cƐV}YޚD6;I {72V(n8J)A)Q4@td˵12q &{P|0fXnu J>|hm ,d1P?Ci&2| D]SGǰ.3Lɔ¼bsm·wCJ%ˆ>@}|\+%goR~p?V~-';5$퓂6#Gp{Nu+O0ݗك8wZ6e5IQM7ȭۆ1uD9kVE7( 4]¥1;;eb5ݕn)"Zm}Yp͇cHL!=fGZrǩfB<GNOh+tx-xm萻v"*NqAlܷ-v_! _N`٥*Zq b֩ xaMzޏOnwIu=D$=aM)=Bs)ӼV0߸jXNգszOJe%RT;sv0WD"{הIp1*0 -9Ms2 fj˨a@i+8˰,r~),#N{=3vPcG~ey*~woewy|g[)Wa^c$YGt{E-:α0 x{k:}\ liezOA盫y0/E-S|72yR~ʋO'n]`y?(0ݛ_t ,m4;7[]F~QO̯n>nn. ZroXKq[`Pf곐oֶk.Oٮl k8q{fs]=1HvhYGA *{~p H":* ?fBqiCꮗ1*Վ6J<8_)aW(3dHz(#qL e"GOGOf 80jC9+ò 됁/BF b EV!q.muG0#PШ?,i[vld`2$ǏDGpzLaFC Y1Θq+&u7p8&Nd# s3 ,f4 3֧;iBA%|O_ @1Dt'yV H=j%M !%FޘeMBmlˆ]ZC _muP+Z=q?Wk"uX"Ί9}uR)LYAh! |. 
tŸ:He)P)q+У"/$/ Rp :,}ZCu Ov5_LlLoAܱ5ʔU˽^)$d6YGv""i=#T A4F"YGm~j;@ljrᲂk >Ȋ3VHQ )OsɳRKB$.%1B21EݥFw)LUhS}%ϿS W|,&b0Ỹm7WG/ɛWQfniw;ϫz;tR~I򶚗*?H}ƶ|G0-WYրNk9U4\5LXi1VYʪkrHSYOeJ~1ŭjr;oDSl x:z7[*!vƵ [DwBqM)/nB)TK:Ͽ `W^97A`YsY#;; ?}6ût->>}~03ڱcm^SX?VئnvDxw=lS4Gx7ryefa~w(wm\{'S2NALu R;?M!A-F88i 1\oPOJ=lI'#g:5hr#T@:A-GsNǬ:YƆ Hz@q: B'滄?>ݛ?<ӭW7W>/nr/[%}+m*AXh[ڧoZb.H&7V w~b1-#]fj4djd+\`Zp=ˊ8]S69˿D)H QJ-^-XgFB\5]c[aٱDk,ϫǞ!8ZcoߺGkT>ڷDi5P!_}zl&͖0IHLjN|@8J$^ 8T4ԯ 5bysMwjr ^>y@mR<>{g|dx-/xcG}su,nﲛozPw?l8rbÝ|K) U=1q?Va.9Jps<s)|K 23pTX"XdHpSa; CC\#T# )n^|6k]~ȓ-r$p==FLc4xaHsco1+ [1^apͧ\]f /Y]\ssFbz "ڣ{8uڣ;BCR.nbTa:RwpkLfVjlq(9k^9z"C)h(04v7wgZKyA|7&!?3?3lk,\df6u-Olo+e˦Oiä 4zg07uu\܀ۺypW_M~ǿ_|l;sUbWUCƧZ6\TnFod7Wm? yș$/}bBQM g1!\aJ\UE]+h^h~p|@9&ʑ,ԓ2yVBKr)YB@z4Io?d.nk}U.I՟!?߹4w%sz ΏWWD>< _)t ǓůfiϿt1SՇ"A1TI,RWo;@1kNjv+0h=`aYMMKM]=݉fT -)ЬЁczi*J^@gUÓOW㪰тѼ>yUz#FtEaZ=dFSKBx{ߓd/9@q:įHGEhIRp7sGOU^l"Z 8coZ42Y 9D) ?1qwrb[&yBs1YEC,1k՜Ҩ2z2GlFծEە.fd5FNWcsF{z)Vz)fH' +,ӓ3p?Ce('DY m5X7xI n Ik,5@ɔ'Ǥ QP(L1`z8=-c`ȳ'Lꞌ %I0]1Qٌh7u$JO#'q832u^hE A*8Cf^)Ni U{NX)m:֯.E[QL H@#VK++0F$>T,[]TDzOzAPk4DKmsս9KemO9Pޕ\P1"8M efGƀsX|8ڛ=P+Gs&j%RKVvJLyo˹$`\#\~Lg~AV :< R#4 hl4bg [r ⓔZm(_DP2An[Sa"eaK#]cq}Uo)e0fi~׎ʌKRb{4w*GJ%\)=z:Aide`$d]SzSMɁipBcyשv_/RV/*"tc7;^,JBʫ~wipw.(wٸK1V{&8Tbb`ۋ:_IJZ5Vޭo EL]h^9hNwnGGyLhvBBr.S"5RE:-y i(@#A2J z/QC{:{zM ѢM1k&(m3Nos8UZs<Ҕo6/$rHDGc ),? {Be\T x1"x}؇6" edBGgL ?hͯ:x iL$zv_'R7CyܞدBxJs׾wo.3?zzȕ̹Nz87rE!(1r7[:!51Qc[Mt䈙Sؙ";B^_#ђ" S?'=UΎ+sɀg.g \/>ۛ22 x QN|fZOYϮ+W6tDqoe} z5i-5Lʑe 3QקT &[nϏ+~U/2{|sMY\ޜW_'.R󃖅}n/<[,Yv3yFw[|q͡rM] Z 'rI<t֖,2Q VT܀'.Qu vXUH ESQrԲe,Ra@$+r6dB/8r&Dv{\mmYP=J;C A>Muhv'htQkcν\ikdԊLji=9ՓU_ns\1jyԭiO5¼-eTpn􊌁[pRz}fcc#QnU j(^_$_>ЦkDa-32NPj/c[,{C w˻uWQR wM;V,ce.7,EYhtZyRHͅK! U|2NU*A[/-4/<(()CJל0:hbo_d8AD-˥T$ߒ{)LD*%%QYEYb( @{'D<)BF5XZoPvSA'8 K5Sf: a5GvO?/{Ye'٫u??e,* ڸ~qzwyaEm%⦸l?l&1m&ɯG`^1)\22iZKMJG+lSϩ@%X5j{G~Wѷ-~Zy?iBl[c_R|z2Ԃ <<54r1 Nҝ|ԐHb䔖2J-19ñ o+6紃@nf z.ڷ[0zS2p<%<nffhA2K1ppCprFhbH"'Vƽ ޠu_xV@Mҥ*IZHҕFnXj" )QU!xO *,CM+y=稰V7B =e$^KUH$rHplI2jPSAY\˔6OI\4lxi͸]~St"Jx 4CLv9Po(J!¦n m"e- hfC:_DJ=ܦJ.>+4P[{8HpKߖ d\c~sֵb:y XS-IZdm20䦊n !w6"!u7itw%6ƞGIؔ_%mqS(w7j2aY|>9^|Iˡtgšw3vk;$L?- KNz3z3#?& ӻc"‡@dTxsPs~U: +}ZJ6/>>#0m򩹥Qlc<%)FC5Zé6S@N@U4} xAP(12e {Wr(-KQ@d8t0[P&zax憷;թ3ՑzN+{guwi[!K|kW+ZN 3JySM a&8d^I`*:LDhytb҇1[d: 5]vsfr8Ŏͩs$q5UVӋ/D ؅e^Myn읂RuY %k|dm0͌BƽV93ZUBUwO+z|z7bP;E)҇ >T*U[DJ6ؐݶxyBEM\yK rF[}tՐxa:tbx٢ 7~ńw̟ȧ:*O /ZmFEF8|L;Kf;ړ-ǒg V+dl,>U,r[? 4*-9N˖d킎"HvApoI) KBh4rUR''3a}"BoJ h7xl|UaY|ss=d}PiοnAignrp7]~-):@߾->Oy9Glޛ S@ފQWPUZw +^~9?bYE⯾`S y N$!g.Q2%en6 b11hERy6fn[ 9s)ܦvAb":cn mZ*i.XU" fxb-qH 3 ;jb(F9v` ?g@Wx"*V_)"Y 4?ۋ\,:j{FT/B81xМ4|i>3\^/!uH5\ 5%49goƱL hm(oֺO ' 6+E6r; ;}/'/b8nyIU첀?-,}1Ol26v배2t?)78d:4bZL[ĆUR+`4b-[# 4j vϧE[ZZ1`ys[W/vVh\F~pt{Nq7BiHXnkc&M@Z5 ր-cJ;ָݏwAc֣H3Prp#j0JTb((*["`ۍΐo6̣7–72rwe- wgk:(ջCZdw|>z!=&#UuTjoSj;%/5߂[cV].OopqS> 񨅴j!hE9'TqkRĽR0$RQXHVJc5mԄTFhyRƳ]%<"q4ׯ@8DIɬ Q!Uc+ˊ Q60‚@}.L G3;jBt d)Jm8W%͈ӪW#݌l*FtT0P V51F)U`&‘h%4TZW8uĠť|2(Z6id ة3!+8؏)؟ƽH>S1W}K $'v6yiG4$& sx70/-,S s*e??N!Ltmt2P[qDua< i= oW1ӻ9)$;:lm6(I,&c@+™ 9ߎV~y7o~Q0'tCkwA W]~VZ:+dCah5dqJr(K !`:xQFjV!@ 커?5HWQ l{rnFDyDD"CwJ|"E]re_tA D("+kX Y,h_R za\L@9H^boq'bFS8ՌP[}9!oؕs>y餃^ݩA!k*z6mۨ^T,V&+a. 
[XʍEUPAU^PRU` 1ȮCnP&~$lQ-㒲Ӫ[@C: DºԊ`\Ҳs#JbIZJ h)V[zK b3S+L,Cxǎ+ ,_b+H&\5'" X@rf/@jM~̖L+)iC[㜨 8@҂BT@PH=˅:Cw ِXh|zP>'pt-D||; HRx_5-^=!"f_}̠|)=N& {w]D,eC3]ǻ;'pmna )k*8 /}qK%VCɺܨ@Q0b{˻٭:t:%dpNn~g64:L!v/;֢ㅢow*UB$ zIwY%QṪ;xr/ j$ N?Z< 8T;ho}_~,yzɳBruU2˹JfUuOj>WNI`3#Y:{& g upI$I/[>l䈔99rٲv"/餋pv>S]~Y/6V(_l֜,uko?ȈTJ́ 4 L0_J]^ƿ)L-`1rp2( `1Pr}XZhX2^%Z/{Z+qx(/y=)* Jǟ­'kWވ׾.UL)SXrjR\Rʰ/$Hm)far ѝRa#iǶOig NKPR4vr˅ȊT!+Hh+TRV/uFVP5kMZC%@BŮurLľYI_I\m6}Kd{Րo6 [$#sxw1>" Jzg&n/<2cr@{2yb>,aׯ1Wc~[~9>>ӽ55;?_a}X!(㨑0Z\ON?O^ >^nH'vtko ]/=n=[ͳvp 7,v{tf(f4JXӇߛm3;3T#O%rc!&Z[9f]MvP쏯i=sC?ZTg3mWo|?c1_(N_s|7w=3:RupӰX˃).taCwTCVUvRr {:b.j\nw߿a{k 3mo7v};X/{ "x3 |}f4ۉvt*W:{ {;l}4hwwJ$SE O c'Y ۟]U*#МL@uTbՈKi- tP'z*|8^l%F,tʂgk1NE>,LX/Fn`΀̵/ƘCq dJgT6[h ,j_~0}eOhyz翼zY?7ˡ[7˱#&/8oo\쒝{N00*_$L*#7vCd;xZ!h,7sZAL*A07kh%͗AoV8}ZW5\(U=Lpz-C4]VZ1]NJ{sV>(ʻDZǻ,D}5gToδ'Ia5IAJ|:Uz+nq7n,}ŭ硏-'8FHUfG9uwβ䥷^qhݤͫ' Ϸ'mV.ι@(@DՖmQհ}YA`%D̊Ѹ+f=TYQH'^| pʹ]W6A_-X(%BgٚJ_ruPձ.%)q1q 0m:Sʌ|ʌ2ͮ@a et(d.# F,4Exl)*ETp؁&"p5k("XM;1l8Ѡ} cv:P\1J-U* g#n2hXj@F*n)`#$n CuαZ'jx:1"ɝ4#T<$T幨 nYi2" #&BUtcbQd ,[]/YK\$w(Mi&w E7WFjuH&+GgS'643:dC{7Yłm3#1jB[Oz8S؜J7g+/y3àtiuT"љVq+ʓcHM ]A%G47{ʅTqe DGL(jts$$[5jڶ#>j{u9jLoRA|vÕ诗Rx7 ]@jqw寛]=ܛ_הvn>Nʁ$;K1ַkcԔ //noټ7t Se+z/ԵE9evpzU%]KOq:癕ZHșhL;j7^GnNm 'An=P3bdM6pQb":cn;9i艆jr"%ST<$SnNm1}$7|$[|bvK!!g.H_vvuÄ43X{G1xQWA!YY hh})RRdR^KԽ#[3X W6s\)/Yk]J5Y9#nC%'P爒Uǜ/XE$+>#x% MQ"Mzu@*`2b"{qz\v4-Sp D7gBH'+Њ| ҵ9K0Wս 2߫82r/ۻΪyxWJDWUoAwX' q~5;8&W`)k}u A(n ֨zi[J_k;RWZ;BS%u/hgW|Gf3 B\Oe"YAZw9QN !9ErR 'd[& (h _otV(kU8AaesoVrcsČ<M Uz[GhK~.sFt #h#*Lӣ?NVU_z㾗j 8;ZTHWٚX`٨zyurzdqe0 Ut<ՠ pET}~8R\:-SSPUZ%^#ƤJ&np)aX;aD :pCJR#]-|Vh=%`rf,\aft!rQŜ4AKPV bW|QE"cA:њ)/!8IOt3$$<:&zƼ3tdIs %& y} PX|:~޿uAD˽E!Z|00Đ'bziR}XMXv=j%ZF4l߶iךr"ۼgUdUdUdUzy1)OakcǶ@4w -`ͅ(,U6vԨfΏm F>r D:9HQ7t~LG)mM谭|iqDLʜ 99L c;$ dJIrJ89AN Q5@a9hO7f f2J1-?JN8gX!^-*"W^g 6*M&۪hw@3.~v}Fx>IC X9 eZ"J \w㾗LO'U]dBb:ޤ`ZHQK])<iͿ5l jqD z?Y~O7mrm?fs"ǭW\C`JA%A29rb} 8患z xラ2¨n"cjYX8%4g201WNk-% fQf.&2Vg RN;YL%eULJAwS i=mV⩴Y)oH%m8+n%2) n{ 5xYn[ cz.U0*d#B9=iw1 R҂5Ce(g|:fҵ?hc?rXU_`#p\-?l@cYx9OXE, [ =`<_LЪv2Irx%L5{d>qQlM3%2ړۏ_ \"T*tsGIǁ\OZkl.ĔC.i,tou'?<%D]wxƥ{F`7` I lbm˰暴rB nD!ccX/%jTKBWFՃgHسBK;I`X8V ``iiM 厇](C:4CKS94sJj)Ƃy<`/ՠL!f6)t S)}^4L'b+r2o߽Ex8ww˫35WF0"Ĵ𛭻Ygk6M5\݄Mw~Po!7z$\ :xm<8-;/"̦fRuoBJJԖ,'"U76TڮyB\]S}(Ꙭ\|Ū\c)yf˳ٝ⁂./r0w]N*~9v{c bt3~^Qwh&yߛr =/3<-UpLLXW&=!L,V*={9gZ9Zg%JQ'~a媜NKELq4|c&K [S RDiu "OK']kɝ[Exb Y^KgOLjYqHT_޿P*`L<k 1! }lZǢ!&gCc;YUű3"^^5Nn'ȷUc7%2`O|D3 m ʻ/%֝@4Fa7RMhnT|dn=fq9SƲ!8SyTnɿc` %!m@`uQH7ч6[S5*_saCnp<3˳Ofv1ۋuq+bwrO;O;}}j)oh>xNfV{g_KRutEUGnW3=)׭aojғ~ݤ`y@b|;BY,TEqϯn>&^:UyrIZĥ[wfᮯxd_jyd\w rkU0[0KB[θpQz/+?QA:2J-gat\-g? yXܽD#ĒkL1ȏdpwEDlLJo@br5;3{\6mxWû%,>fgdOm.Y3 R1ED ~„s9}t-bb95MxAJ!'V_OK$Zqt8!-/?9(/1~㨄/ d)SIN &o%r  .]XF?E}:_ I-|2̂mr ~Wq_Y~gUyf q.'BjcX.Xi-h+ 9D{nF /]Jn'{/R-E)eo/@Rbf8*rdn4@(E |>SG|?zMHQW}&M6^gkJ*BUҶwWBp,VRΥ;xw#G&p44ϴJ*B\PxF{ui1JjrqHD˩Y94N3d<2o26C\I2]Y> P,Y51˪!c}Ԑjhw*D3W" Ѐ -f5AĶ<6Y~:އP2fU0MVt^j|*޺u\=-kQ%9^SuۤN[kN[nxlP:GVӴw7 kP۲ImF,$!p p {g}3OPJ\ufaR "SpV`W*sX.Qı3LH>Inr! Lku{'em8h< Ja F̞`8<%KĝXXXj!؅ c+)L!do-srY ID3J ƩsT*sH*Et.Cs+}.BʻnI=yz@׷ "|ȸ/?}Un5>f7x3B՟wfn_ j>Gk:P 0X֤ H`:aSX"p)g=KuUW爔GJR%zT|;RCر!AX<5Gk:%s D`ͮ5UQP-BR rؗѤlSPngXhCBEH( ޲8nU[.J11h Z znnmHȿ/SqLZ SlPh6+4EZuj8Nx/,g'5̨3k3fj 0~VM8U\Sx0S)\j;/楈ӝGVJՌNmaft|Hr#JUlZ/NO֧SΦʚ.v9\X|z26-+UjF~ ˕Eگr L˜C< \Qsuwȍ'Og3`JJݭe#r 3Z=xn ݄gnƩJ@XJe `ikN٨)9jI" Seg4U#B=C7}=t]KK윽VAp'a57˴D *{`IRp,z3h'W?SIP Jy"zB0N0l'cxC1ka}=х3s7^1Uj"Lŏ4Zx@&# V!A*N0zybxBT']hc+ߐš<^Cfl۠ՈHy*Z/og^>bl(>{ ĵJiS{j9]N}/!! 
ؠ`TY:DuTS qgz*9;w/= !B{Lͧ-62[z'ޓ|^6?$EGxތ 3: ~awEa5_S6WbmvhӨcR~F 8 - Epc(m e0B>Bvqf~v||||Uer5`W0ǁ)|,Z9(q>H1kIqsvK* -<_:~ܓHI/z O@HVSKOrtޥ8 0c!EdC0& Ϝ!JY#QHQiR%wO<4&tUI~gwꭤ?dXSKLe`Sh~6<<<>#dl0Q+8N݊@6^<?w2Cs#Z(N{ly#l4CiJM 1J2e*rcN`MpJh*e7A L0dABD'z9g|3J{ѽGI%ލaLcI4+yzTR]4 S(w2e^2mep+G5^Xz+JeHj>8짭aoigCl FRƶC(ϧZH&O$\L>q2>ȱzn+x Ŝ/W7*9:/fzo1uܔY8?hUwhYps awuU/\yk mvZa[vJNMRVg7VonqGџ<8A݁84߭$e!z5J1+d˖ّ1 $[]W6!,xo4h#I32]ZH {o. a'Kv0xNiϘ`_z؏ޮclͼZ|'?:2W;ǖ%lY uQ4}Q"X }#q~!t0nY<{G- r^O it;O8#rɔ-Im'@";P}r&rI] tBIU3S"TfdjAhxF!g[JA@/[A]$5J\V"  r5U|CbC'&A(ߍ"I3)U'A&oBxw}fٺ! 9BӴ܂aաiA7.(BhM%߇m`5a=Fv'E{H d2BvQ?U0CG⺭~xC[օر4u9f?CƮˮ.T!NA=He@=I7 ޾^ڔ<<',R8FH ?JK?\i<&ӃOS[|.>IK8|A?,:Lpfn cJhNf.fYok~Csw]cX<6.UcspTVQu5'5TpbU?͍E$ѢeR58P,aFa%4L4'YHDp {f4Je`Hp˜: RJR1CQ1W~7=ns}_C+,$ h&u-#VY2mIwLߪy`Q4/=R G~W3d ѻyq wgxy9}$vw b:B&خ@|:SN*/,ը@m; [*V~@ʯ.CMB\4tiZ@X UFgbEȇ #iJ1Za4#mIJ9[Eor&Zܦ( T]:7FL^AY'Wy+V]WإB:%ˬ'/"࢒sQa^ùD[UkqA"mmܒ dVS5 T>LxSL{EWj_pzu k4lRf3 k0s/ϟ "q PS. @!笆V6N^|`D6фJ8O$"O3| aM:3!7?~dKUrX SO>5DSzBPk%K%uzBE%̷Poxr n7Tc{SS;*V-oNOݫUxf"ܥҡiwD@*½GU. ^"ENL5ʼnN+Nrʼnrpm C!VCLLd,#(L)"XB1a@ @ 9#PhFbbH (Ι1€A֚X2LEd(2t!֔)`VY : L tIiX -m- Q6sZd4^+0]gV=dѼƫɣZˮܫ$y拥s:tz'^+;:˷W$yfF˷j& :pc 8@Wj28̹JծR.W/nvJ.R?WWvM5a& rK $x:eֵ<+~!%d*+3[>sJ/4N| >5ǽG^&ĚI=nTTN$P]r4:o 'aزJ&̍sH@`#O>ezz 5A{y>I4xhOGO;IL(cA4xFB3OK'ꎴuSCne, 0tI|28M+TMM r1Fxc! u!$9A)b׽AECZ$ j1&HF"&Q8ѓsYǘ0D?gXr8lz)a" `Vd Os Ky<̭[ { bhTd#!%8*"p,%aĘ\Q9̖ ) $%>bž7ؽZtyoS}.]? #&q RYQ f5:,rD2+Ǿ`+PP z nY < Pl_L> UBn\!Pc7 Pp,u7Gq`z fd߫K;(|/[8_nW khއ|NңG9 Krڀ8w߿Va A7/g.X&xqhS 6hM0`ӏ-|x~xNF\&e󠖫^2;5V?8)"$@8&A%IyO)UWїFAI2xmw7GжnEu0Pg&[nt|$ӤwmKnѝEv3~AE kPa Cj4+l1ǯb?cNqB! LTtl KA3 N8%z ܻ~e X!3e7r7--/E <ːjȵ wI%s\Wl;w0<X4i6PѪ\C]duJϛ3Z#8L]lG_>$ fxQzo]Y-&!uUg9gq31J * ="!ڛ08g$3IK?? VVH155Ϸмϖ BX޸Xr4 ^JVsNNWO:烀1Rq~c9S率 ȩC9%"&W!X`m FBs$bU `bR@;5Kz?' ~՞p! V$4M:x1YC1Fr\RKE VǔSF2 "+]b%9 ^7[3炘3vtp"b`: ` %1̔XpdvE@ zbdJ}DPd?cI%0}b櫯{tQR@j]-4?/NH>(..\ \uǟ A؛}m٭]G7ܪeN ɖ|ܓZϛѝ]eLw*37R6p燽]jѻ t>#wk1 ܘw+ޭ 9s-eSeF;MI[Y BL3x6gn\VEC[r&Zئus|XvZ1;P݆?xu%ΰ o8#Zɪn+qrBwbMfRem zϩB0 OCbBIݼ 3Nj^c!r 4nGZ 8PBYQ"ױ9P0L˃0Tl4wfbnv(2Q]j.{ \`oeclE e0z?`o,u:?}tCZ.wdWCI&-Jg`If*El2/K{PZ%W@03r/dYa協䥻0M8F 7inܟI!ϟrTphb'09Z#%C.rqQ3F2𨓇'G'A4k$`wÉ9$#2(@tWy7c:BГAi&WQ0~zE ʬwBnjfFP7B=O$nJz`vRr\(Ƥi>"(N>ɇ].8יc${2g8p<$*~IGr`I;¾g5&3kIύ([pяul9F=n ,H͢ Q RDR<ŖBJ>p:{ @̳Ied;r&OJ=(ۑ R}IЛDi(Nh"M 4Pm̥T ` q c)fRι")Ō$t Ը(xVs(V`KP#[>]\a;Bu{EPӮ W0>.C.!=] wF;7Xw ~vdyY'n,4|{` 굶gntqoڌ4|s0LVW]DaNJyVcN827w 506B+Wi[NjBa9ύ[LE+5-c$mڃdm]8i[o<,Ɲl@xDVb`QyDV[f* -][o7+_8ZE q.l6'/'0ؼ:%Ee׻?řԚӗiۃa,댅B&ڥMq'lz7h\K{Dm#YԈѼ[犖z>ZWD;)Y(27EYT=Pv j"kՌ68= /u8Pchֵ~[]z{MlKq5ԞҐ[1ޡx= . ?6r aPK:3EXNRD(`6ia$%T + ;#VH6oa= | { q0tn0&/M ܚ(JÆbK,7ہ* u$a5br2"twdRjRsR& 7ɩZ黉ih ҏ|c~;1i NSAZ>RNndȘOwwx--^MVFEfkèxbD1-p|V䡖|b9&nR0[Kn1\ ɦ/[ɂc_YpPqk'<g;gab#_d?iI,Q'_\m|ٮFɷmWb@f:(cn L⎑!b8(8Wem00/SZ 9{eT2F=c).qS2)LS%oN1t bRK hQŔܕ.?tT35WңֈLkJqEyAh[ jB>!qY)XOp (@9ln/ RV^$5u]ˆ$%Um\]P3 ltrQ0̃J3(52Ăȥx B`!I!F;Wwn]]_?VB56TPJgMsqS& }3 U2QDv'kyO4%/yVY 9بbSL=M`h2>^^] e;];ن;چ$ r?i~Nz(RZI&8<'4֚1df0&J^-loՕ;%'J%\*K2H< X(5b>XRH]PUvkm ċnxPdtfU5i hGnǻ(kiUb )b\/uҒ 8Ckp*̌֙!0[dC4B޶_)~nv4%7'c(qy7g_zuAc㼜7?g◫Qߤ N~Zڽ|a-}800ǰ{ h~_Nu z2x ׷lHDƢSI59渆_:̵d)rnlW_goysg: ~zt:owWOxU9ͥy5p5 a6uKϫߺ-3vo7wSbMr*^^;lE}TLkD} vb;Gl bޔ*x{> %}/e>V?~@bԒ4h< ,"AbU^eȂȪymuCUbG4g*/"'B:[Owm<[6jG/)f wUMQe쳫𢯋eo{en4'O3UZȁ:Y0'z$J(=GƧ$(ERq>.< SPLѩO#j71(Tr zvbO'g Q4)1Vq9eB)R3OhnYҖܬNN'd95JJ )&t MoD LI+ Ta>|R&.a㺷 ^vo.z"-5Xg`dID֐JSEG'U+xzm!" 
_h4y9\.FA(bNw0`8Ы6ir&q ؎|ݍLy z3]T4w,,W>qnV沀w[kk= U~i/goaCZDg j9yPpIOrC/nuߝFw We߇ {\𽺞/.t~juj/#;Gc$jIმ/T_wn,SramGV4hOO<;|6Mxxu,@I3Y3,ݚk~T}t[{kŎni;0n 'sLfƴԓEK +( N JqyDԵt42y:ɶ$}d5)G2z&,GQ C_w BN}I;66;`ZPh= c8)%DhC}7$sWSPB3@#=dRuP(ȓJGy [*:_Jm~q=jor?: >gTQծ6jB*!dBQp#Kjk+c/3ƠɅ>E\r׍FN 4v=5e\H9C.dDenyb鑎wG|Xk e:ȸ-u0od1y!t1w/Zŏ7Tpq[3ԕG+UFQ*㹮S(aj'ZףFE2{T?baBxty7w׿iNSF_EV -@{'V́#%\X[n`qW!yCOb/x؟@{Pi%C?y58-Hۼ H {i^%Fs?}XLR; ٻ8$Wz`GD6XΎ1Ƭ_<8jT7e[67IbÖbe\"L~glglTp/l2 ; TW*:yT*icZg쏳lvY2{WG}:PE@_QG#Cst>6"5ka5 UI꾯bĉ5 #I~0:YDždd`=Fȭ5-,*NIa>h[mIPzKsՌT D#\8B֪QF!%:d4%!sګDR*T 08L1.m0bز6T&'Rsm,gs!K mjB Xv N6W۲BZHZX{(1el ai az}f"2- nͣ聯Z ]+,(vB" VUFk ԡXB #",- y@IjJFvw*a:UqcЊ) kH:\V>Xw)$Q*!UpdQhM!\IHMA7F(r1JxrMOѝjK1^ j́J!?9FO 7MЫ/&_9JiŃBλ.>7\x5Q2qh,7C2='dt4[;=AuhB1O״6ށ^..A_kqp)hQM[ ݻ,.Ɨg_9~_FHp/zQcFn圀Z̀(\Ot72z!cLOO̴F}cwϭ wO,!V%d C<`Nkq3>iQͿU\~Fb>if!|9j@m^`59H\7!ǗNwnK $>]REުC5k}G6@j%޹TK%:j zR|$]U'@/Jݬw} Յ|0$ZvĔ4YRM&pg%ׁhZn}#'|Mm8y5PfQ8_~E\y[ J2࿾[8'wZg2QH4>N,󇻆Q¢]Źyf%A@ha=]nrDB)u"+uM#z2(1cn%TnK, MM8׼$[_N1xc"otطwл a!DWlڥ̷yT6oer5]?-QҨ+wͫ?Ez9|Pt$z9vwQjؖӵnot-ul6J+KȍLp>[/"Qu?|~# dx]9Ua"Ƅ*4FT+ǣ|d(9ߒMf7\_hNs {Eu&qռ'%6W5{r]ׅRTr9g_X\sX?\\]ѧWAOӫzY>mogn>{tS*_4l~cw`}ِEP?JO?:u{ݻ}ѧ ,{ v"f;h^H^pH.7'-+䓡șT`aL-l[waժAa?FJOҶQ> t$$r!07w{J glngd){!^kV|1Q,CkyC;n8i osŻO1hбrSG)иvgaU% ]%G'C $|"rA+wJ>nr!!'y\p'!z%GvHW#,rN-T8آZi qSoJFxIsx*x~%_(RԖ19NE*th_թ?_\3aZxJ#S~J,p`fϸ/kC+d>ԧI< $CvMz9f hY5&x>13~*Ӡ)lW TYc2oπAJt?xX1="KJ{kJ b2.9g' "Tlll%jFK_1 $@L.xnR6T02ƣN y1Z,DU_p09.&As6q_XZ Sqi[̯o1ao*dH# c:g4b)xHfk Al8Zߩ0\ߏ#+s PEpK>07H<0s)ڊpGp{@Po„ia: 0%6̽W̭N;XT*wS}Fiz\||T7n +*`G8D-; mKFu)Y5[OkjE}/n'rxj Y\[i0û;bf2oQKAmsQmo),Neͮ0=Ln=O!0y[jO@æ[˰87wsk/ǾiT,ޥݩBWFK:&^fO^fEpxR,䙛h'y77ʠtƻ1: yOm y&Ǧ8sͻi88w+;Fv] s84օm y&٦h; Qizv<\c/7I/Pj?)y]-O0.kA4v\Bzse) ^6cwa#|:ޒc}ㅱ`M5mKޣST.V)< +E lJ+R`1XKC EX on0IT#EWmfKrKt1p2X% ѴITBʨb)h|Ju=P4<2PSDGx CaTyZ|j`0/VT ?S𥻩>_0J QfBLh>Nq^06L1'ϧ+,:m҉ I[HC%q )P*:0'd5 98ī׌f<}fŕ8>RW$ɇJzBJr"B08 ׷5yi]_ +a]BgP(8|/iCoo׭ ]e\j+rjo"Wׇ<~dx)˵?\%"wލ+8wsWz?_7!sӕR=>oͼel߰L=- ]9JȷU0U}-UxZL JiTI┦׷U~E̡œC kn>LvtӦ詭p$罅^%%p*; FRJP(qǁwtxm:.PI1 Iq;6pOP_?\oX]1>f~w;_MZgeVЈVKXiz鲣#R4kUwJ?W` irWіq[ɨdsݶJJ4`]P[+Yw8tga]% Z&ZM{),Cl&| OW5`[c)cS3[,5wi2Zo3%s"Kg>]-_Vpd|̫w?v1#}^㧟+xzwۉ ;Oq>/iW5syՇva^iyx"݊}нɠen3ӻq}$[dۺYkZwkhTѣ `L4՘W!.x[Ƹvr#[Bjӿc_Nq|"e埝>PmOyiQ@_ W{DMY0G*pèYMZ*P{Y$qPxWYmsFPF yћuYJy+sQ=mjd>㸊Hۿ~m٪6z?$I6i*$0cL:LWUĈ2r&cld;h e V~Wa.B{8*Ӏs]|n,,עF˧F/\.ڿR/[!S&ql$] y&/*q%AUZFT*xx:Bc$+#M)#Siثj&dJ#Y-=55)M}D)εaϹY%FuѲ_q.V~+ %rcr!UY}}΁~o}u(,ktP@f@>}iKj֢R.-ZI' 4hIܜ%(j5:ӗ8h]SW iv1= )%2$xQmqHu/4awu~}KL,hU{7A 3rb8Aܣ\AJ1IWNP4A1&P7l[X0zUD0&bӚF+V+>p f C0zP֓ *]YGr+¾yDdD"~0 ?-`)E(;{A*fqdD#ɀBnJ~rYŨ #BFFXCxfYS6D aL)4]Lq =כ@(c\9NĩSXn z oHVCR߸}ZUa#Hü#XCmp# 4*XOk//mMݔ{D<T/dP9P8#YYxj$44VdsfJБ\02*#l|ZeD*I!@5֍ oݕA3|{΁ߡ'sPl߫ .,ZNB'L Nk65]w#KΏ$ l?-`>o? 
y'c^v0Hr^dKwI{yNAt \yLKĉ:\pDt-E1$a$t֎PdL+!sMYo[IR#  σqkA8k\[;{ps0{d٘K^/8z&ͅ~ Fn[3:^kq 1d/f>~z BrzQ <9=n:׼ֲɡ7[;xkē)V(O\0Mr[R XZ{/y"չ&&_l M  ?2*쁲4A281E3R#G##!a첮m1OZa)AS* Z‘?w y۬ͬ*fCqCrlmoF_@`,@on HX^*ؤ%YYl$4]s9cUNG FčC |*@޲ `!]ll L}k)Nl ] EJBבlc`ŝ/Ѻo~Z$315v^z: Y v:KDzzMbu*iz"j29}.9 ?T8I5a-d'.,dU=^&dDӴek3R*BEDMNGt6up\տZQ}`*i`rvgX %;‰1x:uF'MX4`b5XFU,GEFD{i cW;kJ 71ud(bl\"6-u,kfWaAO|1;XyBNmQ?NDBJRo5墲Q:!td/{ϨuǪ Ztn*LyK ݳw\4Woh 8K߯>ʣF _Oըw?JTw5_~~`Ipﱏx6C0Ǜ{7h/z,_.Sfkzg7ۗ~T;k'//N$ブ{Ϝݖ9uEIGJ'JnuA)KW=R~[y?.Skj{r.FRK`y&gj5WW.^|ZBacOJW﹧Ǐo^×UKDPOa?.͸ۣw{~?:$A 3Exq[ȢWO?s|-u'5 R?:ظҒIzdŻs3}6 nHXTop =6ZOiv$zK ԴxmA~{f|rk>:?AyiB: .C?; :}|-zßӖ9]92dݡ?Lߟl x)}Ÿ<Épϑ%]\g.2qk9nh0l9- <{y:;2(\y`"vj 6O9]"ͣ?ir9Fs"C PzM۹svX]drLFۧ~5%E 5Fo Tcj"VK%Hm;PQ+hruMl6H7f67O8d[bV+h]4d{7MWK:Xx: Oc%X6k}m_'p_?ckAg`F}ZJbJ IMz>m`=zүyN2.KB6jx~ԽtԻש7]ڮs ^v<!?hnLK ٿ&N#fب:FmxgZUلt#ڝ}h54D~%L7 |It5%c},\8Y2A ©FzĦ>pnFt GpH;:eB@2!/nA\f=2 `Ecj Xkʉ7͙RuI:#k[u5rPWh YtPǨD&|)!ygA$RFJ:<44>s!6tɜJZ6WI#%S[iƥUc %4S(`j0ReL& 9`e} Ղ#+g5a`1ou׳P6fuҸ0oSmw Qj Qw@2mAjZn^M7fNnG}^217&T)&t[;@տy@GR W-zA]wof%LGٖ0r^'= _Dd1ɺ@0?&lRXIbGXIg_LЕShΓKo ?u%c3 ov[SJkӕ:&F0ېz>3X2]'Ƀ~o"zdP Cq8tG('!Rj 5Q[[e<S)yIX Р_Gr7$;'stKW[[p lZ%Ή$>1jFIrkiScV><\֮HA_7Y}o9 4{ ZB~e&_,I,j[R\Y ?ziL}J7o=-`}27ͭ]$:75M2m}ڗǿOS̈<@&E}|nj߼%|M!nytкI41W&xJM&L<4q: Y"^ /,2NmI:]tiA״Ka -KRXbZKQ-%c*GGRFXCxgVU3XM8\[WkCX :c $k5E=##!<86e_3Pm R5zBT~d“86rwmO~_?%*G!}8@@i-F0ן?m"}~<(5Xe4@p>@0N%ٻ6r$y9Ұb ۇyE͏H2.$-ɲRQınbՏ!̧˩*A|żRJFR0k:p ae q8u(NR- le `:aƿ=H :a鶈[>S|E>j0@'TOem v{,d_E;e4WmuZ#ƫBz< _+kR2HbM.+ L-y/8Bl}Dɣroȼ7kعA|tKH'@墄i򞣤C{< 1WEUjXć˂XN'Lpo2-T@/2kh;7e{PELz(H{VI d  Wwr)-Pv@G%\ҐN77Fx`[u+P,1ELި$stX%UYJ5;Jj $lZXzΉ$T"5 DVE" REQ&UQ{b~}H5\.~"Z)I:3+з߀i\J/#9x@vPF#N J?j];*#9F4ʴ q>XRB:lD۟jTOoQy Ih끢$tZR8^1d@Iz%اP4u뫨v'*@J& iekma՚&BSFk u:6wV!d@iKBjbc͡6ѧ o1Dhm Qbn1Ă鄭gc|HoaD$_MM< `#FHxaf$hj4wCp" \vl!=RӶ umf8bnحik :%͂?DjхTkY#-#`Q-ךZ_P̢UG*(sͅk؜YN Cjk/[5ڱ_q(5ϭ&v4sd8y۱on?PG,OmXvnQ"U i-U뼶s$T@m`ʘ6wI40*J*lfeNՊ@Y3㦼XTS Z)RR>8[|鬽": Ng8Y9Y @UsHkComۧw=e6-id Ay|j ROB94Q| J:OC9z@M{qp-~ўRP ƈ'Ֆ}~qjEj:S4nsV3;kȍzBGaZ!_Rop߶CNeCevt3D'6Mqh9ٟ)Ol^K}w?WҫU2:KhꇗR-ZHy岆$nn"$͊%ܬP4bh+K5B 1-!ގQB ,m$+dzhye%)6PI54+Nx,j.f.R!_2kTj-M8>6d{ӾW izƤ5kw I&FDz&1H<.TBe=$;7v{/푦*E'B+' )L`+uFhV8¤qdV fUfy)Aж)P }t dҊ*~PdRL Ʊ*3 Fض &YDuQHFFX"D~*jPDO p- hH^< F9頢ul{KrS1)4>hc9VXK混GcF*9ND'*Ld`V&J; v\#7{dLBЍUG&OGU"@mG~1nQ,彺xtQ כOmOcni||_ݻWbNa!c6no313&CQĿ˦,3nQ"$ v e g+viƉ*HhX mf!r =^eo D j9h43=b1lYC-zyK(`uK D헃=ϧ1+q4|JF8vtYjs}:=MVJdLZ;ŋ߾ձ,s6 :}S Y7s $ 5fFAݧۻcy6(%_u0+[@Y1H;Sy<*ٱϋgV&bu|@õtK4\Kf1ڕ(qV :z%gqF_R! ?B}1*P2"k }vʭh8:1lgH;Β1؋8;vJn㊴#^C%y)6)\o?\Uobbݝd578d|M8wo7ןV&( 6m"mn)K̨r&YilE epW?g~QKWon>m1Ё^^uSu{9|$f46یyu.vus[>a-tGb޵l`Y'Ɯ (bh ]πDZ@[p2 hbhjN1~bWXP>U}Jq\PX-g2Y\yeY,Rw%ۧg7^jTFs DO։*z"caTPu>}[R*\&ڈMz@vvG~ݹU9Z}_os*-~,[O?mY>'p`#vͧy {)n~\-fu)K<ٷXߛuy׸zs&$ZOvJ6uTyB~pdS;[m JLg5|$Q݆9z1,7A6eq{7;w*:Fw;./v~oDwB~pmSX`:cN!6W"A.D#/y qUߤ+q 0Yxbc,¦ k"G g:Q͉1UaSߐ3@Z MPKD\0 x˓ժ1?y=f.>([w5EcnB4*5Й1v>N u[]E$̒Ygf^we{wҀN N GTJmD5(%P=%2trR܍6H7)ŨI~?LQ>DCBl Z$B!EVU6蒓֊{T@Ľ3[k@?ni+2%j=1P)0ǜ\#!F"h\oT5ƺᲺ=#@-o2JGb26@5(; .5m$=8r[*37dxMM@IT 8؞B }#g%bjdC:Mj4~͝CY|)4g$盧~BLɿ__L7Fyӽ^^/]in>|]2O6gdvC)ʌW- \7bjPx( ٓ`C2dtDj wwm͍VV6UgSԺyXT}A2I )I5I I*sku|p.5uU">*"#pǸ ŸdP-73&y̽%6luͤ.Į܌*&f}]YR0?kg6))*є ]/#~) p鰾{m=2-w#{ٓ~ѣnMoz%"RU (pQiH5gZ.s@3 <9L 3 FsHL0Er?6P"ڽ>SiGgp9os"Cer>_J~YcHw;ܦ8sՇ%orbΣ';P $zLj^e';dx-.'/80 cJ 8In5M [sDqUsa<15vá|刴.Tca]!&_V .1fJ/Vnߊo1?/ y5Ń>d NUό( X%'H;#ĬU*8r~lZ~.V9a~3Qjmې xKǩvVBn!Sce'3! (@T~b 2IO[;-?aLxgs|4F=qiv;S\ֲlܠ2`e#Ci%;}L[w.2uƱ%K t{x+IN*巪 \].K? 
LoUzOU IW״eH‰lʮHRa𱯼UUd{\"|;oiFe=S') \"Uqx*VUڊzu{Y85Vq#q 9~zoߔC8vdxc< .u\,'p:U@Ѱ'{YJ ,+U,+0.J,?ܰr# P$†}Tz2OϹ\at.%ҥ4: C2  K6wAZ%E㗚ʓ,Υr!zrJz6g wS·)|9:"]Ë,2+|AA Nr 9r% 1]b(>h8tJJmBT߆+ y 0<78S!b9m|ABAEn9

192.168.126.11:17697: read: connection reset by peer" start-of-body=
Feb 19 15:09:45 crc kubenswrapper[4810]: I0219 15:09:45.329008 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:56004->192.168.126.11:17697: read: connection reset by peer"
Feb 19 15:09:45 crc kubenswrapper[4810]: I0219 15:09:45.348013 4810 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Feb 19 15:09:45 crc kubenswrapper[4810]: I0219 15:09:45.363171 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 06:40:25.325883947 +0000 UTC
Feb 19 15:09:45 crc kubenswrapper[4810]: I0219 15:09:45.567023 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 19 15:09:45 crc kubenswrapper[4810]: I0219 15:09:45.569347 4810 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f1b00844d1da0b4c140f9ebcb4acffd7825f60d193021a860831e5fd754b2790" exitCode=255
Feb 19 15:09:45 crc kubenswrapper[4810]: I0219 15:09:45.569493 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 15:09:45 crc kubenswrapper[4810]: I0219 15:09:45.569529 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f1b00844d1da0b4c140f9ebcb4acffd7825f60d193021a860831e5fd754b2790"}
Feb 19 15:09:45 crc kubenswrapper[4810]: I0219 15:09:45.569711 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 15:09:45 crc kubenswrapper[4810]: I0219 15:09:45.570861 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:09:45 crc kubenswrapper[4810]: I0219 15:09:45.570918 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:09:45 crc kubenswrapper[4810]: I0219 15:09:45.570929 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:09:45 crc kubenswrapper[4810]: I0219 15:09:45.571233 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:09:45 crc kubenswrapper[4810]: I0219 15:09:45.571256 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:09:45 crc kubenswrapper[4810]: I0219 15:09:45.571264 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:09:45 crc kubenswrapper[4810]: I0219 15:09:45.571866 4810 scope.go:117] "RemoveContainer" containerID="f1b00844d1da0b4c140f9ebcb4acffd7825f60d193021a860831e5fd754b2790"
Feb 19 15:09:45 crc kubenswrapper[4810]: W0219 15:09:45.717403 4810 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Feb 19 15:09:45 crc kubenswrapper[4810]: I0219 15:09:45.717480 4810 trace.go:236] Trace[1968846068]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 15:09:35.715) (total time: 10001ms):
Feb 19 15:09:45 crc kubenswrapper[4810]: Trace[1968846068]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:09:45.717)
Feb 19 15:09:45 crc kubenswrapper[4810]: Trace[1968846068]: [10.001945165s] [10.001945165s] END
Feb 19 15:09:45 crc kubenswrapper[4810]: E0219 15:09:45.717502 4810 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Feb 19 15:09:45 crc kubenswrapper[4810]: I0219 15:09:45.861070 4810 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 19 15:09:45 crc kubenswrapper[4810]: I0219 15:09:45.861182 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 19 15:09:46 crc kubenswrapper[4810]: I0219 15:09:46.364467 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 23:28:45.449930866 +0000 UTC
Feb 19 15:09:46 crc kubenswrapper[4810]: I0219 15:09:46.464135 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 15:09:46 crc kubenswrapper[4810]: I0219 15:09:46.498696 4810 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 19 15:09:46 crc kubenswrapper[4810]: I0219 15:09:46.498805 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 19 15:09:46 crc kubenswrapper[4810]: I0219 15:09:46.506350 4810 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 19 15:09:46 crc kubenswrapper[4810]: I0219 15:09:46.506642 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 19 15:09:46 crc kubenswrapper[4810]: I0219 15:09:46.576263 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 19 15:09:46 crc kubenswrapper[4810]: I0219 15:09:46.578311 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65"}
Feb 19 15:09:46 crc kubenswrapper[4810]: I0219 15:09:46.578466 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 15:09:46 crc kubenswrapper[4810]: I0219 15:09:46.579413 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:09:46 crc kubenswrapper[4810]: I0219 15:09:46.579573 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:09:46 crc kubenswrapper[4810]: I0219 15:09:46.579651 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:09:47 crc kubenswrapper[4810]: I0219 15:09:47.365814 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 02:54:04.513238034 +0000 UTC
Feb 19 15:09:47 crc kubenswrapper[4810]: I0219 15:09:47.535540 4810 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]log ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]etcd ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/start-apiserver-admission-initializer ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/openshift.io-api-request-count-filter ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/openshift.io-startkubeinformers ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/generic-apiserver-start-informers ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/priority-and-fairness-config-consumer ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/priority-and-fairness-filter ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/storage-object-count-tracker-hook ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/start-apiextensions-informers ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/start-apiextensions-controllers ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/crd-informer-synced ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/start-system-namespaces-controller ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/start-cluster-authentication-info-controller ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/start-legacy-token-tracking-controller ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/start-service-ip-repair-controllers ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/priority-and-fairness-config-producer ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/bootstrap-controller ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/aggregator-reload-proxy-client-cert ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/start-kube-aggregator-informers ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/apiservice-status-local-available-controller ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/apiservice-status-remote-available-controller ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/apiservice-registration-controller ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/apiservice-wait-for-first-sync ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/apiservice-discovery-controller ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/kube-apiserver-autoregistration ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]autoregister-completion ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/apiservice-openapi-controller ok
Feb 19 15:09:47 crc kubenswrapper[4810]: [+]poststarthook/apiservice-openapiv3-controller ok
Feb 19 15:09:47 crc kubenswrapper[4810]: livez check failed
Feb 19 15:09:47 crc kubenswrapper[4810]: I0219 15:09:47.538298 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 19 15:09:47 crc kubenswrapper[4810]: I0219 15:09:47.580968 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 15:09:47 crc kubenswrapper[4810]: I0219 15:09:47.581071 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 15:09:47 crc kubenswrapper[4810]: I0219 15:09:47.582116 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:09:47 crc kubenswrapper[4810]: I0219 15:09:47.582309 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:09:47 crc kubenswrapper[4810]: I0219 15:09:47.582474 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:09:48 crc kubenswrapper[4810]: I0219 15:09:48.367099 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 10:48:28.136455972 +0000 UTC
Feb 19 15:09:48 crc kubenswrapper[4810]: I0219 15:09:48.589409 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 15:09:48 crc kubenswrapper[4810]: I0219 15:09:48.590925 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:09:48 crc kubenswrapper[4810]: I0219 15:09:48.590964 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:09:48 crc kubenswrapper[4810]: I0219 15:09:48.590976 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:09:49 crc kubenswrapper[4810]: I0219 15:09:49.320719 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Feb 19 15:09:49 crc kubenswrapper[4810]: I0219 15:09:49.320954 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 15:09:49 crc kubenswrapper[4810]: I0219 15:09:49.322609 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:09:49 crc kubenswrapper[4810]: I0219 15:09:49.322673 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:09:49 crc kubenswrapper[4810]: I0219 15:09:49.322691 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:09:49 crc kubenswrapper[4810]: I0219 15:09:49.336371 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Feb 19 15:09:49 crc kubenswrapper[4810]: I0219 15:09:49.367845 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 15:04:15.430721913 +0000 UTC
Feb 19 15:09:49 crc kubenswrapper[4810]: I0219 15:09:49.591924 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 15:09:49 crc kubenswrapper[4810]: I0219 15:09:49.593178 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:09:49 crc kubenswrapper[4810]: I0219 15:09:49.593264 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:09:49 crc kubenswrapper[4810]: I0219 15:09:49.593283 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:09:49 crc kubenswrapper[4810]: I0219 15:09:49.737071 4810 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 19 15:09:50 crc kubenswrapper[4810]: I0219 15:09:50.343304 4810 apiserver.go:52] "Watching apiserver"
Feb 19 15:09:50 crc kubenswrapper[4810]: I0219 15:09:50.349912 4810 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 19 15:09:50 crc kubenswrapper[4810]: I0219 15:09:50.350654 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"]
Feb 19 15:09:50 crc kubenswrapper[4810]: I0219 15:09:50.351257 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 19 15:09:50 crc kubenswrapper[4810]: I0219 15:09:50.351439 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 15:09:50 crc kubenswrapper[4810]: I0219 15:09:50.351658 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 15:09:50 crc kubenswrapper[4810]: E0219 15:09:50.351756 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 15:09:50 crc kubenswrapper[4810]: E0219 15:09:50.351775 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 15:09:50 crc kubenswrapper[4810]: I0219 15:09:50.352048 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 19 15:09:50 crc kubenswrapper[4810]: I0219 15:09:50.352914 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 15:09:50 crc kubenswrapper[4810]: I0219 15:09:50.352946 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 19 15:09:50 crc kubenswrapper[4810]: E0219 15:09:50.352999 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 15:09:50 crc kubenswrapper[4810]: I0219 15:09:50.355541 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 19 15:09:50 crc kubenswrapper[4810]: I0219 15:09:50.355541 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 19 15:09:50 crc kubenswrapper[4810]: I0219 15:09:50.356846 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 19 15:09:50 crc kubenswrapper[4810]: I0219 15:09:50.356940 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 19 15:09:50 crc kubenswrapper[4810]: I0219 15:09:50.357104 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 19 15:09:50 crc kubenswrapper[4810]: I0219 15:09:50.357317 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 19 15:09:50 crc kubenswrapper[4810]: I0219 15:09:50.357671 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 19 15:09:50 crc kubenswrapper[4810]: I0219 15:09:50.357884 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 19 15:09:50 crc kubenswrapper[4810]: I0219 15:09:50.358070 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 19 15:09:50 crc kubenswrapper[4810]: I0219 15:09:50.368676 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 06:49:27.434453389 +0000 UTC
Feb 19 15:09:50 crc kubenswrapper[4810]: I0219 15:09:50.390628 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 19 15:09:50 crc kubenswrapper[4810]: I0219 15:09:50.408425 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 19 15:09:50 crc kubenswrapper[4810]: I0219 15:09:50.423906 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 19 15:09:50 crc kubenswrapper[4810]: I0219 15:09:50.444409 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 19 15:09:50 crc kubenswrapper[4810]: I0219 15:09:50.451078 4810 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Feb 19 15:09:50 crc kubenswrapper[4810]: I0219 15:09:50.457470 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 19 15:09:50 crc kubenswrapper[4810]: I0219 15:09:50.472307 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 19 15:09:50 crc kubenswrapper[4810]: I0219 15:09:50.484653 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.369247 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 06:25:15.070750763 +0000 UTC Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.438772 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.438794 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:09:51 crc kubenswrapper[4810]: E0219 15:09:51.438929 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:09:51 crc kubenswrapper[4810]: E0219 15:09:51.439264 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
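
[annotation] The records above share one root cause: every status patch must pass the "pod.network-node-identity.openshift.io" admission webhook, and its backend at 127.0.0.1:9743 refuses connections; the pod_workers errors add a second, independent symptom, an empty /etc/kubernetes/cni/net.d/. A minimal Go diagnostic for both signatures follows; the port and the directory are taken verbatim from the log, while the program itself is an illustrative sketch, not kubelet code.

// probe.go (hypothetical) -- reproduce the two failure signatures above:
// "dial tcp 127.0.0.1:9743: connect: connection refused" and
// "no CNI configuration file in /etc/kubernetes/cni/net.d/".
package main

import (
	"fmt"
	"net"
	"os"
	"time"
)

func main() {
	// The API server cannot POST to the webhook backend; a plain TCP dial
	// from the node reproduces the exact error string in the log.
	if conn, err := net.DialTimeout("tcp", "127.0.0.1:9743", 2*time.Second); err != nil {
		fmt.Println("webhook backend down:", err) // expect "connection refused"
	} else {
		conn.Close()
		fmt.Println("webhook backend is listening")
	}

	// NetworkPluginNotReady means no CNI config has been written here yet
	// (the network pods are themselves stuck in ContainerCreating).
	entries, err := os.ReadDir("/etc/kubernetes/cni/net.d/")
	if err != nil || len(entries) == 0 {
		fmt.Println("no CNI configuration present; err:", err)
		return
	}
	for _, e := range entries {
		fmt.Println("CNI config:", e.Name())
	}
}

Run it on the node itself, since both the port and the path are node-local; once the network-node-identity pod is up, the dial succeeds and the status patches above stop failing.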
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.457705 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.475642 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 15:09:51 crc kubenswrapper[4810]: E0219 15:09:51.491307 4810 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.493351 4810 trace.go:236] Trace[44012123]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 15:09:39.444) (total time: 12049ms): Feb 19 15:09:51 crc kubenswrapper[4810]: Trace[44012123]: ---"Objects listed" error: 12049ms (15:09:51.493) Feb 19 15:09:51 crc kubenswrapper[4810]: Trace[44012123]: [12.049199069s] [12.049199069s] END Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.493372 4810 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.494102 4810 trace.go:236] Trace[1393407943]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 15:09:39.807) (total time: 11686ms): Feb 19 15:09:51 crc kubenswrapper[4810]: Trace[1393407943]: ---"Objects listed" error: 11686ms (15:09:51.494) Feb 19 15:09:51 crc kubenswrapper[4810]: Trace[1393407943]: [11.686360415s] [11.686360415s] END Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.494120 4810 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.494459 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 15:09:51 crc kubenswrapper[4810]: E0219 15:09:51.496137 4810 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.496289 4810 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.497363 4810 trace.go:236] Trace[1148226231]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 15:09:38.918) (total time: 12578ms): Feb 19 15:09:51 crc kubenswrapper[4810]: Trace[1148226231]: ---"Objects listed" error: 12578ms (15:09:51.497) Feb 19 15:09:51 crc kubenswrapper[4810]: Trace[1148226231]: [12.578774292s] [12.578774292s] END Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.497645 4810 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.509181 4810 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.512574 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.530871 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.550042 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.596778 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.596848 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.596879 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.596909 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.596948 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.596975 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.597040 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 15:09:51 crc 
kubenswrapper[4810]: I0219 15:09:51.597963 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.597968 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.598055 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.598116 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.598374 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.598838 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.598877 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
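
[annotation] The reconciler entries in this stretch come in pairs: "operationExecutor.UnmountVolume started" carries the volume's UniqueName, and a matching "UnmountVolume.TearDown succeeded" closes it out. A small scraper, sketched below under the assumption that the saved log is fed on stdin (the file name is hypothetical, and this is analysis tooling, not kubelet code), pairs the two so the only output is volumes whose teardown never completes, such as the CSI volume a few records below.

// pair_unmounts.go (hypothetical) -- pair "UnmountVolume started" records
// with "UnmountVolume.TearDown succeeded" records by UniqueName and report
// the leftovers. Regexes match the exact phrases in this log, where quotes
// inside structured values appear escaped as \".
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

func main() {
	// started: ... for volume \"name\" (UniqueName: \"kubernetes.io/...\")
	startRe := regexp.MustCompile(`UnmountVolume started for volume .*?UniqueName: \\"(kubernetes\.io/[^"\\]+)\\"`)
	// succeeded: ... succeeded for volume "kubernetes.io/..." (plain quotes)
	doneRe := regexp.MustCompile(`TearDown succeeded for volume "(kubernetes\.io/[^"]+)"`)

	pending := map[string]bool{}
	sc := bufio.NewScanner(os.Stdin)
	// Lines in this dump run far past bufio's 64KB default token size.
	sc.Buffer(make([]byte, 0, 1024*1024), 4*1024*1024)
	for sc.Scan() {
		line := sc.Text()
		for _, m := range startRe.FindAllStringSubmatch(line, -1) {
			pending[m[1]] = true
		}
		for _, m := range doneRe.FindAllStringSubmatch(line, -1) {
			delete(pending, m[1])
		}
	}
	for v := range pending {
		fmt.Println("unmount started but never completed:", v)
	}
}

Usage: go run pair_unmounts.go < kubelet.log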
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.598929 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.598948 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.598979 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.599021 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.599043 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.599109 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.599717 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.599766 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.599810 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.599891 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.600286 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.600345 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.600394 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.600420 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.600415 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.600504 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.600546 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.600756 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.600797 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.600878 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.601120 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.601243 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.601309 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.601641 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.601750 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.602769 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.602883 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.604267 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.604423 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.604539 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.604612 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.604665 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.604983 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.605296 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.605394 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.605521 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.605759 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.605953 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.605989 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: E0219 15:09:51.606134 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:09:52.106106584 +0000 UTC m=+21.588136718 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.606167 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.606213 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.606312 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.606628 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.606704 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.606731 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.607120 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.607065 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
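
[annotation] The nestedpendingoperations record above shows the kubelet's per-volume retry gate: the failed TearDown is parked with "No retries permitted until" a timestamp 500ms out, and on repeated failures the delay grows before the reconciler may try again. The toy gate below sketches that observable behavior; the 500ms initial delay comes from the log, while the doubling and the two-minute cap are assumptions about the exponential backoff, not facts read from this excerpt.

// backoff_gate.go (hypothetical) -- illustrate the durationBeforeRetry
// gating seen above; a sketch, not the kubelet's implementation.
package main

import (
	"fmt"
	"time"
)

// gate parks a failing operation: no retries until lastFailure + delay.
type gate struct {
	lastFailure time.Time
	delay       time.Duration
}

func (g *gate) allowed(now time.Time) bool {
	return g.delay == 0 || now.After(g.lastFailure.Add(g.delay))
}

func (g *gate) fail(now time.Time) {
	g.lastFailure = now
	switch {
	case g.delay == 0:
		g.delay = 500 * time.Millisecond // initial durationBeforeRetry in the log
	case g.delay < 2*time.Minute: // growth and cap are assumptions
		g.delay *= 2
	}
}

func main() {
	g := &gate{}
	for i := 0; i < 5; i++ {
		now := time.Now()
		if g.allowed(now) {
			fmt.Println("retrying TearDown")
			g.fail(now) // the CSI driver is still unregistered, so it fails again
			fmt.Println("failed; no retries permitted for", g.delay)
		}
		time.Sleep(600 * time.Millisecond)
	}
}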
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.607254 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.607600 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.607279 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.607738 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.607880 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.607885 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.607913 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.608085 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.608159 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.608187 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.608817 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.608896 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.608737 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.608788 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.608993 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.609237 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.609398 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.609482 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.609728 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.609779 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.609844 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.610288 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.614796 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.614876 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.614914 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.614942 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.614976 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615000 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615024 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615048 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615072 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615096 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615121 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615154 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615177 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615201 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615235 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615276 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615309 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615371 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615402 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615426 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615456 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: 
\"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615493 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615542 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615570 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615605 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615629 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615656 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615685 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615707 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615729 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615753 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615775 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615800 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615821 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615842 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615870 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615891 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615913 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615945 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615968 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.615991 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616016 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616043 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616066 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616089 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616112 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616135 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616158 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616182 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616205 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616228 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616254 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616277 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616298 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616320 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616363 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616386 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616408 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616432 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616496 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616519 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616541 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616574 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616608 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616640 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616671 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616692 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616714 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616736 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616779 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616803 4810 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616827 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616854 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616878 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616902 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616925 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616947 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616969 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.616994 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.617024 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 15:09:51 crc 
kubenswrapper[4810]: I0219 15:09:51.617048 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.617070 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.617106 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.617130 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.617154 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.617176 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.617199 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.617223 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.617248 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.617270 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.617292 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.617314 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620339 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620387 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620418 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620454 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620483 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620510 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620537 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620563 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620595 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620624 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620649 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620674 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620705 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620730 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620754 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620779 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620806 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620832 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620854 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620880 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620906 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620932 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620959 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620989 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621014 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621038 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621059 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621094 4810 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621117 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621141 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621164 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621189 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621213 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621240 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621264 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621290 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621316 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621372 4810 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621395 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621420 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621446 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621498 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621522 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621545 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621570 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621599 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621624 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621650 4810 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621698 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621723 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621748 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621772 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621798 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621821 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621844 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621867 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621892 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621919 
4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621945 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621971 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621996 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622020 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622075 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622107 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622138 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622168 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622194 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622221 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622252 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622281 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622305 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622402 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622427 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622455 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622481 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") 
pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622505 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622586 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622603 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622619 4810 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622632 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622647 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622662 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622676 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622690 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622704 4810 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622718 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622732 4810 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath 
\"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622745 4810 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622758 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622772 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622786 4810 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622801 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622814 4810 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622828 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622842 4810 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622855 4810 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622868 4810 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622881 4810 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622894 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622909 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 
15:09:51.622922 4810 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622937 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622950 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622964 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622978 4810 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622991 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.623005 4810 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.623020 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.623040 4810 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.623054 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.623068 4810 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.623082 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.623096 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 
crc kubenswrapper[4810]: I0219 15:09:51.623109 4810 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.623124 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.623137 4810 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.623157 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.623171 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.618587 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.637164 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.618600 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.618908 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.618985 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.619029 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.619064 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.619258 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.619448 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.619635 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.619681 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.619670 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.619773 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.619726 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.619904 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620017 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620026 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620465 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.620880 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621140 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621219 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621237 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621582 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621862 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.621887 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.622177 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.623876 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.624646 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.625873 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.625955 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.625299 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.626207 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.626235 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.626435 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.626488 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.626590 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.626799 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.626884 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.627052 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.627435 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.627529 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.628039 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.628413 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.628670 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.628712 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.628896 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.628959 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.629158 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.629208 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.629464 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.629972 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.630665 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.630933 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.630928 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.632674 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.632891 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.632918 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.633269 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.633632 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.633865 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.634066 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.634187 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.634278 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.634510 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.635386 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.635982 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). 
InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.636097 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.636255 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.636490 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.636493 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.636881 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.637800 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.637095 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.637498 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.637625 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.637735 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: E0219 15:09:51.637908 4810 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.638033 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: E0219 15:09:51.638118 4810 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.638459 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.638735 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.639268 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.639298 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.639335 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.639007 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: E0219 15:09:51.639478 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 15:09:52.139417892 +0000 UTC m=+21.621448006 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.639652 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.639906 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.640113 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.640276 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: E0219 15:09:51.640412 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 15:09:51 crc kubenswrapper[4810]: E0219 15:09:51.640438 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 15:09:51 crc kubenswrapper[4810]: E0219 15:09:51.640459 4810 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:09:51 crc kubenswrapper[4810]: E0219 15:09:51.640584 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 15:09:52.14056075 +0000 UTC m=+21.622590894 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.640663 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.640789 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.641585 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.641781 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.642449 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.643880 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.643947 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.644305 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.644409 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: E0219 15:09:51.644768 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 15:09:52.144666852 +0000 UTC m=+21.626697156 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.644883 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.645826 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.646182 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.637125 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.648059 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.648108 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.649123 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.649522 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.650014 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.655955 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.650650 4810 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.656341 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.656419 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.656663 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.656948 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.660360 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.660493 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.660728 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.660799 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.660871 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.657124 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.661214 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: E0219 15:09:51.661429 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 15:09:51 crc kubenswrapper[4810]: E0219 15:09:51.661458 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 15:09:51 crc kubenswrapper[4810]: E0219 15:09:51.661493 4810 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.662169 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.662520 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.662801 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.663068 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: E0219 15:09:51.663214 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 15:09:52.163169332 +0000 UTC m=+21.645199466 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.664138 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.664363 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.664438 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.664663 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.664730 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.665043 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.665568 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.666064 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.666124 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.666122 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.667154 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.667950 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.667964 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.668045 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.668667 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.668764 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.668815 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.669060 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.669128 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.669081 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.669232 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.669284 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.669551 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). 
InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.672626 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.675723 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.675961 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.675762 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.675886 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.675892 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.676149 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.676260 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.677815 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.679407 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.679604 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.680466 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.684102 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.685981 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.696189 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.707539 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724419 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724473 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724533 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724546 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724555 4810 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724564 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724573 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724581 4810 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724590 4810 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724600 4810 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: 
I0219 15:09:51.724610 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724619 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724628 4810 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724636 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724644 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724653 4810 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724661 4810 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724670 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724678 4810 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724687 4810 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724695 4810 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724703 4810 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724712 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724722 4810 
reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724731 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724740 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724749 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724758 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724766 4810 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724775 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724779 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724783 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724807 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724816 4810 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724827 4810 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc 
kubenswrapper[4810]: I0219 15:09:51.724837 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724845 4810 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724854 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724863 4810 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724871 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724880 4810 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724890 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724900 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724910 4810 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724919 4810 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724928 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724937 4810 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724946 4810 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 19 
15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724955 4810 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724963 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724971 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724980 4810 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724988 4810 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.724997 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725006 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725015 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725023 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725031 4810 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725040 4810 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725048 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725056 4810 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725066 4810 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725074 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725083 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725096 4810 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725104 4810 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725113 4810 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725122 4810 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725131 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725139 4810 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725148 4810 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725157 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725176 4810 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725186 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725195 4810 reconciler_common.go:293] "Volume detached 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725203 4810 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725211 4810 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725220 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725228 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725236 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725245 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725254 4810 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725262 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725273 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725281 4810 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725289 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725297 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725305 
4810 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725313 4810 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725346 4810 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725356 4810 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725364 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725372 4810 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725380 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725389 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725398 4810 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725408 4810 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725416 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725424 4810 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725433 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725441 
4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725450 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725457 4810 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725465 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725473 4810 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725482 4810 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725494 4810 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725502 4810 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725510 4810 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725518 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725526 4810 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725535 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725542 4810 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725550 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" 
(UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725558 4810 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725567 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725574 4810 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725583 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725591 4810 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725600 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725608 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725616 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725624 4810 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725632 4810 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725641 4810 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725649 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725658 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725670 4810 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725681 4810 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725689 4810 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725698 4810 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725707 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725714 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725724 4810 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725732 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725739 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725748 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725756 4810 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725766 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725774 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725783 4810 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725792 4810 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725800 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725809 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725817 4810 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725825 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725834 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725843 4810 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725851 4810 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725860 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725869 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725877 4810 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725885 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" 
(UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725893 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725902 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.725911 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.872217 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.884709 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 15:09:51 crc kubenswrapper[4810]: I0219 15:09:51.890450 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 15:09:51 crc kubenswrapper[4810]: W0219 15:09:51.908573 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-1be1bd45f6ca011de1f6f387ecb73eb3ec669d743b57fc58aadfa5e2fbfbed90 WatchSource:0}: Error finding container 1be1bd45f6ca011de1f6f387ecb73eb3ec669d743b57fc58aadfa5e2fbfbed90: Status 404 returned error can't find the container with id 1be1bd45f6ca011de1f6f387ecb73eb3ec669d743b57fc58aadfa5e2fbfbed90 Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.131548 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:09:52 crc kubenswrapper[4810]: E0219 15:09:52.131793 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:09:53.131750258 +0000 UTC m=+22.613780382 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.233005 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.233055 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.233077 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.233097 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:09:52 crc kubenswrapper[4810]: E0219 15:09:52.233207 4810 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 15:09:52 crc kubenswrapper[4810]: E0219 15:09:52.233262 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 15:09:53.23324536 +0000 UTC m=+22.715275484 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 15:09:52 crc kubenswrapper[4810]: E0219 15:09:52.233655 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 15:09:52 crc kubenswrapper[4810]: E0219 15:09:52.233668 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 15:09:52 crc kubenswrapper[4810]: E0219 15:09:52.233679 4810 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:09:52 crc kubenswrapper[4810]: E0219 15:09:52.233702 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 15:09:53.233694811 +0000 UTC m=+22.715724935 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:09:52 crc kubenswrapper[4810]: E0219 15:09:52.233758 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 15:09:52 crc kubenswrapper[4810]: E0219 15:09:52.233767 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 15:09:52 crc kubenswrapper[4810]: E0219 15:09:52.233774 4810 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:09:52 crc kubenswrapper[4810]: E0219 15:09:52.233791 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 15:09:53.233785974 +0000 UTC m=+22.715816098 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:09:52 crc kubenswrapper[4810]: E0219 15:09:52.233816 4810 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 15:09:52 crc kubenswrapper[4810]: E0219 15:09:52.233835 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 15:09:53.233828875 +0000 UTC m=+22.715858989 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.370819 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 20:42:09.014127784 +0000 UTC Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.438619 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:09:52 crc kubenswrapper[4810]: E0219 15:09:52.438802 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.535702 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.540963 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.547469 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.548139 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.560052 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.587712 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.602640 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.603231 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.605076 4810 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65" exitCode=255 Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.605141 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65"} Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.605226 4810 scope.go:117] "RemoveContainer" containerID="f1b00844d1da0b4c140f9ebcb4acffd7825f60d193021a860831e5fd754b2790" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.607130 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4fcbafc57eac3fa9bee102fb74894c7065c4619a5ee2264d1274592f2301e55d"} Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.609848 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1"} Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.609882 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77"} Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.609897 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1be1bd45f6ca011de1f6f387ecb73eb3ec669d743b57fc58aadfa5e2fbfbed90"} Feb 19 15:09:52 crc 
kubenswrapper[4810]: I0219 15:09:52.611346 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b"} Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.611396 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"163227e588ccab22aa17179b63f314531c5f3e6219dab5243c4fddde2ef2e86f"} Feb 19 15:09:52 crc kubenswrapper[4810]: E0219 15:09:52.616066 4810 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.616256 4810 scope.go:117] "RemoveContainer" containerID="640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65" Feb 19 15:09:52 crc kubenswrapper[4810]: E0219 15:09:52.616417 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.617868 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.635965 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.637610 4810 csr.go:261] certificate signing request csr-q6c86 is approved, waiting to be issued Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.652836 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.653523 4810 csr.go:257] certificate signing request csr-q6c86 is issued Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.667568 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.679142 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.697108 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.710445 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.725640 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{
\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.737699 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f
1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b00844d1da0b4c140f9ebcb4acffd7825f60d193021a860831e5fd754b2790\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:45Z\\\",\\\"message\\\":\\\"W0219 15:09:34.723476 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 15:09:34.724418 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771513774 cert, and key in /tmp/serving-cert-2495124582/serving-signer.crt, /tmp/serving-cert-2495124582/serving-signer.key\\\\nI0219 15:09:35.062079 1 observer_polling.go:159] Starting file observer\\\\nW0219 15:09:35.062187 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 15:09:35.062309 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:35.064632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2495124582/tls.crt::/tmp/serving-cert-2495124582/tls.key\\\\\\\"\\\\nF0219 15:09:45.321489 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19
T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.756226 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:52Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.864902 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.871365 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.879383 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:52Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.901388 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:52Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.917086 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\
"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:52Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.921206 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.956204 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b00844d1da0b4c140f9ebcb4acffd7825f60d193021a860831e5fd754b2790\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:45Z\\\",\\\"message\\\":\\\"W0219 15:09:34.723476 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 
15:09:34.724418 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771513774 cert, and key in /tmp/serving-cert-2495124582/serving-signer.crt, /tmp/serving-cert-2495124582/serving-signer.key\\\\nI0219 15:09:35.062079 1 observer_polling.go:159] Starting file observer\\\\nW0219 15:09:35.062187 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 15:09:35.062309 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:35.064632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2495124582/tls.crt::/tmp/serving-cert-2495124582/tls.key\\\\\\\"\\\\nF0219 15:09:45.321489 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:52Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:52 crc kubenswrapper[4810]: I0219 15:09:52.984003 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:52Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.028295 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.077353 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.104761 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.116785 4810 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.129365 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.142683 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:09:53 crc kubenswrapper[4810]: E0219 15:09:53.142952 4810 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:09:55.142909947 +0000 UTC m=+24.624940101 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.144066 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b00844d1da0b4c140f9ebcb4acffd7825f60d193021a860831e5fd754b2790\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:45Z\\\",\\\"message\\\":\\\"W0219 15:09:34.723476 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 
15:09:34.724418 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771513774 cert, and key in /tmp/serving-cert-2495124582/serving-signer.crt, /tmp/serving-cert-2495124582/serving-signer.key\\\\nI0219 15:09:35.062079 1 observer_polling.go:159] Starting file observer\\\\nW0219 15:09:35.062187 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 15:09:35.062309 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:35.064632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2495124582/tls.crt::/tmp/serving-cert-2495124582/tls.key\\\\\\\"\\\\nF0219 15:09:45.321489 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.156688 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-flbx5"] Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.157082 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-bsztz"] Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.157245 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-flbx5" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.157376 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.160386 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.160414 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.160515 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.160706 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.161485 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.166712 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.166987 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.166988 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.171063 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.183728 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.195417 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.207632 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.222822 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b00844d1da0b4c140f9ebcb4acffd7825f60d193021a860831e5fd754b2790\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:45Z\\\",\\\"message\\\":\\\"W0219 15:09:34.723476 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 
15:09:34.724418 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771513774 cert, and key in /tmp/serving-cert-2495124582/serving-signer.crt, /tmp/serving-cert-2495124582/serving-signer.key\\\\nI0219 15:09:35.062079 1 observer_polling.go:159] Starting file observer\\\\nW0219 15:09:35.062187 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 15:09:35.062309 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:35.064632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2495124582/tls.crt::/tmp/serving-cert-2495124582/tls.key\\\\\\\"\\\\nF0219 15:09:45.321489 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.239088 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.243353 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-multus-conf-dir\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.243407 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-etc-kubernetes\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.243450 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.243482 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-multus-socket-dir-parent\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.243507 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-multus-cni-dir\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.243534 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-os-release\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.243558 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-host-var-lib-cni-multus\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.243588 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-host-var-lib-kubelet\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.243709 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.243799 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4cc3ea69-b881-4fd4-ad4d-42803f27865b-hosts-file\") pod \"node-resolver-flbx5\" (UID: \"4cc3ea69-b881-4fd4-ad4d-42803f27865b\") " pod="openshift-dns/node-resolver-flbx5" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.243829 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqph9\" (UniqueName: \"kubernetes.io/projected/4cc3ea69-b881-4fd4-ad4d-42803f27865b-kube-api-access-gqph9\") pod \"node-resolver-flbx5\" (UID: \"4cc3ea69-b881-4fd4-ad4d-42803f27865b\") " pod="openshift-dns/node-resolver-flbx5" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.243851 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a45a199-beeb-4972-b796-15c958fe99d3-cni-binary-copy\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.243872 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-host-var-lib-cni-bin\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.243893 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-hostroot\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: E0219 15:09:53.243948 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 15:09:53 crc kubenswrapper[4810]: E0219 15:09:53.243989 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 15:09:53 crc kubenswrapper[4810]: E0219 15:09:53.244002 4810 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.243968 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:09:53 crc kubenswrapper[4810]: E0219 15:09:53.244066 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 15:09:55.24404461 +0000 UTC m=+24.726074734 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:09:53 crc kubenswrapper[4810]: E0219 15:09:53.244091 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 15:09:53 crc kubenswrapper[4810]: E0219 15:09:53.244120 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 15:09:53 crc kubenswrapper[4810]: E0219 15:09:53.244133 4810 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.244092 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-cnibin\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: E0219 15:09:53.244190 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 15:09:55.244168163 +0000 UTC m=+24.726198347 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.244216 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-host-run-k8s-cni-cncf-io\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.244241 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2a45a199-beeb-4972-b796-15c958fe99d3-multus-daemon-config\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.244270 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-system-cni-dir\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.244294 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcg8h\" (UniqueName: \"kubernetes.io/projected/2a45a199-beeb-4972-b796-15c958fe99d3-kube-api-access-pcg8h\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.244318 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-host-run-netns\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.244358 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-host-run-multus-certs\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.244397 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:09:53 crc kubenswrapper[4810]: E0219 15:09:53.244477 4810 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 15:09:53 crc kubenswrapper[4810]: E0219 15:09:53.244514 4810 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 15:09:55.244504361 +0000 UTC m=+24.726534525 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 15:09:53 crc kubenswrapper[4810]: E0219 15:09:53.244597 4810 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 15:09:53 crc kubenswrapper[4810]: E0219 15:09:53.244653 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 15:09:55.244641895 +0000 UTC m=+24.726672219 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.258679 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.275059 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.291262 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.304263 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.322990 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.334055 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.345611 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-system-cni-dir\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.345742 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcg8h\" (UniqueName: \"kubernetes.io/projected/2a45a199-beeb-4972-b796-15c958fe99d3-kube-api-access-pcg8h\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.345824 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-host-run-netns\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.345865 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-system-cni-dir\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.345893 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-host-run-multus-certs\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.346036 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-etc-kubernetes\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.346094 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-multus-socket-dir-parent\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.346124 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-multus-conf-dir\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.346147 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-multus-cni-dir\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.346182 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-os-release\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.346257 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-host-var-lib-cni-multus\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.346282 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-host-var-lib-kubelet\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.346371 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4cc3ea69-b881-4fd4-ad4d-42803f27865b-hosts-file\") pod \"node-resolver-flbx5\" (UID: \"4cc3ea69-b881-4fd4-ad4d-42803f27865b\") " pod="openshift-dns/node-resolver-flbx5" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.346408 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqph9\" (UniqueName: \"kubernetes.io/projected/4cc3ea69-b881-4fd4-ad4d-42803f27865b-kube-api-access-gqph9\") pod \"node-resolver-flbx5\" (UID: \"4cc3ea69-b881-4fd4-ad4d-42803f27865b\") " pod="openshift-dns/node-resolver-flbx5" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.346454 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a45a199-beeb-4972-b796-15c958fe99d3-cni-binary-copy\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " 
pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.346477 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-hostroot\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.346517 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-cnibin\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.346544 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-host-run-k8s-cni-cncf-io\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.346568 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-host-var-lib-cni-bin\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.346591 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2a45a199-beeb-4972-b796-15c958fe99d3-multus-daemon-config\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.346422 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-host-run-multus-certs\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.347013 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-host-run-netns\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.347148 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-host-var-lib-kubelet\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.347253 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-etc-kubernetes\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.347367 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-multus-socket-dir-parent\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.347452 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-multus-conf-dir\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.347751 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-multus-cni-dir\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.347507 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4cc3ea69-b881-4fd4-ad4d-42803f27865b-hosts-file\") pod \"node-resolver-flbx5\" (UID: \"4cc3ea69-b881-4fd4-ad4d-42803f27865b\") " pod="openshift-dns/node-resolver-flbx5" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.348046 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a45a199-beeb-4972-b796-15c958fe99d3-cni-binary-copy\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.348111 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-hostroot\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.348145 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-host-var-lib-cni-multus\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.348159 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-cnibin\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.348183 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-host-run-k8s-cni-cncf-io\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.348257 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-host-var-lib-cni-bin\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.347463 4810 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2a45a199-beeb-4972-b796-15c958fe99d3-multus-daemon-config\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.348425 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2a45a199-beeb-4972-b796-15c958fe99d3-os-release\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.348722 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.366046 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcg8h\" (UniqueName: \"kubernetes.io/projected/2a45a199-beeb-4972-b796-15c958fe99d3-kube-api-access-pcg8h\") pod \"multus-bsztz\" (UID: \"2a45a199-beeb-4972-b796-15c958fe99d3\") " pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.366672 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.371498 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 09:10:53.975287492 +0000 UTC Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.374158 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqph9\" (UniqueName: \"kubernetes.io/projected/4cc3ea69-b881-4fd4-ad4d-42803f27865b-kube-api-access-gqph9\") pod \"node-resolver-flbx5\" (UID: \"4cc3ea69-b881-4fd4-ad4d-42803f27865b\") " pod="openshift-dns/node-resolver-flbx5" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.439738 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:09:53 crc kubenswrapper[4810]: E0219 15:09:53.439955 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.439760 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:09:53 crc kubenswrapper[4810]: E0219 15:09:53.440492 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.445765 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.446297 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.447638 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.451623 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.473449 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-flbx5" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.478722 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-bsztz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.487579 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: W0219 15:09:53.508150 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a45a199_beeb_4972_b796_15c958fe99d3.slice/crio-384873e32a6b0156755a3eec1f2afe540afe5a90b1090f7abd9efa003b900161 WatchSource:0}: Error finding container 384873e32a6b0156755a3eec1f2afe540afe5a90b1090f7abd9efa003b900161: Status 404 returned error can't find the container with id 384873e32a6b0156755a3eec1f2afe540afe5a90b1090f7abd9efa003b900161 Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.520444 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.521286 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.522444 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.523120 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.524110 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.524666 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.525638 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.526172 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.529785 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.556902 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.557565 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.600601 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.601320 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.601941 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.603029 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.603560 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.604563 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.605041 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.606100 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.606581 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.607197 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.609199 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.609706 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.610735 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.611258 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" 
path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.612703 4810 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.612816 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.614514 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.616978 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.619475 4810 scope.go:117] "RemoveContainer" containerID="640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65" Feb 19 15:09:53 crc kubenswrapper[4810]: E0219 15:09:53.619639 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.636002 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.639757 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.640711 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.643241 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.648798 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 19 15:09:53 
crc kubenswrapper[4810]: I0219 15:09:53.650133 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.651102 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.651984 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.653496 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.654446 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.654498 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.654731 4810 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-19 15:04:52 +0000 UTC, rotation deadline is 2026-12-26 04:10:58.342860789 +0000 UTC Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.654784 4810 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7429h1m4.688079321s for next certificate rotation Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.655513 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 19 15:09:53 crc 
kubenswrapper[4810]: I0219 15:09:53.656344 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.657438 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.658049 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.659236 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.660469 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.662019 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.665697 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.666179 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.666988 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.668569 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.669174 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.670190 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bsztz" event={"ID":"2a45a199-beeb-4972-b796-15c958fe99d3","Type":"ContainerStarted","Data":"384873e32a6b0156755a3eec1f2afe540afe5a90b1090f7abd9efa003b900161"} Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.670223 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-flbx5" event={"ID":"4cc3ea69-b881-4fd4-ad4d-42803f27865b","Type":"ContainerStarted","Data":"8009e6833f60100903c37a52f0f744b03f602571cf467798dc90511180417ded"} Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.670240 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-nmbsx"] Feb 19 15:09:53 crc 
kubenswrapper[4810]: I0219 15:09:53.673135 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.676472 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-t499d"] Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.676843 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.677547 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.681085 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.681369 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.681584 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.681739 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.681894 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.682273 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.682439 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.691455 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.715056 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.731804 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.754365 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/247e948b-3c17-4675-bd1c-f894b02d2817-cni-binary-copy\") pod \"multus-additional-cni-plugins-nmbsx\" (UID: \"247e948b-3c17-4675-bd1c-f894b02d2817\") " pod="openshift-multus/multus-additional-cni-plugins-nmbsx" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.754275 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.754543 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/247e948b-3c17-4675-bd1c-f894b02d2817-cnibin\") pod \"multus-additional-cni-plugins-nmbsx\" (UID: \"247e948b-3c17-4675-bd1c-f894b02d2817\") " pod="openshift-multus/multus-additional-cni-plugins-nmbsx" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.754694 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/dfbf74ef-1e94-4826-8583-42b2e246ccf3-rootfs\") pod \"machine-config-daemon-t499d\" (UID: \"dfbf74ef-1e94-4826-8583-42b2e246ccf3\") " 
pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.754983 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/247e948b-3c17-4675-bd1c-f894b02d2817-os-release\") pod \"multus-additional-cni-plugins-nmbsx\" (UID: \"247e948b-3c17-4675-bd1c-f894b02d2817\") " pod="openshift-multus/multus-additional-cni-plugins-nmbsx" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.755058 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmwlb\" (UniqueName: \"kubernetes.io/projected/247e948b-3c17-4675-bd1c-f894b02d2817-kube-api-access-qmwlb\") pod \"multus-additional-cni-plugins-nmbsx\" (UID: \"247e948b-3c17-4675-bd1c-f894b02d2817\") " pod="openshift-multus/multus-additional-cni-plugins-nmbsx" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.755101 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dfbf74ef-1e94-4826-8583-42b2e246ccf3-mcd-auth-proxy-config\") pod \"machine-config-daemon-t499d\" (UID: \"dfbf74ef-1e94-4826-8583-42b2e246ccf3\") " pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.755155 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfbf74ef-1e94-4826-8583-42b2e246ccf3-proxy-tls\") pod \"machine-config-daemon-t499d\" (UID: \"dfbf74ef-1e94-4826-8583-42b2e246ccf3\") " pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.755247 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/247e948b-3c17-4675-bd1c-f894b02d2817-system-cni-dir\") pod \"multus-additional-cni-plugins-nmbsx\" (UID: \"247e948b-3c17-4675-bd1c-f894b02d2817\") " pod="openshift-multus/multus-additional-cni-plugins-nmbsx" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.755280 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/247e948b-3c17-4675-bd1c-f894b02d2817-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nmbsx\" (UID: \"247e948b-3c17-4675-bd1c-f894b02d2817\") " pod="openshift-multus/multus-additional-cni-plugins-nmbsx" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.755315 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/247e948b-3c17-4675-bd1c-f894b02d2817-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nmbsx\" (UID: \"247e948b-3c17-4675-bd1c-f894b02d2817\") " pod="openshift-multus/multus-additional-cni-plugins-nmbsx" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.755384 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjrkt\" (UniqueName: \"kubernetes.io/projected/dfbf74ef-1e94-4826-8583-42b2e246ccf3-kube-api-access-kjrkt\") pod \"machine-config-daemon-t499d\" (UID: \"dfbf74ef-1e94-4826-8583-42b2e246ccf3\") " pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 15:09:53 
crc kubenswrapper[4810]: I0219 15:09:53.772697 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.789048 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.806403 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.825297 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.844639 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.856427 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/dfbf74ef-1e94-4826-8583-42b2e246ccf3-rootfs\") pod \"machine-config-daemon-t499d\" (UID: \"dfbf74ef-1e94-4826-8583-42b2e246ccf3\") " pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.856497 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/247e948b-3c17-4675-bd1c-f894b02d2817-os-release\") pod \"multus-additional-cni-plugins-nmbsx\" (UID: \"247e948b-3c17-4675-bd1c-f894b02d2817\") " pod="openshift-multus/multus-additional-cni-plugins-nmbsx" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.856520 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmwlb\" (UniqueName: \"kubernetes.io/projected/247e948b-3c17-4675-bd1c-f894b02d2817-kube-api-access-qmwlb\") pod \"multus-additional-cni-plugins-nmbsx\" (UID: \"247e948b-3c17-4675-bd1c-f894b02d2817\") " pod="openshift-multus/multus-additional-cni-plugins-nmbsx" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.856544 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dfbf74ef-1e94-4826-8583-42b2e246ccf3-mcd-auth-proxy-config\") pod \"machine-config-daemon-t499d\" (UID: \"dfbf74ef-1e94-4826-8583-42b2e246ccf3\") " pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.856579 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfbf74ef-1e94-4826-8583-42b2e246ccf3-proxy-tls\") pod \"machine-config-daemon-t499d\" (UID: \"dfbf74ef-1e94-4826-8583-42b2e246ccf3\") " pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.856608 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/247e948b-3c17-4675-bd1c-f894b02d2817-system-cni-dir\") pod \"multus-additional-cni-plugins-nmbsx\" (UID: \"247e948b-3c17-4675-bd1c-f894b02d2817\") " pod="openshift-multus/multus-additional-cni-plugins-nmbsx" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.856601 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/dfbf74ef-1e94-4826-8583-42b2e246ccf3-rootfs\") pod \"machine-config-daemon-t499d\" (UID: \"dfbf74ef-1e94-4826-8583-42b2e246ccf3\") " pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.856625 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/247e948b-3c17-4675-bd1c-f894b02d2817-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nmbsx\" (UID: \"247e948b-3c17-4675-bd1c-f894b02d2817\") " pod="openshift-multus/multus-additional-cni-plugins-nmbsx" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.856717 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/247e948b-3c17-4675-bd1c-f894b02d2817-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nmbsx\" (UID: \"247e948b-3c17-4675-bd1c-f894b02d2817\") " pod="openshift-multus/multus-additional-cni-plugins-nmbsx" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.856740 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjrkt\" (UniqueName: \"kubernetes.io/projected/dfbf74ef-1e94-4826-8583-42b2e246ccf3-kube-api-access-kjrkt\") pod \"machine-config-daemon-t499d\" (UID: \"dfbf74ef-1e94-4826-8583-42b2e246ccf3\") " pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.856757 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/247e948b-3c17-4675-bd1c-f894b02d2817-os-release\") pod \"multus-additional-cni-plugins-nmbsx\" (UID: \"247e948b-3c17-4675-bd1c-f894b02d2817\") " pod="openshift-multus/multus-additional-cni-plugins-nmbsx" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.856804 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/247e948b-3c17-4675-bd1c-f894b02d2817-cni-binary-copy\") pod \"multus-additional-cni-plugins-nmbsx\" (UID: \"247e948b-3c17-4675-bd1c-f894b02d2817\") " pod="openshift-multus/multus-additional-cni-plugins-nmbsx" Feb 19 
15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.856829 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/247e948b-3c17-4675-bd1c-f894b02d2817-cnibin\") pod \"multus-additional-cni-plugins-nmbsx\" (UID: \"247e948b-3c17-4675-bd1c-f894b02d2817\") " pod="openshift-multus/multus-additional-cni-plugins-nmbsx" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.856872 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/247e948b-3c17-4675-bd1c-f894b02d2817-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nmbsx\" (UID: \"247e948b-3c17-4675-bd1c-f894b02d2817\") " pod="openshift-multus/multus-additional-cni-plugins-nmbsx" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.856891 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/247e948b-3c17-4675-bd1c-f894b02d2817-cnibin\") pod \"multus-additional-cni-plugins-nmbsx\" (UID: \"247e948b-3c17-4675-bd1c-f894b02d2817\") " pod="openshift-multus/multus-additional-cni-plugins-nmbsx" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.857652 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dfbf74ef-1e94-4826-8583-42b2e246ccf3-mcd-auth-proxy-config\") pod \"machine-config-daemon-t499d\" (UID: \"dfbf74ef-1e94-4826-8583-42b2e246ccf3\") " pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.857717 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/247e948b-3c17-4675-bd1c-f894b02d2817-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nmbsx\" (UID: \"247e948b-3c17-4675-bd1c-f894b02d2817\") " pod="openshift-multus/multus-additional-cni-plugins-nmbsx" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.857828 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/247e948b-3c17-4675-bd1c-f894b02d2817-system-cni-dir\") pod \"multus-additional-cni-plugins-nmbsx\" (UID: \"247e948b-3c17-4675-bd1c-f894b02d2817\") " pod="openshift-multus/multus-additional-cni-plugins-nmbsx" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.857899 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/247e948b-3c17-4675-bd1c-f894b02d2817-cni-binary-copy\") pod \"multus-additional-cni-plugins-nmbsx\" (UID: \"247e948b-3c17-4675-bd1c-f894b02d2817\") " pod="openshift-multus/multus-additional-cni-plugins-nmbsx" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.861994 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfbf74ef-1e94-4826-8583-42b2e246ccf3-proxy-tls\") pod \"machine-config-daemon-t499d\" (UID: \"dfbf74ef-1e94-4826-8583-42b2e246ccf3\") " pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.862948 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.875728 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjrkt\" (UniqueName: \"kubernetes.io/projected/dfbf74ef-1e94-4826-8583-42b2e246ccf3-kube-api-access-kjrkt\") pod \"machine-config-daemon-t499d\" (UID: \"dfbf74ef-1e94-4826-8583-42b2e246ccf3\") " pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.878736 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmwlb\" (UniqueName: \"kubernetes.io/projected/247e948b-3c17-4675-bd1c-f894b02d2817-kube-api-access-qmwlb\") pod \"multus-additional-cni-plugins-nmbsx\" (UID: \"247e948b-3c17-4675-bd1c-f894b02d2817\") " pod="openshift-multus/multus-additional-cni-plugins-nmbsx" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.881116 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.898246 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.915127 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.924701 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8k7p5"] Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.925723 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.927349 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.927635 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.927895 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.928047 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.928567 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.928875 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.929112 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.932299 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.946122 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.958785 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.974380 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:53 crc kubenswrapper[4810]: I0219 15:09:53.987600 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.000759 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.000932 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: W0219 15:09:54.013176 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfbf74ef_1e94_4826_8583_42b2e246ccf3.slice/crio-a970a79f27d05e1726f430916427e3c1c23747211a4c62f3528a1df835ec6fdd WatchSource:0}: Error finding container a970a79f27d05e1726f430916427e3c1c23747211a4c62f3528a1df835ec6fdd: Status 404 returned error can't find 
the container with id a970a79f27d05e1726f430916427e3c1c23747211a4c62f3528a1df835ec6fdd Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.021021 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.026411 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: W0219 15:09:54.037472 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod247e948b_3c17_4675_bd1c_f894b02d2817.slice/crio-f4ae6f249d84406b535022e86552a2e5916555094ead428fd4bc4d3f7263983b WatchSource:0}: Error finding container f4ae6f249d84406b535022e86552a2e5916555094ead428fd4bc4d3f7263983b: Status 404 returned error can't find the container with id f4ae6f249d84406b535022e86552a2e5916555094ead428fd4bc4d3f7263983b Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.058882 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.058927 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-run-systemd\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.058946 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-env-overrides\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.058971 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-run-ovn\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.058988 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-ovnkube-script-lib\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.059130 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-var-lib-openvswitch\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.059183 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-etc-openvswitch\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.059202 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-log-socket\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.059219 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7xqv\" (UniqueName: \"kubernetes.io/projected/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-kube-api-access-v7xqv\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.059299 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-kubelet\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.059319 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-node-log\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.059354 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-ovnkube-config\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.059371 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-ovn-node-metrics-cert\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.059387 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-run-openvswitch\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.059404 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-systemd-units\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.059427 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-cni-bin\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.059444 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-run-ovn-kubernetes\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.059458 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-cni-netd\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.059473 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-slash\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.059488 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-run-netns\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.059511 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.114295 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.154767 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.159835 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-etc-openvswitch\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.159873 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-log-socket\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.159894 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7xqv\" (UniqueName: 
\"kubernetes.io/projected/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-kube-api-access-v7xqv\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.159924 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-kubelet\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.159938 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-node-log\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.159958 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-ovnkube-config\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.159972 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-ovn-node-metrics-cert\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.159988 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-run-openvswitch\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.159992 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-etc-openvswitch\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.160012 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-systemd-units\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.160057 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-systemd-units\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.160111 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-cni-bin\") pod \"ovnkube-node-8k7p5\" (UID: 
\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.160152 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-run-ovn-kubernetes\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.160176 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-cni-netd\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.160178 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-node-log\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.160255 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-cni-bin\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.160232 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-slash\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.160201 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-slash\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.160304 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-run-ovn-kubernetes\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.160351 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-run-netns\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.160369 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-kubelet\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 
15:09:54.160402 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.160418 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-run-openvswitch\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.160387 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-run-netns\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.160358 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-cni-netd\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.160448 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.160500 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-run-systemd\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.160521 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-env-overrides\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.160553 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-run-ovn\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.160553 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-run-systemd\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.160578 4810 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-ovnkube-script-lib\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.160604 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-var-lib-openvswitch\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.160645 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-run-ovn\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.160671 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-var-lib-openvswitch\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.160721 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-log-socket\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.160939 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-ovnkube-config\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.161185 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-env-overrides\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.161652 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-ovnkube-script-lib\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.164426 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-ovn-node-metrics-cert\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.179933 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.189199 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7xqv\" (UniqueName: \"kubernetes.io/projected/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-kube-api-access-v7xqv\") pod \"ovnkube-node-8k7p5\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.203218 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.223703 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.235504 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.241664 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:09:54 crc kubenswrapper[4810]: W0219 15:09:54.254618 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5a8a15c_53e8_4868_8feb_dcd4e83939a4.slice/crio-18f9a8e62a518ac5ed0415309aa8b6acc10ac0aef8f801aae58cbe85d9127027 WatchSource:0}: Error finding container 18f9a8e62a518ac5ed0415309aa8b6acc10ac0aef8f801aae58cbe85d9127027: Status 404 returned error can't find the container with id 18f9a8e62a518ac5ed0415309aa8b6acc10ac0aef8f801aae58cbe85d9127027 Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.256856 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18
fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.274073 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.291677 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.306378 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.330769 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host
-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.373011 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 17:16:45.592358663 +0000 UTC Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.439175 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:09:54 crc kubenswrapper[4810]: E0219 15:09:54.439303 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.622450 4810 generic.go:334] "Generic (PLEG): container finished" podID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerID="3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c" exitCode=0 Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.622657 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" event={"ID":"c5a8a15c-53e8-4868-8feb-dcd4e83939a4","Type":"ContainerDied","Data":"3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c"} Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.622793 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" event={"ID":"c5a8a15c-53e8-4868-8feb-dcd4e83939a4","Type":"ContainerStarted","Data":"18f9a8e62a518ac5ed0415309aa8b6acc10ac0aef8f801aae58cbe85d9127027"} Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.623563 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" event={"ID":"247e948b-3c17-4675-bd1c-f894b02d2817","Type":"ContainerStarted","Data":"7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64"} Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.623584 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" event={"ID":"247e948b-3c17-4675-bd1c-f894b02d2817","Type":"ContainerStarted","Data":"f4ae6f249d84406b535022e86552a2e5916555094ead428fd4bc4d3f7263983b"} Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.625115 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2"} Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.626349 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-flbx5" event={"ID":"4cc3ea69-b881-4fd4-ad4d-42803f27865b","Type":"ContainerStarted","Data":"07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b"} Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.632886 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bsztz" event={"ID":"2a45a199-beeb-4972-b796-15c958fe99d3","Type":"ContainerStarted","Data":"a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2"} Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 
15:09:54.634791 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerStarted","Data":"4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c"} Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.634854 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerStarted","Data":"ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651"} Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.634868 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerStarted","Data":"a970a79f27d05e1726f430916427e3c1c23747211a4c62f3528a1df835ec6fdd"} Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.646452 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.659312 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.677073 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.696425 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.709373 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.724791 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.740994 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.758472 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.771540 4810 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-flbx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.790553 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-li
b\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.831562 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.852011 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.896800 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.909395 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.926089 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.944128 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.955301 4810 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-flbx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.978109 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-li
b\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:54 crc kubenswrapper[4810]: I0219 15:09:54.992735 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:54Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.007385 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:55Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.027107 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:55Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.048064 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:55Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.061528 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:55Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.077032 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-19T15:09:55Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.094179 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:55Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.108008 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:55Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.169935 4810 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:09:55 crc kubenswrapper[4810]: E0219 15:09:55.170162 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:09:59.170119848 +0000 UTC m=+28.652149972 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.272413 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.272465 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.272493 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.272525 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:09:55 crc kubenswrapper[4810]: E0219 15:09:55.272587 4810 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 15:09:55 crc kubenswrapper[4810]: E0219 15:09:55.272653 4810 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 15:09:55 crc kubenswrapper[4810]: E0219 15:09:55.272669 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 15:09:55 crc kubenswrapper[4810]: E0219 15:09:55.272699 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 15:09:55 crc kubenswrapper[4810]: E0219 15:09:55.272675 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 15:09:59.272652646 +0000 UTC m=+28.754682770 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 15:09:55 crc kubenswrapper[4810]: E0219 15:09:55.272713 4810 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:09:55 crc kubenswrapper[4810]: E0219 15:09:55.272719 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 15:09:55 crc kubenswrapper[4810]: E0219 15:09:55.272769 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 15:09:55 crc kubenswrapper[4810]: E0219 15:09:55.272787 4810 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:09:55 crc kubenswrapper[4810]: E0219 15:09:55.272739 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 15:09:59.272717778 +0000 UTC m=+28.754747902 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 15:09:55 crc kubenswrapper[4810]: E0219 15:09:55.272886 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 15:09:59.272858051 +0000 UTC m=+28.754888325 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:09:55 crc kubenswrapper[4810]: E0219 15:09:55.272917 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 15:09:59.272906452 +0000 UTC m=+28.754936816 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.373636 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 16:53:00.164239586 +0000 UTC Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.439059 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.439054 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:09:55 crc kubenswrapper[4810]: E0219 15:09:55.439241 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:09:55 crc kubenswrapper[4810]: E0219 15:09:55.439396 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.647369 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" event={"ID":"c5a8a15c-53e8-4868-8feb-dcd4e83939a4","Type":"ContainerStarted","Data":"5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92"} Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.647424 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" event={"ID":"c5a8a15c-53e8-4868-8feb-dcd4e83939a4","Type":"ContainerStarted","Data":"0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b"} Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.647434 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" event={"ID":"c5a8a15c-53e8-4868-8feb-dcd4e83939a4","Type":"ContainerStarted","Data":"879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d"} Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.647443 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" event={"ID":"c5a8a15c-53e8-4868-8feb-dcd4e83939a4","Type":"ContainerStarted","Data":"6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1"} Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.647452 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" event={"ID":"c5a8a15c-53e8-4868-8feb-dcd4e83939a4","Type":"ContainerStarted","Data":"702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663"} Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.647461 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" event={"ID":"c5a8a15c-53e8-4868-8feb-dcd4e83939a4","Type":"ContainerStarted","Data":"5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459"} Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.648740 4810 generic.go:334] "Generic (PLEG): container finished" podID="247e948b-3c17-4675-bd1c-f894b02d2817" containerID="7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64" exitCode=0 Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.648882 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" event={"ID":"247e948b-3c17-4675-bd1c-f894b02d2817","Type":"ContainerDied","Data":"7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64"} Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.669785 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:55Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.704277 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:55Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.729040 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:55Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.752977 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:55Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.767187 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:55Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.781360 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:55Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.805080 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:55Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.816947 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-t9jnq"] Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.817539 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-t9jnq" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.819409 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.819898 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.820380 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.820620 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-c
onfig\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:55Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.820838 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.840576 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:55Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.862065 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:55Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:55 crc 
kubenswrapper[4810]: I0219 15:09:55.881259 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:55Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.897933 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:55Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.910919 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:55Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.925837 4810 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:55Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.941472 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:55Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.956138 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:55Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.981392 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:55Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.981566 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/60d7228b-14bd-4988-8dca-cb89f487ba00-serviceca\") pod \"node-ca-t9jnq\" (UID: \"60d7228b-14bd-4988-8dca-cb89f487ba00\") " pod="openshift-image-registry/node-ca-t9jnq" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.981623 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/60d7228b-14bd-4988-8dca-cb89f487ba00-host\") pod \"node-ca-t9jnq\" (UID: 
\"60d7228b-14bd-4988-8dca-cb89f487ba00\") " pod="openshift-image-registry/node-ca-t9jnq" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.981693 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gzvj\" (UniqueName: \"kubernetes.io/projected/60d7228b-14bd-4988-8dca-cb89f487ba00-kube-api-access-5gzvj\") pod \"node-ca-t9jnq\" (UID: \"60d7228b-14bd-4988-8dca-cb89f487ba00\") " pod="openshift-image-registry/node-ca-t9jnq" Feb 19 15:09:55 crc kubenswrapper[4810]: I0219 15:09:55.994303 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:55Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.010395 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:56Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:56 crc 
kubenswrapper[4810]: I0219 15:09:56.023216 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:56Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.035365 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:56Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.066429 4810 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:56Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.082754 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/60d7228b-14bd-4988-8dca-cb89f487ba00-serviceca\") pod \"node-ca-t9jnq\" (UID: \"60d7228b-14bd-4988-8dca-cb89f487ba00\") " pod="openshift-image-registry/node-ca-t9jnq" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.082836 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/60d7228b-14bd-4988-8dca-cb89f487ba00-host\") pod \"node-ca-t9jnq\" (UID: \"60d7228b-14bd-4988-8dca-cb89f487ba00\") " pod="openshift-image-registry/node-ca-t9jnq" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.082897 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gzvj\" (UniqueName: 
\"kubernetes.io/projected/60d7228b-14bd-4988-8dca-cb89f487ba00-kube-api-access-5gzvj\") pod \"node-ca-t9jnq\" (UID: \"60d7228b-14bd-4988-8dca-cb89f487ba00\") " pod="openshift-image-registry/node-ca-t9jnq" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.083106 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/60d7228b-14bd-4988-8dca-cb89f487ba00-host\") pod \"node-ca-t9jnq\" (UID: \"60d7228b-14bd-4988-8dca-cb89f487ba00\") " pod="openshift-image-registry/node-ca-t9jnq" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.084087 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/60d7228b-14bd-4988-8dca-cb89f487ba00-serviceca\") pod \"node-ca-t9jnq\" (UID: \"60d7228b-14bd-4988-8dca-cb89f487ba00\") " pod="openshift-image-registry/node-ca-t9jnq" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.107541 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPa
th\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:56Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.140472 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gzvj\" (UniqueName: \"kubernetes.io/projected/60d7228b-14bd-4988-8dca-cb89f487ba00-kube-api-access-5gzvj\") pod \"node-ca-t9jnq\" (UID: \"60d7228b-14bd-4988-8dca-cb89f487ba00\") " pod="openshift-image-registry/node-ca-t9jnq" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.156883 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-t9jnq" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.167718 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:56Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.207290 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:56Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.245749 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:56Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.297466 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:56Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.374798 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 13:39:22.43422633 +0000 UTC Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.438525 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:09:56 crc kubenswrapper[4810]: E0219 15:09:56.438676 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.464723 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.465688 4810 scope.go:117] "RemoveContainer" containerID="640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65" Feb 19 15:09:56 crc kubenswrapper[4810]: E0219 15:09:56.465970 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.655929 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" event={"ID":"247e948b-3c17-4675-bd1c-f894b02d2817","Type":"ContainerStarted","Data":"395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd"} Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.657117 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-t9jnq" event={"ID":"60d7228b-14bd-4988-8dca-cb89f487ba00","Type":"ContainerStarted","Data":"4c381c6531fba5bdfc94af20d71b77afd5e67b0cb6fc49bd037be0b5bf8f26e5"} Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.676302 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:56Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.699974 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:56Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.711409 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:56Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.735923 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:56Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.754488 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:56Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.768220 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:56Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.781406 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:56Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.800425 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:56Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.823459 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:56Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.838116 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:56Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.854769 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:56Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.867978 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:56Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.885184 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:56Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:56 crc kubenswrapper[4810]: I0219 15:09:56.896968 4810 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:56Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.375577 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 20:13:24.488323314 +0000 UTC Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.438423 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.438518 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:09:57 crc kubenswrapper[4810]: E0219 15:09:57.438634 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:09:57 crc kubenswrapper[4810]: E0219 15:09:57.438858 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.662976 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-t9jnq" event={"ID":"60d7228b-14bd-4988-8dca-cb89f487ba00","Type":"ContainerStarted","Data":"cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3"} Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.666747 4810 generic.go:334] "Generic (PLEG): container finished" podID="247e948b-3c17-4675-bd1c-f894b02d2817" containerID="395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd" exitCode=0 Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.666820 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" event={"ID":"247e948b-3c17-4675-bd1c-f894b02d2817","Type":"ContainerDied","Data":"395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd"} Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.688822 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:57Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.711146 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:57Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.735920 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod 
\"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:57Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.759445 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:57Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.779854 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:57Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.797509 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:57Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.816472 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:57Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.841707 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:57Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.858407 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:57Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.875141 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:57Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.894094 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:57Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.896270 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.898581 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.899003 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.899016 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.899135 4810 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.908694 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:57Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.909361 4810 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.909657 4810 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.910788 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.910824 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.910836 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.910852 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 
15:09:57.910861 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:09:57Z","lastTransitionTime":"2026-02-19T15:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.923718 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:57Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:57 crc kubenswrapper[4810]: E0219 15:09:57.931302 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:57Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.935532 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.935564 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.935572 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.935588 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.935597 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:09:57Z","lastTransitionTime":"2026-02-19T15:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.939928 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:57Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:57 crc kubenswrapper[4810]: E0219 15:09:57.955952 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:57Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.960189 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.960249 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.960265 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.960286 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.960300 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:09:57Z","lastTransitionTime":"2026-02-19T15:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.964947 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:57Z 
is after 2025-08-24T17:21:41Z" Feb 19 15:09:57 crc kubenswrapper[4810]: E0219 15:09:57.973772 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:57Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.978834 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.978863 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.978884 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.978903 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.978914 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:09:57Z","lastTransitionTime":"2026-02-19T15:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.979149 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-c
erts\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:57Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:57 crc kubenswrapper[4810]: E0219 15:09:57.992388 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"6
0bcb373-142f-4da9-846e-4d055863e63a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:57Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.996120 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:57Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.996574 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.996603 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.996612 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.996629 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:09:57 crc kubenswrapper[4810]: I0219 15:09:57.996640 4810 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:09:57Z","lastTransitionTime":"2026-02-19T15:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:09:58 crc kubenswrapper[4810]: E0219 15:09:58.008257 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:58Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:58 crc kubenswrapper[4810]: E0219 15:09:58.008443 4810 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.010231 4810 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:58Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.011450 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.011473 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:09:58 crc kubenswrapper[4810]: 
I0219 15:09:58.011482 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.011499 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.011512 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:09:58Z","lastTransitionTime":"2026-02-19T15:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.020645 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:58Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.034065 4810 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:58Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.048472 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:58Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.060822 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:58Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.078970 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin
\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
,{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:58Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.096075 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:58Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.113526 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:58Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.113970 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.114011 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.114027 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.114050 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.114067 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:09:58Z","lastTransitionTime":"2026-02-19T15:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.133628 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:58Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.152486 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:58Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.165881 4810 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:58Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.216613 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.216652 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.216661 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.216677 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.216689 4810 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:09:58Z","lastTransitionTime":"2026-02-19T15:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.320480 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.320543 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.320564 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.320594 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.320615 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:09:58Z","lastTransitionTime":"2026-02-19T15:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.375743 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 19:46:29.279403469 +0000 UTC Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.423400 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.423469 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.423489 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.423521 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.423539 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:09:58Z","lastTransitionTime":"2026-02-19T15:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.439208 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:09:58 crc kubenswrapper[4810]: E0219 15:09:58.439403 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.526944 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.526998 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.527014 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.527072 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.527091 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:09:58Z","lastTransitionTime":"2026-02-19T15:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.629553 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.629617 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.629636 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.629669 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.629692 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:09:58Z","lastTransitionTime":"2026-02-19T15:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.682018 4810 generic.go:334] "Generic (PLEG): container finished" podID="247e948b-3c17-4675-bd1c-f894b02d2817" containerID="f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450" exitCode=0 Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.682142 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" event={"ID":"247e948b-3c17-4675-bd1c-f894b02d2817","Type":"ContainerDied","Data":"f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450"} Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.688086 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" event={"ID":"c5a8a15c-53e8-4868-8feb-dcd4e83939a4","Type":"ContainerStarted","Data":"e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6"} Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.714240 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:58Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.729599 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:58Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.734036 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.734126 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.734154 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.734192 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.734220 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:09:58Z","lastTransitionTime":"2026-02-19T15:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.750844 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:58Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.764983 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:58Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.777217 4810 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:58Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.794612 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:58Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.810930 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:58Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.829244 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:58Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.837138 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.837192 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.837206 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.837225 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.837239 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:09:58Z","lastTransitionTime":"2026-02-19T15:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.848096 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"c
ri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:58Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.860012 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:58Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.874708 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:58Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.890617 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\
":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:58Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.904922 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:58Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.916387 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:58Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.940680 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.940744 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.940757 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.940778 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:09:58 crc kubenswrapper[4810]: I0219 15:09:58.940791 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:09:58Z","lastTransitionTime":"2026-02-19T15:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.043956 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.044013 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.044023 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.044046 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.044058 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:09:59Z","lastTransitionTime":"2026-02-19T15:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.147458 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.147522 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.147536 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.147558 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.147572 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:09:59Z","lastTransitionTime":"2026-02-19T15:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.217486 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:09:59 crc kubenswrapper[4810]: E0219 15:09:59.217790 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:10:07.217746311 +0000 UTC m=+36.699776435 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.250714 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.250789 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.250811 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.250842 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.250860 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:09:59Z","lastTransitionTime":"2026-02-19T15:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.318776 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.318837 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.318875 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.318915 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:09:59 crc kubenswrapper[4810]: E0219 15:09:59.319038 4810 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 15:09:59 crc kubenswrapper[4810]: E0219 15:09:59.319067 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 15:09:59 crc kubenswrapper[4810]: E0219 15:09:59.319103 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 15:10:07.319081369 +0000 UTC m=+36.801111493 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 15:09:59 crc kubenswrapper[4810]: E0219 15:09:59.319107 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 15:09:59 crc kubenswrapper[4810]: E0219 15:09:59.319129 4810 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:09:59 crc kubenswrapper[4810]: E0219 15:09:59.319143 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 15:09:59 crc kubenswrapper[4810]: E0219 15:09:59.319215 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 15:09:59 crc kubenswrapper[4810]: E0219 15:09:59.319237 4810 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:09:59 crc kubenswrapper[4810]: E0219 15:09:59.319258 4810 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 15:09:59 crc kubenswrapper[4810]: E0219 15:09:59.319198 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 15:10:07.319173342 +0000 UTC m=+36.801203566 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:09:59 crc kubenswrapper[4810]: E0219 15:09:59.319481 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 15:10:07.319448289 +0000 UTC m=+36.801478413 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:09:59 crc kubenswrapper[4810]: E0219 15:09:59.319500 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 15:10:07.31949365 +0000 UTC m=+36.801523774 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.354831 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.354917 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.354941 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.354971 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.354995 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:09:59Z","lastTransitionTime":"2026-02-19T15:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.376350 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 05:00:17.173644083 +0000 UTC Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.438391 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:09:59 crc kubenswrapper[4810]: E0219 15:09:59.438556 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.438695 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:09:59 crc kubenswrapper[4810]: E0219 15:09:59.438912 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.458187 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.458245 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.458256 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.458278 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.458762 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:09:59Z","lastTransitionTime":"2026-02-19T15:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.561673 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.561727 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.561737 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.561780 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.561792 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:09:59Z","lastTransitionTime":"2026-02-19T15:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.665169 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.665227 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.665240 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.665261 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.665275 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:09:59Z","lastTransitionTime":"2026-02-19T15:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.697375 4810 generic.go:334] "Generic (PLEG): container finished" podID="247e948b-3c17-4675-bd1c-f894b02d2817" containerID="f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158" exitCode=0 Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.697460 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" event={"ID":"247e948b-3c17-4675-bd1c-f894b02d2817","Type":"ContainerDied","Data":"f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158"} Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.716761 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:59Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.739751 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:59Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.758550 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:59Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.768579 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.768613 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.768624 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.768641 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.768652 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:09:59Z","lastTransitionTime":"2026-02-19T15:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.775157 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:59Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.793772 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:59Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.814375 4810 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:59Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.829705 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:59Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.844674 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:59Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.859192 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:59Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.873807 4810 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-flbx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:59Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.874055 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.874103 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.874115 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.874136 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.874153 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:09:59Z","lastTransitionTime":"2026-02-19T15:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.904741 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"image
ID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126
.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:59Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.918294 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:59Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.931384 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:59Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.945275 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:09:59Z is after 2025-08-24T17:21:41Z" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.977972 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.978480 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.978498 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.978527 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:09:59 crc kubenswrapper[4810]: I0219 15:09:59.978545 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:09:59Z","lastTransitionTime":"2026-02-19T15:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.082164 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.082257 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.082282 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.082679 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.082913 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:00Z","lastTransitionTime":"2026-02-19T15:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.185257 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.185296 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.185307 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.185344 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.185357 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:00Z","lastTransitionTime":"2026-02-19T15:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.288876 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.288973 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.288996 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.289031 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.289053 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:00Z","lastTransitionTime":"2026-02-19T15:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.376888 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 08:56:19.197994159 +0000 UTC Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.393769 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.393819 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.393836 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.393863 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.393882 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:00Z","lastTransitionTime":"2026-02-19T15:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.439571 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:10:00 crc kubenswrapper[4810]: E0219 15:10:00.439902 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.497118 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.497190 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.497212 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.497243 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.497261 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:00Z","lastTransitionTime":"2026-02-19T15:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.600553 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.600671 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.600690 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.600721 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.600744 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:00Z","lastTransitionTime":"2026-02-19T15:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.703258 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.703297 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.703308 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.703351 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.703364 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:00Z","lastTransitionTime":"2026-02-19T15:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.706123 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" event={"ID":"247e948b-3c17-4675-bd1c-f894b02d2817","Type":"ContainerStarted","Data":"301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba"} Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.716733 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" event={"ID":"c5a8a15c-53e8-4868-8feb-dcd4e83939a4","Type":"ContainerStarted","Data":"2f04be3b9d3a27ffadd2a55f3b4cea853da683af5f844ec4c3ad6cbbc548add7"} Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.717314 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.717357 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.717371 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.726248 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:00Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.742822 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:00Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.764612 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:00Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.769806 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.783580 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.788755 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:00Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.807410 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.807466 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.807479 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.807501 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.807521 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:00Z","lastTransitionTime":"2026-02-19T15:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.821375 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:00Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.837993 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:00Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.853367 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:00Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.867467 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:00Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.884600 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod 
\"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:00Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.912035 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.912093 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.912111 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.912138 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.912156 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:00Z","lastTransitionTime":"2026-02-19T15:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.918963 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:00Z 
is after 2025-08-24T17:21:41Z" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.937913 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:00Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.954986 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:00Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.972006 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:00Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:00 crc kubenswrapper[4810]: I0219 15:10:00.992143 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:00Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.015306 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.015387 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.015406 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.015430 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.015449 4810 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:01Z","lastTransitionTime":"2026-02-19T15:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.029728 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/op
enshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.046868 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.065809 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09
:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.082683 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.101371 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.116999 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod 
\"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.118781 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.118829 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.118843 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.118865 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.118881 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:01Z","lastTransitionTime":"2026-02-19T15:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.134671 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d
25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.152809 4810 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.154284 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-operator/pods/network-operator-58b4c7f79c-55gtf/status\": read tcp 38.102.83.162:42874->38.102.83.162:6443: use of closed network connection" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.188019 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.202295 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.221876 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.221914 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.221926 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.221943 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.221957 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:01Z","lastTransitionTime":"2026-02-19T15:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.227901 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f04be3b9d3a27ffadd2a55f3b4cea853da683af5f844ec4c3ad6cbbc548add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/r
un/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.241905 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.259923 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.282253 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.325046 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.325093 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.325104 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.325122 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.325133 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:01Z","lastTransitionTime":"2026-02-19T15:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.377346 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 13:24:38.218574451 +0000 UTC Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.428030 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.428098 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.428112 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.428136 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.428153 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:01Z","lastTransitionTime":"2026-02-19T15:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.438312 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.438391 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:10:01 crc kubenswrapper[4810]: E0219 15:10:01.438476 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:10:01 crc kubenswrapper[4810]: E0219 15:10:01.438676 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.464764 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.478834 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.498248 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.514616 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.531534 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.531601 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.531620 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.531646 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.531662 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:01Z","lastTransitionTime":"2026-02-19T15:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.532084 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.553263 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.587639 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.618463 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.634662 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.634712 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.634726 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.634749 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.634770 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:01Z","lastTransitionTime":"2026-02-19T15:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.643368 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa4
1ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.659738 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.680417 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f04be3b9d3a27ffadd2a55f3b4cea853da683af5f844ec4c3ad6cbbc548add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath
\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.697442 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.710438 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.723382 4810 generic.go:334] "Generic (PLEG): container finished" podID="247e948b-3c17-4675-bd1c-f894b02d2817" containerID="301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba" exitCode=0 Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.723478 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" event={"ID":"247e948b-3c17-4675-bd1c-f894b02d2817","Type":"ContainerDied","Data":"301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba"} Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.726637 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.737117 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.737186 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.737201 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.737224 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.737238 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:01Z","lastTransitionTime":"2026-02-19T15:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.750462 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.773855 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.794986 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.816190 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.837272 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.841113 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.841144 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.841154 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.841174 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.841188 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:01Z","lastTransitionTime":"2026-02-19T15:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.850677 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.866453 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.880084 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.894211 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.906388 4810 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-flbx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.930756 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f04be3b9d3a27ffadd2a55f3b4cea853da683af5f844ec4c3ad6cbbc548add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPat
h\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.945125 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.945177 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.945188 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.945209 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.945222 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:01Z","lastTransitionTime":"2026-02-19T15:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.950619 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.964531 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:01 crc kubenswrapper[4810]: I0219 15:10:01.981018 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:01Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.048010 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.048057 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.048068 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.048085 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.048098 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:02Z","lastTransitionTime":"2026-02-19T15:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.151790 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.151863 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.151881 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.151907 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.151927 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:02Z","lastTransitionTime":"2026-02-19T15:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.254961 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.255022 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.255038 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.255065 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.255083 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:02Z","lastTransitionTime":"2026-02-19T15:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.358407 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.358477 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.358492 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.358515 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.358531 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:02Z","lastTransitionTime":"2026-02-19T15:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.378404 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 09:20:48.312009102 +0000 UTC Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.438728 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:10:02 crc kubenswrapper[4810]: E0219 15:10:02.438886 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.462287 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.462341 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.462352 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.462370 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.462383 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:02Z","lastTransitionTime":"2026-02-19T15:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.565072 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.565154 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.565178 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.565229 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.565256 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:02Z","lastTransitionTime":"2026-02-19T15:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.668667 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.668739 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.668762 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.668798 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.668821 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:02Z","lastTransitionTime":"2026-02-19T15:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.736600 4810 generic.go:334] "Generic (PLEG): container finished" podID="247e948b-3c17-4675-bd1c-f894b02d2817" containerID="075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e" exitCode=0 Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.738526 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" event={"ID":"247e948b-3c17-4675-bd1c-f894b02d2817","Type":"ContainerDied","Data":"075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e"} Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.764552 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:02Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.772590 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.772638 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.772650 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.772669 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.772687 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:02Z","lastTransitionTime":"2026-02-19T15:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.797013 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f04be3b9d3a27ffadd2a55f3b4cea853da683af5f844ec4c3ad6cbbc548add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/r
un/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:02Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.814623 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:02Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.835389 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:02Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.851449 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:02Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.869380 4810 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:02Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.876622 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.876691 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.876718 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.876755 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.876785 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:02Z","lastTransitionTime":"2026-02-19T15:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.889705 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:02Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.905099 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:02Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.917954 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:02Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.935885 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:02Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.953743 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:02Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.962993 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:02Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.978602 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:02Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.980157 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.980231 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.980247 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.980269 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.980308 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:02Z","lastTransitionTime":"2026-02-19T15:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:02 crc kubenswrapper[4810]: I0219 15:10:02.994281 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:02Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.083923 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.083980 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.083988 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.084005 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.084013 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:03Z","lastTransitionTime":"2026-02-19T15:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.187200 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.187256 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.187275 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.187305 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.187359 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:03Z","lastTransitionTime":"2026-02-19T15:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.290173 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.290225 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.290241 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.290261 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.290277 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:03Z","lastTransitionTime":"2026-02-19T15:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.378832 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 08:23:56.409654138 +0000 UTC Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.392518 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.392554 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.392565 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.392582 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.392594 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:03Z","lastTransitionTime":"2026-02-19T15:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.445552 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:10:03 crc kubenswrapper[4810]: E0219 15:10:03.445679 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.445553 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:10:03 crc kubenswrapper[4810]: E0219 15:10:03.445810 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.495610 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.495656 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.495673 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.495699 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.495715 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:03Z","lastTransitionTime":"2026-02-19T15:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.599061 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.599109 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.599128 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.599153 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.599170 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:03Z","lastTransitionTime":"2026-02-19T15:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.702140 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.702194 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.702212 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.702235 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.702249 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:03Z","lastTransitionTime":"2026-02-19T15:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.747560 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" event={"ID":"247e948b-3c17-4675-bd1c-f894b02d2817","Type":"ContainerStarted","Data":"7171cb2bba77488fbb60ec533172b2240d1871e89bc7760ae5c9b67ee6924354"} Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.778866 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:03Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.798988 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:03Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.805606 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.805670 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.805690 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.805718 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.805739 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:03Z","lastTransitionTime":"2026-02-19T15:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.819057 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"
startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:03Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.844375 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-scri
pt-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f04be3b9d3a27ffadd2a55f3b4cea853da683af5f844ec4c3ad6cbbc548add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:03Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:03 crc 
kubenswrapper[4810]: I0219 15:10:03.865530 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:03Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.890886 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:03Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.908821 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.908871 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.908886 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.908908 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.908923 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:03Z","lastTransitionTime":"2026-02-19T15:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.908990 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\
\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:03Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.929382 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:03Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.948735 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:03Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.966188 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:03Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.985214 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7171cb2bba77488fbb60ec533172b2240d1871e89bc7760ae5c9b67ee6924354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:03Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:03 crc kubenswrapper[4810]: I0219 15:10:03.999490 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:03Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.011592 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.011645 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.011662 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.011686 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.011703 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:04Z","lastTransitionTime":"2026-02-19T15:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.017398 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:04Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.028378 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:04Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.114736 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.114780 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.114793 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.114811 4810 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.114823 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:04Z","lastTransitionTime":"2026-02-19T15:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.217171 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.217209 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.217219 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.217238 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.217250 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:04Z","lastTransitionTime":"2026-02-19T15:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.320217 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.320303 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.320342 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.320368 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.320394 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:04Z","lastTransitionTime":"2026-02-19T15:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.379971 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 18:58:51.861305569 +0000 UTC Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.423315 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.423376 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.423392 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.423425 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.423440 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:04Z","lastTransitionTime":"2026-02-19T15:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.439247 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:10:04 crc kubenswrapper[4810]: E0219 15:10:04.439402 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.531741 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.531924 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.532105 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.532950 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.533026 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:04Z","lastTransitionTime":"2026-02-19T15:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.636493 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.636576 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.636600 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.636635 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.636659 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:04Z","lastTransitionTime":"2026-02-19T15:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.739297 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.739418 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.739489 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.739513 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.739527 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:04Z","lastTransitionTime":"2026-02-19T15:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.753909 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8k7p5_c5a8a15c-53e8-4868-8feb-dcd4e83939a4/ovnkube-controller/0.log" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.757634 4810 generic.go:334] "Generic (PLEG): container finished" podID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerID="2f04be3b9d3a27ffadd2a55f3b4cea853da683af5f844ec4c3ad6cbbc548add7" exitCode=1 Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.757683 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" event={"ID":"c5a8a15c-53e8-4868-8feb-dcd4e83939a4","Type":"ContainerDied","Data":"2f04be3b9d3a27ffadd2a55f3b4cea853da683af5f844ec4c3ad6cbbc548add7"} Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.758828 4810 scope.go:117] "RemoveContainer" containerID="2f04be3b9d3a27ffadd2a55f3b4cea853da683af5f844ec4c3ad6cbbc548add7" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.787653 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:04Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.811532 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:04Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.828452 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:04Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.842773 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.842860 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.842882 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.843401 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.843700 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:04Z","lastTransitionTime":"2026-02-19T15:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.859698 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f04be3b9d3a27ffadd2a55f3b4cea853da683af5f844ec4c3ad6cbbc548add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f04be3b9d3a27ffadd2a55f3b4cea853da683af5f844ec4c3ad6cbbc548add7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:04Z\\\",\\\"message\\\":\\\" 15:10:04.039820 6046 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 15:10:04.040242 6046 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 15:10:04.040279 6046 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 15:10:04.040252 6046 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 15:10:04.040971 6046 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0219 15:10:04.040995 6046 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 15:10:04.041001 6046 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 15:10:04.041012 6046 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0219 15:10:04.041033 6046 factory.go:656] Stopping watch factory\\\\nI0219 15:10:04.041045 6046 ovnkube.go:599] Stopped ovnkube\\\\nI0219 15:10:04.041064 6046 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 15:10:04.041070 6046 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 15:10:04.041076 6046 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:04Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.875136 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba
8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:04Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.887932 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:04Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.906247 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:04Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.925004 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:04Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.945628 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:04Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.947014 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.947048 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.947061 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.947080 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.947093 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:04Z","lastTransitionTime":"2026-02-19T15:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.959881 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:04Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.974620 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7171cb2bba77488fbb60ec533172b2240d1871e89bc7760ae5c9b67ee6924354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:04Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:04 crc kubenswrapper[4810]: I0219 15:10:04.989812 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:04Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.004349 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:05Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.018300 4810 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:05Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.050076 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.050763 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.050964 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.051190 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.051453 4810 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:05Z","lastTransitionTime":"2026-02-19T15:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.155170 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.155236 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.155255 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.155287 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.155306 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:05Z","lastTransitionTime":"2026-02-19T15:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.257899 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.257976 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.257995 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.258026 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.258048 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:05Z","lastTransitionTime":"2026-02-19T15:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.361944 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.362037 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.362063 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.362095 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.362119 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:05Z","lastTransitionTime":"2026-02-19T15:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.381399 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 22:08:41.853977983 +0000 UTC Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.438546 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.438636 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:10:05 crc kubenswrapper[4810]: E0219 15:10:05.438745 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:10:05 crc kubenswrapper[4810]: E0219 15:10:05.438897 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.465053 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.465114 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.465137 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.465170 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.465192 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:05Z","lastTransitionTime":"2026-02-19T15:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.567996 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.568038 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.568050 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.568070 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.568086 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:05Z","lastTransitionTime":"2026-02-19T15:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.670806 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.670845 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.670857 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.670875 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.670892 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:05Z","lastTransitionTime":"2026-02-19T15:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.763303 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8k7p5_c5a8a15c-53e8-4868-8feb-dcd4e83939a4/ovnkube-controller/0.log" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.766473 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" event={"ID":"c5a8a15c-53e8-4868-8feb-dcd4e83939a4","Type":"ContainerStarted","Data":"86c2bd7e735fa6950f8384e98fc9f9b2c030892c44b19ca0f13fd0e4bad302e4"} Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.767043 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.778040 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.778088 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.778107 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.778127 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.778139 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:05Z","lastTransitionTime":"2026-02-19T15:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.789943 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:05Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.822896 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:05Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.837544 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:05Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.849699 4810 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:05Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.868680 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c2bd7e735fa6950f8384e98fc9f9b2c030892c44b19ca0f13fd0e4bad302e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f04be3b9d3a27ffadd2a55f3b4cea853da683af5f844ec4c3ad6cbbc548add7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:04Z\\\",\\\"message\\\":\\\" 15:10:04.039820 6046 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 15:10:04.040242 6046 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 15:10:04.040279 6046 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 15:10:04.040252 6046 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 15:10:04.040971 6046 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0219 15:10:04.040995 6046 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 15:10:04.041001 6046 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 15:10:04.041012 6046 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0219 15:10:04.041033 6046 factory.go:656] Stopping watch factory\\\\nI0219 15:10:04.041045 6046 ovnkube.go:599] Stopped ovnkube\\\\nI0219 15:10:04.041064 6046 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 15:10:04.041070 6046 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 15:10:04.041076 6046 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"cont
ainerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:05Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.879856 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:05Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.893556 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:05Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.909227 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.909286 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.909297 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.909317 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.909344 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:05Z","lastTransitionTime":"2026-02-19T15:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.911927 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa4
1ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:05Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.926929 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:05Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.941201 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:05Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.961556 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:05Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.975134 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:05Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:05 crc kubenswrapper[4810]: I0219 15:10:05.989003 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7171cb2bba77488fbb60ec533172b2240d1871e89bc7760ae5c9b67ee6924354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:05Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.005775 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.011730 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.011777 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.011788 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.011806 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.011818 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:06Z","lastTransitionTime":"2026-02-19T15:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.114836 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.114894 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.114905 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.114923 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.114935 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:06Z","lastTransitionTime":"2026-02-19T15:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.182257 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t"] Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.182830 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.186932 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.187166 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.201080 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.209990 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/06adf9a1-ec31-4acc-9864-41549913d3f4-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-l2g4t\" (UID: \"06adf9a1-ec31-4acc-9864-41549913d3f4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.210046 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/06adf9a1-ec31-4acc-9864-41549913d3f4-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-l2g4t\" (UID: \"06adf9a1-ec31-4acc-9864-41549913d3f4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.210093 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh6qr\" (UniqueName: \"kubernetes.io/projected/06adf9a1-ec31-4acc-9864-41549913d3f4-kube-api-access-nh6qr\") pod \"ovnkube-control-plane-749d76644c-l2g4t\" (UID: \"06adf9a1-ec31-4acc-9864-41549913d3f4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.210372 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/06adf9a1-ec31-4acc-9864-41549913d3f4-env-overrides\") pod \"ovnkube-control-plane-749d76644c-l2g4t\" (UID: \"06adf9a1-ec31-4acc-9864-41549913d3f4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.214447 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.217879 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.217905 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.217915 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.217933 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.217948 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:06Z","lastTransitionTime":"2026-02-19T15:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.237135 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7171cb2bba77488fbb60ec533172b2240d1871e89bc7760ae5c9b67ee6924354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.258529 4810 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.273752 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.284481 4810 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.299430 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06adf9a1-ec31-4acc-9864-41549913d3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l2g4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.311422 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/06adf9a1-ec31-4acc-9864-41549913d3f4-env-overrides\") pod \"ovnkube-control-plane-749d76644c-l2g4t\" (UID: \"06adf9a1-ec31-4acc-9864-41549913d3f4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.311499 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/06adf9a1-ec31-4acc-9864-41549913d3f4-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-l2g4t\" (UID: \"06adf9a1-ec31-4acc-9864-41549913d3f4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.311544 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/06adf9a1-ec31-4acc-9864-41549913d3f4-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-l2g4t\" (UID: \"06adf9a1-ec31-4acc-9864-41549913d3f4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.311565 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh6qr\" (UniqueName: \"kubernetes.io/projected/06adf9a1-ec31-4acc-9864-41549913d3f4-kube-api-access-nh6qr\") pod \"ovnkube-control-plane-749d76644c-l2g4t\" (UID: \"06adf9a1-ec31-4acc-9864-41549913d3f4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.312284 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/06adf9a1-ec31-4acc-9864-41549913d3f4-env-overrides\") pod \"ovnkube-control-plane-749d76644c-l2g4t\" (UID: \"06adf9a1-ec31-4acc-9864-41549913d3f4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.312913 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/06adf9a1-ec31-4acc-9864-41549913d3f4-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-l2g4t\" (UID: \"06adf9a1-ec31-4acc-9864-41549913d3f4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.315574 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.324960 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/06adf9a1-ec31-4acc-9864-41549913d3f4-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-l2g4t\" (UID: \"06adf9a1-ec31-4acc-9864-41549913d3f4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.325165 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.325189 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.325201 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.325223 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.325239 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:06Z","lastTransitionTime":"2026-02-19T15:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.332691 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.344077 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh6qr\" (UniqueName: \"kubernetes.io/projected/06adf9a1-ec31-4acc-9864-41549913d3f4-kube-api-access-nh6qr\") pod \"ovnkube-control-plane-749d76644c-l2g4t\" (UID: \"06adf9a1-ec31-4acc-9864-41549913d3f4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.349189 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.360395 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.379203 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c2bd7e735fa6950f8384e98fc9f9b2c030892c44b19ca0f13fd0e4bad302e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f04be3b9d3a27ffadd2a55f3b4cea853da683af5f844ec4c3ad6cbbc548add7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:04Z\\\",\\\"message\\\":\\\" 15:10:04.039820 6046 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 15:10:04.040242 6046 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 15:10:04.040279 6046 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 15:10:04.040252 6046 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 15:10:04.040971 6046 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0219 15:10:04.040995 6046 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 15:10:04.041001 6046 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 15:10:04.041012 6046 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0219 15:10:04.041033 6046 factory.go:656] Stopping watch factory\\\\nI0219 15:10:04.041045 6046 ovnkube.go:599] Stopped ovnkube\\\\nI0219 15:10:04.041064 6046 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 15:10:04.041070 6046 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 15:10:04.041076 6046 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"cont
ainerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.381636 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 23:53:20.096161678 +0000 UTC Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.396216 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.411486 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.423939 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.427922 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.427946 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.427956 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.427974 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.427984 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:06Z","lastTransitionTime":"2026-02-19T15:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.438396 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:10:06 crc kubenswrapper[4810]: E0219 15:10:06.438509 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.501725 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" Feb 19 15:10:06 crc kubenswrapper[4810]: W0219 15:10:06.518693 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06adf9a1_ec31_4acc_9864_41549913d3f4.slice/crio-c579edde8952a080f589ba913be0a30a6a895f762bee2bb74d70e89b7bea69cf WatchSource:0}: Error finding container c579edde8952a080f589ba913be0a30a6a895f762bee2bb74d70e89b7bea69cf: Status 404 returned error can't find the container with id c579edde8952a080f589ba913be0a30a6a895f762bee2bb74d70e89b7bea69cf Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.539127 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.539193 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.539218 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.539251 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.539273 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:06Z","lastTransitionTime":"2026-02-19T15:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.643418 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.643465 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.643476 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.643497 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.643509 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:06Z","lastTransitionTime":"2026-02-19T15:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.746955 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.747017 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.747036 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.747065 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.747085 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:06Z","lastTransitionTime":"2026-02-19T15:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.771617 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8k7p5_c5a8a15c-53e8-4868-8feb-dcd4e83939a4/ovnkube-controller/1.log" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.772232 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8k7p5_c5a8a15c-53e8-4868-8feb-dcd4e83939a4/ovnkube-controller/0.log" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.775851 4810 generic.go:334] "Generic (PLEG): container finished" podID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerID="86c2bd7e735fa6950f8384e98fc9f9b2c030892c44b19ca0f13fd0e4bad302e4" exitCode=1 Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.775910 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" event={"ID":"c5a8a15c-53e8-4868-8feb-dcd4e83939a4","Type":"ContainerDied","Data":"86c2bd7e735fa6950f8384e98fc9f9b2c030892c44b19ca0f13fd0e4bad302e4"} Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.776013 4810 scope.go:117] "RemoveContainer" containerID="2f04be3b9d3a27ffadd2a55f3b4cea853da683af5f844ec4c3ad6cbbc548add7" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.776777 4810 scope.go:117] "RemoveContainer" containerID="86c2bd7e735fa6950f8384e98fc9f9b2c030892c44b19ca0f13fd0e4bad302e4" Feb 19 15:10:06 crc kubenswrapper[4810]: E0219 15:10:06.777027 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-8k7p5_openshift-ovn-kubernetes(c5a8a15c-53e8-4868-8feb-dcd4e83939a4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.778275 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" event={"ID":"06adf9a1-ec31-4acc-9864-41549913d3f4","Type":"ContainerStarted","Data":"c579edde8952a080f589ba913be0a30a6a895f762bee2bb74d70e89b7bea69cf"} Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.793858 4810 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.812560 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06adf9a1-ec31-4acc-9864-41549913d3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l2g4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.833412 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.850711 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.850748 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.850768 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.850796 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.850816 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:06Z","lastTransitionTime":"2026-02-19T15:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.852297 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.870401 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.901567 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c2bd7e735fa6950f8384e98fc9f9b2c030892c44b19ca0f13fd0e4bad302e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f04be3b9d3a27ffadd2a55f3b4cea853da683af5f844ec4c3ad6cbbc548add7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:04Z\\\",\\\"message\\\":\\\" 15:10:04.039820 6046 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 15:10:04.040242 6046 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 15:10:04.040279 6046 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 15:10:04.040252 6046 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 15:10:04.040971 6046 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0219 15:10:04.040995 6046 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 15:10:04.041001 6046 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 15:10:04.041012 6046 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0219 15:10:04.041033 6046 factory.go:656] Stopping watch factory\\\\nI0219 15:10:04.041045 6046 ovnkube.go:599] Stopped ovnkube\\\\nI0219 15:10:04.041064 6046 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 15:10:04.041070 6046 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 15:10:04.041076 6046 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86c2bd7e735fa6950f8384e98fc9f9b2c030892c44b19ca0f13fd0e4bad302e4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"message\\\":\\\"ns:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 15:10:06.221628 6231 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0219 15:10:06.223394 6231 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 15:10:06.223450 6231 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.919290 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.936874 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.953164 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.953225 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.953241 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.953267 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.953286 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:06Z","lastTransitionTime":"2026-02-19T15:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.953470 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa4
1ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.970340 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:06 crc kubenswrapper[4810]: I0219 15:10:06.984489 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.000672 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:06Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.020921 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:07Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.049081 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7171cb2bba77488fbb60ec533172b2240d1871e89bc7760ae5c9b67ee6924354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:07Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.056660 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.056732 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:07 crc 
kubenswrapper[4810]: I0219 15:10:07.056752 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.056781 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.056807 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:07Z","lastTransitionTime":"2026-02-19T15:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.075886 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:07Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.160298 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.160382 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.160399 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.160424 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.160445 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:07Z","lastTransitionTime":"2026-02-19T15:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.220135 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:10:07 crc kubenswrapper[4810]: E0219 15:10:07.220401 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:10:23.220356824 +0000 UTC m=+52.702386978 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.264016 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.264082 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.264099 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.264122 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.264135 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:07Z","lastTransitionTime":"2026-02-19T15:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.320846 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.320927 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.320970 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.321005 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:10:07 crc kubenswrapper[4810]: E0219 15:10:07.321086 4810 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 15:10:07 crc kubenswrapper[4810]: E0219 15:10:07.321151 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 15:10:07 crc kubenswrapper[4810]: E0219 15:10:07.321174 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 15:10:07 crc kubenswrapper[4810]: E0219 15:10:07.321187 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 15:10:23.321158449 +0000 UTC m=+52.803188613 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 15:10:07 crc kubenswrapper[4810]: E0219 15:10:07.321198 4810 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:10:07 crc kubenswrapper[4810]: E0219 15:10:07.321235 4810 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 15:10:07 crc kubenswrapper[4810]: E0219 15:10:07.321375 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 15:10:07 crc kubenswrapper[4810]: E0219 15:10:07.321435 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 15:10:07 crc kubenswrapper[4810]: E0219 15:10:07.321455 4810 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:10:07 crc kubenswrapper[4810]: E0219 15:10:07.321257 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 15:10:23.321239691 +0000 UTC m=+52.803269845 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:10:07 crc kubenswrapper[4810]: E0219 15:10:07.321567 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 15:10:23.321540519 +0000 UTC m=+52.803570643 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 15:10:07 crc kubenswrapper[4810]: E0219 15:10:07.321583 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 15:10:23.32157641 +0000 UTC m=+52.803606534 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.367847 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.367894 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.367906 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.367925 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.367940 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:07Z","lastTransitionTime":"2026-02-19T15:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.382054 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 23:29:22.442629155 +0000 UTC Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.439086 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.439201 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:10:07 crc kubenswrapper[4810]: E0219 15:10:07.439462 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:10:07 crc kubenswrapper[4810]: E0219 15:10:07.439644 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.471093 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.471151 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.471169 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.471196 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.471215 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:07Z","lastTransitionTime":"2026-02-19T15:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.579143 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.579242 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.579269 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.579306 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.579379 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:07Z","lastTransitionTime":"2026-02-19T15:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.684505 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.684589 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.684610 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.684639 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.684661 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:07Z","lastTransitionTime":"2026-02-19T15:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.692812 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-2x9v9"] Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.694154 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:07 crc kubenswrapper[4810]: E0219 15:10:07.694362 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.715703 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:07Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.726273 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z7x5\" (UniqueName: \"kubernetes.io/projected/b72d3f7a-e418-4a21-af73-6a43ce3358c1-kube-api-access-7z7x5\") pod \"network-metrics-daemon-2x9v9\" (UID: \"b72d3f7a-e418-4a21-af73-6a43ce3358c1\") " pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.726406 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b72d3f7a-e418-4a21-af73-6a43ce3358c1-metrics-certs\") pod \"network-metrics-daemon-2x9v9\" (UID: \"b72d3f7a-e418-4a21-af73-6a43ce3358c1\") " pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.741619 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7171cb2bba77488fbb60ec533172b2240d1871e89bc7760ae5c9b67ee6924354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:07Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.763154 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2x9v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b72d3f7a-e418-4a21-af73-6a43ce3358c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2x9v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:07Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.787844 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:07Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.788241 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.788283 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.788300 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.788359 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.788392 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:07Z","lastTransitionTime":"2026-02-19T15:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.791037 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8k7p5_c5a8a15c-53e8-4868-8feb-dcd4e83939a4/ovnkube-controller/1.log" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.799004 4810 scope.go:117] "RemoveContainer" containerID="86c2bd7e735fa6950f8384e98fc9f9b2c030892c44b19ca0f13fd0e4bad302e4" Feb 19 15:10:07 crc kubenswrapper[4810]: E0219 15:10:07.799239 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-8k7p5_openshift-ovn-kubernetes(c5a8a15c-53e8-4868-8feb-dcd4e83939a4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.801388 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" event={"ID":"06adf9a1-ec31-4acc-9864-41549913d3f4","Type":"ContainerStarted","Data":"82d6389909245cbf6091dfb1ef53a497dd254228acd53c508c32c4345f86b7b4"} Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.801434 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" event={"ID":"06adf9a1-ec31-4acc-9864-41549913d3f4","Type":"ContainerStarted","Data":"618a71f428d132079ed8a6756ea331ec8fd20c32c98bc00691f56fa0b12669aa"} Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.807398 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:07Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.827226 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06adf9a1-ec31-4acc-9864-41549913d3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l2g4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:07Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.827579 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b72d3f7a-e418-4a21-af73-6a43ce3358c1-metrics-certs\") pod \"network-metrics-daemon-2x9v9\" (UID: \"b72d3f7a-e418-4a21-af73-6a43ce3358c1\") " pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.827774 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z7x5\" (UniqueName: \"kubernetes.io/projected/b72d3f7a-e418-4a21-af73-6a43ce3358c1-kube-api-access-7z7x5\") pod \"network-metrics-daemon-2x9v9\" (UID: \"b72d3f7a-e418-4a21-af73-6a43ce3358c1\") " pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:07 crc kubenswrapper[4810]: E0219 15:10:07.828082 4810 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Feb 19 15:10:07 crc kubenswrapper[4810]: E0219 15:10:07.828195 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b72d3f7a-e418-4a21-af73-6a43ce3358c1-metrics-certs podName:b72d3f7a-e418-4a21-af73-6a43ce3358c1 nodeName:}" failed. No retries permitted until 2026-02-19 15:10:08.328162368 +0000 UTC m=+37.810192502 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b72d3f7a-e418-4a21-af73-6a43ce3358c1-metrics-certs") pod "network-metrics-daemon-2x9v9" (UID: "b72d3f7a-e418-4a21-af73-6a43ce3358c1") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.848046 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:07Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.858964 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z7x5\" (UniqueName: \"kubernetes.io/projected/b72d3f7a-e418-4a21-af73-6a43ce3358c1-kube-api-access-7z7x5\") pod \"network-metrics-daemon-2x9v9\" (UID: \"b72d3f7a-e418-4a21-af73-6a43ce3358c1\") " pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.867227 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:07Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.884689 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:07Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.891497 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.891555 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.891577 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.891607 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.891629 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:07Z","lastTransitionTime":"2026-02-19T15:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.921589 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c2bd7e735fa6950f8384e98fc9f9b2c030892c44b19ca0f13fd0e4bad302e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f04be3b9d3a27ffadd2a55f3b4cea853da683af5f844ec4c3ad6cbbc548add7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:04Z\\\",\\\"message\\\":\\\" 15:10:04.039820 6046 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 15:10:04.040242 6046 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 15:10:04.040279 6046 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 15:10:04.040252 6046 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 15:10:04.040971 6046 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0219 15:10:04.040995 6046 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 15:10:04.041001 6046 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 15:10:04.041012 6046 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0219 15:10:04.041033 6046 factory.go:656] Stopping watch factory\\\\nI0219 15:10:04.041045 6046 ovnkube.go:599] Stopped ovnkube\\\\nI0219 15:10:04.041064 6046 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 15:10:04.041070 6046 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 15:10:04.041076 6046 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86c2bd7e735fa6950f8384e98fc9f9b2c030892c44b19ca0f13fd0e4bad302e4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"message\\\":\\\"ns:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 15:10:06.221628 6231 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0219 15:10:06.223394 6231 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 15:10:06.223450 6231 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:07Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.942437 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8
a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:07Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.962267 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:07Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.983086 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:07Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.996226 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.996318 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.996402 4810 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.996444 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:07 crc kubenswrapper[4810]: I0219 15:10:07.996468 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:07Z","lastTransitionTime":"2026-02-19T15:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.003972 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:08Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.023592 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:08Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.045872 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:08Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.082300 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:08Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.099825 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.099896 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.099913 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.099935 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.099948 4810 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:08Z","lastTransitionTime":"2026-02-19T15:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.108590 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:08Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.144638 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:08Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.165441 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:08Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.181988 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:08Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.204106 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.204181 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.204197 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.204225 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.204240 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:08Z","lastTransitionTime":"2026-02-19T15:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.209018 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7171cb2bba77488fbb60ec533172b2240d1871e89bc7760ae5c9b67ee6924354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:08Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.224432 4810 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-2x9v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b72d3f7a-e418-4a21-af73-6a43ce3358c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2x9v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:08Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.237366 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:08Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.249796 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:08Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.263133 4810 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:08Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.275959 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06adf9a1-ec31-4acc-9864-41549913d3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://618a71f428d132079ed8a6756ea331ec8fd20c32c98bc00691f56fa0b12669aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d6389909245cbf6091dfb1ef53a497dd254228acd53c508c32c4345f86b7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l2g4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:08Z is after 2025-08-24T17:21:41Z" Feb 19 
15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.293551 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:08Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.309215 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.309292 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.309316 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.309370 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.309425 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:08Z","lastTransitionTime":"2026-02-19T15:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.316006 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:08Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.332869 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b72d3f7a-e418-4a21-af73-6a43ce3358c1-metrics-certs\") pod \"network-metrics-daemon-2x9v9\" (UID: \"b72d3f7a-e418-4a21-af73-6a43ce3358c1\") " pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:08 crc kubenswrapper[4810]: E0219 15:10:08.333167 4810 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 15:10:08 crc kubenswrapper[4810]: E0219 15:10:08.333312 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b72d3f7a-e418-4a21-af73-6a43ce3358c1-metrics-certs podName:b72d3f7a-e418-4a21-af73-6a43ce3358c1 nodeName:}" failed. No retries permitted until 2026-02-19 15:10:09.333279172 +0000 UTC m=+38.815309516 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b72d3f7a-e418-4a21-af73-6a43ce3358c1-metrics-certs") pod "network-metrics-daemon-2x9v9" (UID: "b72d3f7a-e418-4a21-af73-6a43ce3358c1") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.334878 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-19T15:10:08Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.353026 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:08Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.382179 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c2bd7e735fa6950f8384e98fc9f9b2c030892c44b19ca0f13fd0e4bad302e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86c2bd7e735fa6950f8384e98fc9f9b2c030892c44b19ca0f13fd0e4bad302e4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"message\\\":\\\"ns:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 15:10:06.221628 6231 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0219 15:10:06.223394 6231 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 15:10:06.223450 6231 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8k7p5_openshift-ovn-kubernetes(c5a8a15c-53e8-4868-8feb-dcd4e83939a4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:08Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.382411 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 03:06:43.854130182 +0000 UTC Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.383659 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.383709 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.383726 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.383750 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.383763 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:08Z","lastTransitionTime":"2026-02-19T15:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:08 crc kubenswrapper[4810]: E0219 15:10:08.403459 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:08Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.408930 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.408963 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.408975 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.409001 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.409016 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:08Z","lastTransitionTime":"2026-02-19T15:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:08 crc kubenswrapper[4810]: E0219 15:10:08.427941 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:08Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.432422 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.432455 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.432467 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.432487 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.432499 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:08Z","lastTransitionTime":"2026-02-19T15:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.439132 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:10:08 crc kubenswrapper[4810]: E0219 15:10:08.439315 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:10:08 crc kubenswrapper[4810]: E0219 15:10:08.455268 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:08Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.461037 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.461140 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.461163 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.461195 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.461220 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:08Z","lastTransitionTime":"2026-02-19T15:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:08 crc kubenswrapper[4810]: E0219 15:10:08.484022 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:08Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.489471 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.489545 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.489570 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.489603 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.489630 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:08Z","lastTransitionTime":"2026-02-19T15:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:08 crc kubenswrapper[4810]: E0219 15:10:08.508593 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:08Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:08 crc kubenswrapper[4810]: E0219 15:10:08.508963 4810 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.511522 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.511580 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.511596 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.511622 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.511638 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:08Z","lastTransitionTime":"2026-02-19T15:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.615173 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.615246 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.615264 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.615297 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.615320 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:08Z","lastTransitionTime":"2026-02-19T15:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.719180 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.719285 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.719305 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.719414 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.719448 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:08Z","lastTransitionTime":"2026-02-19T15:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.822683 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.822745 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.822770 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.822798 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.822820 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:08Z","lastTransitionTime":"2026-02-19T15:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.926522 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.926607 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.926630 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.926660 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:08 crc kubenswrapper[4810]: I0219 15:10:08.926685 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:08Z","lastTransitionTime":"2026-02-19T15:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.029789 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.029870 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.029889 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.029916 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.029935 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:09Z","lastTransitionTime":"2026-02-19T15:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.133453 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.133510 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.133533 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.133562 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.133585 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:09Z","lastTransitionTime":"2026-02-19T15:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.237010 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.237051 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.237066 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.237087 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.237100 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:09Z","lastTransitionTime":"2026-02-19T15:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.340650 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.340712 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.340736 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.340757 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.340771 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:09Z","lastTransitionTime":"2026-02-19T15:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.343353 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b72d3f7a-e418-4a21-af73-6a43ce3358c1-metrics-certs\") pod \"network-metrics-daemon-2x9v9\" (UID: \"b72d3f7a-e418-4a21-af73-6a43ce3358c1\") " pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:09 crc kubenswrapper[4810]: E0219 15:10:09.343474 4810 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 15:10:09 crc kubenswrapper[4810]: E0219 15:10:09.343561 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b72d3f7a-e418-4a21-af73-6a43ce3358c1-metrics-certs podName:b72d3f7a-e418-4a21-af73-6a43ce3358c1 nodeName:}" failed. No retries permitted until 2026-02-19 15:10:11.343538949 +0000 UTC m=+40.825569083 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b72d3f7a-e418-4a21-af73-6a43ce3358c1-metrics-certs") pod "network-metrics-daemon-2x9v9" (UID: "b72d3f7a-e418-4a21-af73-6a43ce3358c1") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.382880 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 11:05:28.219219383 +0000 UTC Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.438953 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.439008 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.439046 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:10:09 crc kubenswrapper[4810]: E0219 15:10:09.439219 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1" Feb 19 15:10:09 crc kubenswrapper[4810]: E0219 15:10:09.439416 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:10:09 crc kubenswrapper[4810]: E0219 15:10:09.439541 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.444120 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.444164 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.444180 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.444205 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.444223 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:09Z","lastTransitionTime":"2026-02-19T15:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.550396 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.550480 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.550498 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.550530 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.550549 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:09Z","lastTransitionTime":"2026-02-19T15:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.654079 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.654158 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.654177 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.654210 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.654229 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:09Z","lastTransitionTime":"2026-02-19T15:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.757544 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.757603 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.757621 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.757648 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.757671 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:09Z","lastTransitionTime":"2026-02-19T15:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.861712 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.861772 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.861795 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.861826 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.861848 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:09Z","lastTransitionTime":"2026-02-19T15:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.964944 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.964989 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.964999 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.965017 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:09 crc kubenswrapper[4810]: I0219 15:10:09.965027 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:09Z","lastTransitionTime":"2026-02-19T15:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.068439 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.068897 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.069046 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.069196 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.069402 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:10Z","lastTransitionTime":"2026-02-19T15:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.172691 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.172751 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.172771 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.172799 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.172830 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:10Z","lastTransitionTime":"2026-02-19T15:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.276202 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.276269 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.276288 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.276318 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.276363 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:10Z","lastTransitionTime":"2026-02-19T15:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.380378 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.380458 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.380479 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.380507 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.380526 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:10Z","lastTransitionTime":"2026-02-19T15:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.383436 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 18:09:28.197529938 +0000 UTC Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.438257 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:10:10 crc kubenswrapper[4810]: E0219 15:10:10.438453 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.439269 4810 scope.go:117] "RemoveContainer" containerID="640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.483545 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.483620 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.483646 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.483679 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.483703 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:10Z","lastTransitionTime":"2026-02-19T15:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.588140 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.588232 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.588254 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.588290 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.588311 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:10Z","lastTransitionTime":"2026-02-19T15:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.691732 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.692004 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.692018 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.692039 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.692052 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:10Z","lastTransitionTime":"2026-02-19T15:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.795252 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.795304 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.795321 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.795378 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.795408 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:10Z","lastTransitionTime":"2026-02-19T15:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.816316 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.819359 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ef740fa90b279363e303fb824188d0a26097b06aa8eaa488c0c800a8caa8f10e"} Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.898281 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.898364 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.898382 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.898410 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:10 crc kubenswrapper[4810]: I0219 15:10:10.898427 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:10Z","lastTransitionTime":"2026-02-19T15:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.001784 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.001857 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.001880 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.001912 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.001934 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:11Z","lastTransitionTime":"2026-02-19T15:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.106418 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.106486 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.106511 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.106542 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.106560 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:11Z","lastTransitionTime":"2026-02-19T15:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.210187 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.210262 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.210281 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.210309 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.210353 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:11Z","lastTransitionTime":"2026-02-19T15:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.314308 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.314404 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.314422 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.314450 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.314474 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:11Z","lastTransitionTime":"2026-02-19T15:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.368647 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b72d3f7a-e418-4a21-af73-6a43ce3358c1-metrics-certs\") pod \"network-metrics-daemon-2x9v9\" (UID: \"b72d3f7a-e418-4a21-af73-6a43ce3358c1\") " pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:11 crc kubenswrapper[4810]: E0219 15:10:11.368882 4810 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 15:10:11 crc kubenswrapper[4810]: E0219 15:10:11.369276 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b72d3f7a-e418-4a21-af73-6a43ce3358c1-metrics-certs podName:b72d3f7a-e418-4a21-af73-6a43ce3358c1 nodeName:}" failed. No retries permitted until 2026-02-19 15:10:15.369247972 +0000 UTC m=+44.851278116 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b72d3f7a-e418-4a21-af73-6a43ce3358c1-metrics-certs") pod "network-metrics-daemon-2x9v9" (UID: "b72d3f7a-e418-4a21-af73-6a43ce3358c1") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.384563 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 04:21:49.134226845 +0000 UTC Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.417066 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.417150 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.417170 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.417197 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.417218 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:11Z","lastTransitionTime":"2026-02-19T15:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.438573 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.438614 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.438684 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:10:11 crc kubenswrapper[4810]: E0219 15:10:11.439543 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1" Feb 19 15:10:11 crc kubenswrapper[4810]: E0219 15:10:11.439634 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:10:11 crc kubenswrapper[4810]: E0219 15:10:11.439316 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.468251 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:11Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.488635 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:11Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.508299 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:11Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.520970 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.521019 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.521038 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.521063 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.521086 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:11Z","lastTransitionTime":"2026-02-19T15:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.530661 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:11Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.550964 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:11Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.578981 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7171cb2bba77488fbb60ec533172b2240d1871e89bc7760ae5c9b67ee6924354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:11Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.595322 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2x9v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b72d3f7a-e418-4a21-af73-6a43ce3358c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2x9v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:11Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.612628 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:11Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.624643 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.624716 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.624740 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.624770 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.624792 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:11Z","lastTransitionTime":"2026-02-19T15:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.633300 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:11Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.650875 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:11Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.672954 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06adf9a1-ec31-4acc-9864-41549913d3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://618a71f428d132079ed8a6756ea331ec8fd20c32c98bc00691f56fa0b12669aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d6389909245cbf6091dfb1ef53a497dd254228acd53c508c32c4345f86b7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l2g4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:11Z is after 2025-08-24T17:21:41Z" Feb 19 
15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.697430 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:11Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.713793 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:11Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.727150 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.727186 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.727197 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.727216 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.727228 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:11Z","lastTransitionTime":"2026-02-19T15:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.731404 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:11Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.744703 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:11Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.767907 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c2bd7e735fa6950f8384e98fc9f9b2c030892c44b19ca0f13fd0e4bad302e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86c2bd7e735fa6950f8384e98fc9f9b2c030892c44b19ca0f13fd0e4bad302e4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"message\\\":\\\"ns:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 15:10:06.221628 6231 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0219 15:10:06.223394 6231 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 15:10:06.223450 6231 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8k7p5_openshift-ovn-kubernetes(c5a8a15c-53e8-4868-8feb-dcd4e83939a4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:11Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.829780 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.829845 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.829863 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.829894 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.829913 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:11Z","lastTransitionTime":"2026-02-19T15:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.842229 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:11Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.854703 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:11Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.876763 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c2bd7e735fa6950f8384e98fc9f9b2c030892c44b19ca0f13fd0e4bad302e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86c2bd7e735fa6950f8384e98fc9f9b2c030892c44b19ca0f13fd0e4bad302e4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"message\\\":\\\"ns:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 15:10:06.221628 6231 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0219 15:10:06.223394 6231 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 15:10:06.223450 6231 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8k7p5_openshift-ovn-kubernetes(c5a8a15c-53e8-4868-8feb-dcd4e83939a4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:11Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.892248 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:11Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.911594 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:11Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.928277 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:11Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.936103 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.936160 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.936179 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.936214 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.936228 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:11Z","lastTransitionTime":"2026-02-19T15:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.954574 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:11Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.973565 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:11Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:11 crc kubenswrapper[4810]: I0219 15:10:11.991129 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef740fa90b279363e303fb824188d0a26097b06aa8eaa488c0c800a8caa8f10e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:11Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.009931 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:12Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.033107 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7171cb2bba77488fbb60ec533172b2240d1871e89bc7760ae5c9b67ee6924354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:12Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.039715 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.039771 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:12 crc 
kubenswrapper[4810]: I0219 15:10:12.039788 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.039815 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.039835 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:12Z","lastTransitionTime":"2026-02-19T15:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.054286 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2x9v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b72d3f7a-e418-4a21-af73-6a43ce3358c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2x9v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:12Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.072752 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:12Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.088609 4810 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:12Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.110070 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06adf9a1-ec31-4acc-9864-41549913d3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://618a71f428d132079ed8a6756ea331ec8fd20c32c98bc00691f56fa0b12669aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d6389909245cbf6091dfb1ef53a497dd254228acd53c508c32c4345f86b7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l2g4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:12Z is after 2025-08-24T17:21:41Z" Feb 19 
15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.126843 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:12Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.145354 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.145814 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.145889 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.146001 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.146071 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:12Z","lastTransitionTime":"2026-02-19T15:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.250095 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.250154 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.250163 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.250188 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.250200 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:12Z","lastTransitionTime":"2026-02-19T15:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.353285 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.353376 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.353395 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.353424 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.353444 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:12Z","lastTransitionTime":"2026-02-19T15:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.385537 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 02:38:58.357487677 +0000 UTC Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.439056 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:10:12 crc kubenswrapper[4810]: E0219 15:10:12.439590 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.457304 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.457409 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.457422 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.457444 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.457459 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:12Z","lastTransitionTime":"2026-02-19T15:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.561513 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.561574 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.561589 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.561613 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.561632 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:12Z","lastTransitionTime":"2026-02-19T15:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.666188 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.666267 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.666289 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.666358 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.666384 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:12Z","lastTransitionTime":"2026-02-19T15:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.770295 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.770402 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.770428 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.770459 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.770480 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:12Z","lastTransitionTime":"2026-02-19T15:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.874046 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.874116 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.874226 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.874283 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.874310 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:12Z","lastTransitionTime":"2026-02-19T15:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.977430 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.977504 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.977524 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.977554 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:12 crc kubenswrapper[4810]: I0219 15:10:12.977573 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:12Z","lastTransitionTime":"2026-02-19T15:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.080886 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.080957 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.080981 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.081009 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.081031 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:13Z","lastTransitionTime":"2026-02-19T15:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.185236 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.185363 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.185390 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.185424 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.185447 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:13Z","lastTransitionTime":"2026-02-19T15:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.290995 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.291064 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.291082 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.291110 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.291130 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:13Z","lastTransitionTime":"2026-02-19T15:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.386694 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 03:08:54.440407628 +0000 UTC Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.395040 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.395111 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.395126 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.395148 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.395163 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:13Z","lastTransitionTime":"2026-02-19T15:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.439668 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.439830 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:13 crc kubenswrapper[4810]: E0219 15:10:13.439870 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.439882 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:10:13 crc kubenswrapper[4810]: E0219 15:10:13.440176 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1" Feb 19 15:10:13 crc kubenswrapper[4810]: E0219 15:10:13.440426 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.497607 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.497668 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.497683 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.497710 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.497726 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:13Z","lastTransitionTime":"2026-02-19T15:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.600475 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.600534 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.600547 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.600569 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.600581 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:13Z","lastTransitionTime":"2026-02-19T15:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.703126 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.703190 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.703213 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.703240 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.703284 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:13Z","lastTransitionTime":"2026-02-19T15:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.807305 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.807437 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.807465 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.807505 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.807532 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:13Z","lastTransitionTime":"2026-02-19T15:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.910837 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.910977 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.911018 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.911061 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:13 crc kubenswrapper[4810]: I0219 15:10:13.911088 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:13Z","lastTransitionTime":"2026-02-19T15:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.014924 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.015367 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.015576 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.015845 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.016003 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:14Z","lastTransitionTime":"2026-02-19T15:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.119949 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.120019 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.120042 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.120075 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.120097 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:14Z","lastTransitionTime":"2026-02-19T15:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.223011 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.223051 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.223063 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.223081 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.223093 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:14Z","lastTransitionTime":"2026-02-19T15:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.325853 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.325896 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.325908 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.325927 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.325938 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:14Z","lastTransitionTime":"2026-02-19T15:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.388240 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 11:01:39.702338114 +0000 UTC Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.429706 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.429758 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.429769 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.429789 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.429830 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:14Z","lastTransitionTime":"2026-02-19T15:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.438573 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:10:14 crc kubenswrapper[4810]: E0219 15:10:14.438732 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.532133 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.532183 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.532195 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.532221 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.532239 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:14Z","lastTransitionTime":"2026-02-19T15:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.635969 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.636062 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.636080 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.636108 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.636126 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:14Z","lastTransitionTime":"2026-02-19T15:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.739451 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.739527 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.739545 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.739580 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.739601 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:14Z","lastTransitionTime":"2026-02-19T15:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.843093 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.843160 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.843178 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.843206 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.843221 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:14Z","lastTransitionTime":"2026-02-19T15:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.947614 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.947697 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.947715 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.947751 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:14 crc kubenswrapper[4810]: I0219 15:10:14.947770 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:14Z","lastTransitionTime":"2026-02-19T15:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.050952 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.051028 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.051044 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.051069 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.051088 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:15Z","lastTransitionTime":"2026-02-19T15:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.085189 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.154561 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.154714 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.154739 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.154771 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.154791 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:15Z","lastTransitionTime":"2026-02-19T15:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.257831 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.257887 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.257903 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.257923 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.257937 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:15Z","lastTransitionTime":"2026-02-19T15:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.361924 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.361997 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.362011 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.362033 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.362048 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:15Z","lastTransitionTime":"2026-02-19T15:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.389512 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 12:23:44.427192891 +0000 UTC Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.427462 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b72d3f7a-e418-4a21-af73-6a43ce3358c1-metrics-certs\") pod \"network-metrics-daemon-2x9v9\" (UID: \"b72d3f7a-e418-4a21-af73-6a43ce3358c1\") " pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:15 crc kubenswrapper[4810]: E0219 15:10:15.427764 4810 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 15:10:15 crc kubenswrapper[4810]: E0219 15:10:15.427913 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b72d3f7a-e418-4a21-af73-6a43ce3358c1-metrics-certs podName:b72d3f7a-e418-4a21-af73-6a43ce3358c1 nodeName:}" failed. No retries permitted until 2026-02-19 15:10:23.427877948 +0000 UTC m=+52.909908112 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b72d3f7a-e418-4a21-af73-6a43ce3358c1-metrics-certs") pod "network-metrics-daemon-2x9v9" (UID: "b72d3f7a-e418-4a21-af73-6a43ce3358c1") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.438532 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.438662 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.438655 4810 util.go:30] "No sandbox for pod can be found. 
Feb 19 15:10:15 crc kubenswrapper[4810]: E0219 15:10:15.438882 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1"
Feb 19 15:10:15 crc kubenswrapper[4810]: E0219 15:10:15.439073 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 15:10:15 crc kubenswrapper[4810]: E0219 15:10:15.439643 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.465829 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.465943 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.466016 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.466059 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.466101 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:15Z","lastTransitionTime":"2026-02-19T15:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.569307 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.569385 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.569400 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.569423 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.569440 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:15Z","lastTransitionTime":"2026-02-19T15:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.672588 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.672656 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.672867 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.672901 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.672924 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:15Z","lastTransitionTime":"2026-02-19T15:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.775796 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.775863 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.775883 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.775910 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
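Each setters.go:603 entry above serializes a core/v1 NodeCondition. A minimal sketch (illustrative, not kubelet's own code path) that reconstructs the exact condition={...} payload being logged, using the published k8s.io/api types:

// condition_sketch.go, a minimal sketch of the condition object that
// setters.go:603 logs above; illustrative only.
package main

import (
	"encoding/json"
	"fmt"
	"time"

	v1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
)

func main() {
	// timestamp taken from the log entries above
	now := metav1.NewTime(time.Date(2026, 2, 19, 15, 10, 15, 0, time.UTC))
	cond := v1.NodeCondition{
		Type:               v1.NodeReady,
		Status:             v1.ConditionFalse, // node is NotReady
		LastHeartbeatTime:  now,
		LastTransitionTime: now,
		Reason:             "KubeletNotReady",
		Message: "container runtime network not ready: NetworkReady=false " +
			"reason:NetworkPluginNotReady message:Network plugin returns error: " +
			"no CNI configuration file in /etc/kubernetes/cni/net.d/. " +
			"Has your network provider started?",
	}
	b, _ := json.Marshal(cond)
	fmt.Println(string(b)) // matches the condition={...} payload logged above
}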
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.775927 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:15Z","lastTransitionTime":"2026-02-19T15:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.878780 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.878836 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.878850 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.878877 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.878897 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:15Z","lastTransitionTime":"2026-02-19T15:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.981869 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.981923 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.981940 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.981966 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:15 crc kubenswrapper[4810]: I0219 15:10:15.981983 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:15Z","lastTransitionTime":"2026-02-19T15:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.085508 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.085552 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.085585 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.085607 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.085623 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:16Z","lastTransitionTime":"2026-02-19T15:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.188585 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.188639 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.188655 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.188677 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.188691 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:16Z","lastTransitionTime":"2026-02-19T15:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.291583 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.291633 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.291646 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.291666 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.291679 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:16Z","lastTransitionTime":"2026-02-19T15:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.390315 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 23:55:05.858331616 +0000 UTC
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.394499 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.394581 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.394609 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.394645 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.394674 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:16Z","lastTransitionTime":"2026-02-19T15:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.439024 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 15:10:16 crc kubenswrapper[4810]: E0219 15:10:16.439257 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.497813 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.497875 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.497890 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.497911 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
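Note that the certificate_manager.go:356 entries report the same expiration every time but a different rotation deadline on each line: the deadline is re-randomized on every evaluation. A minimal sketch of that behavior, assuming client-go's jitter of roughly 70-90% of the certificate lifetime (the notBefore value below is invented for illustration):

// rotation_sketch.go, a minimal sketch of why the "rotation deadline"
// above changes per entry while the expiration stays fixed; assumed
// semantics, not the actual client-go certificate manager code.
package main

import (
	"fmt"
	"math/rand"
	"time"
)

// nextRotationDeadline picks a random point in roughly the last 10-30%
// of the certificate lifetime (i.e. 70-90% of the way to expiry).
func nextRotationDeadline(notBefore, notAfter time.Time) time.Time {
	lifetime := notAfter.Sub(notBefore)
	jittered := time.Duration(float64(lifetime) * (0.7 + 0.2*rand.Float64()))
	return notBefore.Add(jittered)
}

func main() {
	notAfter := time.Date(2026, 2, 24, 5, 53, 3, 0, time.UTC) // expiration from the log
	notBefore := notAfter.AddDate(-1, 0, 0)                   // assumed issue date
	for i := 0; i < 3; i++ {
		// each call yields a different deadline, as in the entries above
		fmt.Println("rotation deadline:", nextRotationDeadline(notBefore, notAfter))
	}
}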
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.497927 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:16Z","lastTransitionTime":"2026-02-19T15:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.600253 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.600376 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.600404 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.600439 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.600471 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:16Z","lastTransitionTime":"2026-02-19T15:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.703549 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.703587 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.703633 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.703651 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.703664 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:16Z","lastTransitionTime":"2026-02-19T15:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.806754 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.806809 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.806820 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.806844 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.806858 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:16Z","lastTransitionTime":"2026-02-19T15:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.910296 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.910418 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.910427 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.910451 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:16 crc kubenswrapper[4810]: I0219 15:10:16.910462 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:16Z","lastTransitionTime":"2026-02-19T15:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.013639 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.013725 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.013749 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.013787 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.013812 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:17Z","lastTransitionTime":"2026-02-19T15:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.116875 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.116924 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.116935 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.116953 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.116963 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:17Z","lastTransitionTime":"2026-02-19T15:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.219518 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.219570 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.219579 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.219596 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.219608 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:17Z","lastTransitionTime":"2026-02-19T15:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.322294 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.322351 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.322360 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.322378 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.322389 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:17Z","lastTransitionTime":"2026-02-19T15:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.391014 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 20:58:27.420958112 +0000 UTC
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.425513 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.425918 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.426018 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.426121 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.426205 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:17Z","lastTransitionTime":"2026-02-19T15:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.438960 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.439025 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.438960 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 15:10:17 crc kubenswrapper[4810]: E0219 15:10:17.439187 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1"
Feb 19 15:10:17 crc kubenswrapper[4810]: E0219 15:10:17.439276 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
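Every "no CNI configuration file in /etc/kubernetes/cni/net.d/" message above comes down to the container runtime finding no network config in its conf directory; the node stays NotReady until the network provider drops one in. A minimal sketch of that readiness check (illustrative; not the actual CRI-O/libcni code), scanning for the usual *.conf/*.conflist/*.json candidates:

// cniconf_sketch.go, a minimal sketch of the check behind the repeated
// "no CNI configuration file" message above; illustrative only.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	confDir := "/etc/kubernetes/cni/net.d" // directory named in the log entries
	var files []string
	for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
		m, err := filepath.Glob(filepath.Join(confDir, pat))
		if err != nil {
			fmt.Fprintln(os.Stderr, err)
			os.Exit(1)
		}
		files = append(files, m...)
	}
	if len(files) == 0 {
		// the network stays NotReady until a config file appears
		fmt.Printf("no CNI configuration file in %s. Has your network provider started?\n", confDir)
		return
	}
	fmt.Println("CNI configs:", files)
}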
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.529753 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.529829 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.529843 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.529867 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.529885 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:17Z","lastTransitionTime":"2026-02-19T15:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.632721 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.632788 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.632809 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.632842 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.632868 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:17Z","lastTransitionTime":"2026-02-19T15:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.736835 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.736918 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.736940 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.736974 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.736997 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:17Z","lastTransitionTime":"2026-02-19T15:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.840585 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.840636 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.840655 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.840679 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.840695 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:17Z","lastTransitionTime":"2026-02-19T15:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.944481 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.944570 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.944595 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.944630 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.944654 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:17Z","lastTransitionTime":"2026-02-19T15:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
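The "Failed to update status for pod" entries recorded just below all fail the same way: the network-node-identity webhook's serving certificate is expired, so Go's x509 verification rejects the TLS handshake before the status patch can be posted. A minimal sketch of the validity-window check behind that error, using the times from the log (the notBefore value is assumed):

// certcheck_sketch.go, a minimal sketch of the validity check that fails
// in the "failed to verify certificate" webhook errors below; Go's x509
// verifier rejects a certificate once the current time falls outside its
// NotBefore/NotAfter window. Illustrative only.
package main

import (
	"fmt"
	"time"
)

func checkValidity(now, notBefore, notAfter time.Time) error {
	if now.Before(notBefore) {
		return fmt.Errorf("x509: certificate is not yet valid")
	}
	if now.After(notAfter) {
		return fmt.Errorf("x509: certificate has expired or is not yet valid: current time %s is after %s",
			now.Format(time.RFC3339), notAfter.Format(time.RFC3339))
	}
	return nil
}

func main() {
	now := time.Date(2026, 2, 19, 15, 10, 18, 0, time.UTC)      // from the log
	notAfter := time.Date(2025, 8, 24, 17, 21, 41, 0, time.UTC) // webhook cert NotAfter from the log
	notBefore := notAfter.AddDate(-1, 0, 0)                     // assumed issue date
	fmt.Println(checkValidity(now, notBefore, notAfter))
}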
Feb 19 15:10:17 crc kubenswrapper[4810]: I0219 15:10:17.944654 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:17Z","lastTransitionTime":"2026-02-19T15:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.048254 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.048314 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.048350 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.048378 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.048393 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:18Z","lastTransitionTime":"2026-02-19T15:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.151530 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.151584 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.151599 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.151622 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.151635 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:18Z","lastTransitionTime":"2026-02-19T15:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.254184 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.254234 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.254248 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.254267 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.254279 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:18Z","lastTransitionTime":"2026-02-19T15:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.358359 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.358421 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.358439 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.358468 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.358487 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:18Z","lastTransitionTime":"2026-02-19T15:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.368427 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.380263 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.391794 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef740fa90b279363e303fb824188d0a26097b06aa8eaa488c0c800a8caa8f10e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:18Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.392871 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 12:43:22.974988894 +0000 UTC Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.411538 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:18Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.435582 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7171cb2bba77488fbb60ec533172b2240d1871e89bc7760ae5c9b67ee6924354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:18Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.438544 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:10:18 crc kubenswrapper[4810]: E0219 15:10:18.438759 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.455533 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2x9v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b72d3f7a-e418-4a21-af73-6a43ce3358c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2x9v9\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:18Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.467734 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.467807 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.467840 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.467870 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.467889 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:18Z","lastTransitionTime":"2026-02-19T15:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.475126 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:18Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.492971 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:18Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.511443 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06adf9a1-ec31-4acc-9864-41549913d3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://618a71f428d132079ed8a6756ea331ec8fd20c32c98bc00691f56fa0b12669aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d6389909245cbf6091dfb1ef53a497dd254228acd53c508c32c4345f86b7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l2g4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:18Z is after 2025-08-24T17:21:41Z" Feb 19 
15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.531767 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:18Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.553231 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:18Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.570842 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:18Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.571723 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.571772 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.571788 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.571815 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.571835 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:18Z","lastTransitionTime":"2026-02-19T15:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.600668 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c2bd7e735fa6950f8384e98fc9f9b2c030892c44b19ca0f13fd0e4bad302e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86c2bd7e735fa6950f8384e98fc9f9b2c030892c44b19ca0f13fd0e4bad302e4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"message\\\":\\\"ns:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 15:10:06.221628 6231 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0219 15:10:06.223394 6231 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 15:10:06.223450 6231 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-8k7p5_openshift-ovn-kubernetes(c5a8a15c-53e8-4868-8feb-dcd4e83939a4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:18Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.621312 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:18Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.643766 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:18Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.665200 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\
":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:18Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.675227 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.675287 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.675305 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.675360 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.675380 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:18Z","lastTransitionTime":"2026-02-19T15:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.685561 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:18Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.707150 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:18Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.778447 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.778874 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.778973 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.779062 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.779144 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:18Z","lastTransitionTime":"2026-02-19T15:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.882688 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.882753 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.882772 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.882800 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.882820 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:18Z","lastTransitionTime":"2026-02-19T15:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.891225 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.891296 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.891320 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.891378 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.891395 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:18Z","lastTransitionTime":"2026-02-19T15:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.891395 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:18Z","lastTransitionTime":"2026-02-19T15:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:18 crc kubenswrapper[4810]: E0219 15:10:18.912367 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:18Z is after 2025-08-24T17:21:41Z"
Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.918411 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
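The webhook failure above is pure clock arithmetic: the node's current time (2026-02-19T15:10:18Z) falls outside the serving certificate's validity window, which ended 2025-08-24T17:21:41Z. A Go sketch that checks the same window for any PEM certificate; the file path is a placeholder, not taken from this cluster:

    package main

    import (
        "crypto/x509"
        "encoding/pem"
        "fmt"
        "os"
        "time"
    )

    func main() {
        // Path is an assumption; point it at the webhook's serving certificate.
        data, err := os.ReadFile("/path/to/webhook-serving-cert.pem")
        if err != nil {
            panic(err)
        }
        block, _ := pem.Decode(data)
        if block == nil {
            panic("no PEM block found")
        }
        cert, err := x509.ParseCertificate(block.Bytes)
        if err != nil {
            panic(err)
        }
        now := time.Now().UTC()
        fmt.Println("NotBefore:", cert.NotBefore, "NotAfter:", cert.NotAfter)
        if now.After(cert.NotAfter) {
            // This is the condition behind "certificate has expired or is not
            // yet valid: current time ... is after ..." in the log above.
            fmt.Println("expired: current time", now.Format(time.RFC3339), "is after", cert.NotAfter.Format(time.RFC3339))
        }
    }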
event="NodeHasNoDiskPressure" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.918959 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.919115 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.919258 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:18Z","lastTransitionTime":"2026-02-19T15:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:18 crc kubenswrapper[4810]: E0219 15:10:18.941712 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:18Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.946490 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.946537 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
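The same failure can be reproduced by posting to the webhook endpoint named in the log. On recent Go versions the expired-certificate cause can be unwrapped from the client's error chain; a sketch (the endpoint is loopback, so this only works from the host itself):

    package main

    import (
        "crypto/x509"
        "errors"
        "fmt"
        "net/http"
        "time"
    )

    func main() {
        client := &http.Client{Timeout: 10 * time.Second}
        // Same endpoint the kubelet's webhook client posts to in the log above.
        resp, err := client.Post("https://127.0.0.1:9743/node?timeout=10s", "application/json", nil)
        if err == nil {
            resp.Body.Close() // handshake succeeded; not the failure seen here
            return
        }
        var invalid x509.CertificateInvalidError
        if errors.As(err, &invalid) && invalid.Reason == x509.Expired {
            fmt.Println("webhook serving certificate expired:", invalid.Detail)
            return
        }
        fmt.Println("request error:", err)
    }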
event="NodeHasNoDiskPressure" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.946550 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.946569 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.946582 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:18Z","lastTransitionTime":"2026-02-19T15:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:18 crc kubenswrapper[4810]: E0219 15:10:18.966474 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:18Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.971993 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.972039 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
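The interleaving above is the kubelet's node-status sync loop: each sync attempts the patch a fixed number of times before giving up until the next period, which is what produces the burst of "will retry" entries with identical payloads. A bounded-retry sketch in that spirit; it is not the kubelet's actual code, and the attempt count of 5 is taken from the kubelet's nodeStatusUpdateRetry constant as an assumption about this build:

    package main

    import (
        "errors"
        "fmt"
    )

    // updateNodeStatus retries a status update a fixed number of times,
    // mirroring the retry pattern visible in the log entries above.
    func updateNodeStatus(attempts int, try func() error) error {
        var err error
        for i := 0; i < attempts; i++ {
            if err = try(); err == nil {
                return nil
            }
            fmt.Printf("Error updating node status, will retry: %v\n", err)
        }
        return fmt.Errorf("update node status exceeds retry count: %w", err)
    }

    func main() {
        err := updateNodeStatus(5, func() error {
            return errors.New("failed to patch status: webhook certificate expired")
        })
        fmt.Println(err)
    }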
event="NodeHasNoDiskPressure" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.972051 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.972070 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.972083 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:18Z","lastTransitionTime":"2026-02-19T15:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:18 crc kubenswrapper[4810]: E0219 15:10:18.991306 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:18Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.996915 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.996971 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
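The rejected patch also republishes capacity and allocatable; the deltas imply the node reserves 200m of CPU (12 cores capacity versus 11800m allocatable) and 460800Ki of memory (32865360Ki versus 32404560Ki). The reservation values here are inferred from the log numbers, not read from kubelet configuration; a sketch of the subtraction:

    package main

    import "fmt"

    func main() {
        // Numbers from the node patch above: allocatable = capacity - reserved.
        capacityCPUmilli := 12 * 1000
        reservedCPUmilli := 200 // yields the 11800m allocatable seen above
        capacityMemKi := 32865360
        reservedMemKi := 460800 // yields the 32404560Ki allocatable seen above
        fmt.Printf("allocatable cpu: %dm\n", capacityCPUmilli-reservedCPUmilli)
        fmt.Printf("allocatable memory: %dKi\n", capacityMemKi-reservedMemKi)
    }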
event="NodeHasNoDiskPressure" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.996990 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.997016 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:18 crc kubenswrapper[4810]: I0219 15:10:18.997039 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:18Z","lastTransitionTime":"2026-02-19T15:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:19 crc kubenswrapper[4810]: E0219 15:10:19.018503 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:19Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:19 crc kubenswrapper[4810]: E0219 15:10:19.018727 4810 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.020838 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.020878 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.020890 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.020909 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.020922 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:19Z","lastTransitionTime":"2026-02-19T15:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.124062 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.124131 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.124147 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.124175 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.124191 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:19Z","lastTransitionTime":"2026-02-19T15:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.227466 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.227502 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.227511 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.227526 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.227536 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:19Z","lastTransitionTime":"2026-02-19T15:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.330103 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.330148 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.330161 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.330181 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.330193 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:19Z","lastTransitionTime":"2026-02-19T15:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.393665 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 16:30:11.192771959 +0000 UTC Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.433862 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.434260 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.434383 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.434503 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.434606 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:19Z","lastTransitionTime":"2026-02-19T15:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.439540 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.439655 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.439548 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:10:19 crc kubenswrapper[4810]: E0219 15:10:19.439773 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:10:19 crc kubenswrapper[4810]: E0219 15:10:19.439913 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:10:19 crc kubenswrapper[4810]: E0219 15:10:19.440041 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.538511 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.538583 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.538602 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.538628 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.538649 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:19Z","lastTransitionTime":"2026-02-19T15:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.641692 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.641772 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.641799 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.641832 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.641854 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:19Z","lastTransitionTime":"2026-02-19T15:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.744374 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.744716 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.744798 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.744879 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.744947 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:19Z","lastTransitionTime":"2026-02-19T15:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.848193 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.848615 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.848832 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.849002 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.849146 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:19Z","lastTransitionTime":"2026-02-19T15:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.951629 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.951681 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.951700 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.951725 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:19 crc kubenswrapper[4810]: I0219 15:10:19.951744 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:19Z","lastTransitionTime":"2026-02-19T15:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.054919 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.054953 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.054962 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.055024 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.055034 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:20Z","lastTransitionTime":"2026-02-19T15:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.158435 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.158510 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.158536 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.158570 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.158588 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:20Z","lastTransitionTime":"2026-02-19T15:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.261887 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.261946 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.261965 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.261993 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.262013 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:20Z","lastTransitionTime":"2026-02-19T15:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.364515 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.364566 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.364582 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.364607 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.364623 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:20Z","lastTransitionTime":"2026-02-19T15:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.394299 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 11:22:37.417621204 +0000 UTC Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.438619 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:10:20 crc kubenswrapper[4810]: E0219 15:10:20.438788 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.467630 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.467698 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.467709 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.467731 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.467744 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:20Z","lastTransitionTime":"2026-02-19T15:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.571104 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.571174 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.571197 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.571228 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.571251 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:20Z","lastTransitionTime":"2026-02-19T15:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.674514 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.674613 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.674632 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.674733 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.674751 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:20Z","lastTransitionTime":"2026-02-19T15:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.778007 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.778064 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.778083 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.778109 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.778129 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:20Z","lastTransitionTime":"2026-02-19T15:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.880783 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.880857 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.880871 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.880892 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.880904 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:20Z","lastTransitionTime":"2026-02-19T15:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.983538 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.983578 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.983591 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.983610 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:20 crc kubenswrapper[4810]: I0219 15:10:20.983624 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:20Z","lastTransitionTime":"2026-02-19T15:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.086731 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.086872 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.086898 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.086928 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.086950 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:21Z","lastTransitionTime":"2026-02-19T15:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.190362 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.190446 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.190483 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.190519 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.190544 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:21Z","lastTransitionTime":"2026-02-19T15:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.294043 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.294103 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.294120 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.294144 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.294165 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:21Z","lastTransitionTime":"2026-02-19T15:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.395276 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 14:49:36.11668155 +0000 UTC Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.397307 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.397378 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.397391 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.397411 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.397429 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:21Z","lastTransitionTime":"2026-02-19T15:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.439188 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.439355 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.439451 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:10:21 crc kubenswrapper[4810]: E0219 15:10:21.439483 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1" Feb 19 15:10:21 crc kubenswrapper[4810]: E0219 15:10:21.439564 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:10:21 crc kubenswrapper[4810]: E0219 15:10:21.439632 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.456159 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef740fa90b279363e303fb824188d0a26097b06aa8eaa488c0c800a8caa8f10e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:21Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.475890 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:21Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.494995 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7171cb2bba77488fbb60ec533172b2240d1871e89bc7760ae5c9b67ee6924354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:21Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.503452 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.503543 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:21 crc 
kubenswrapper[4810]: I0219 15:10:21.503567 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.503595 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.503616 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:21Z","lastTransitionTime":"2026-02-19T15:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.507251 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2x9v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b72d3f7a-e418-4a21-af73-6a43ce3358c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2x9v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:21Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.519855 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:21Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.533803 4810 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:21Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.554065 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06adf9a1-ec31-4acc-9864-41549913d3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://618a71f428d132079ed8a6756ea331ec8fd20c32c98bc00691f56fa0b12669aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d6389909245cbf6091dfb1ef53a497dd254228acd53c508c32c4345f86b7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l2g4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:21Z is after 2025-08-24T17:21:41Z" Feb 19 
15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.573681 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac4287c9-a6d9-470d-820f-316c037a5d1a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8b81d39b5b02958a8f7fa7ec86ac9dd530f2c4674042f60b11209e0400433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://034d625f766dc7e2dd97fcedcfc0ca251971bc0be7e0d04ed4b2bcf3939905b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81860198cc657673df0021324ba4fde92aa6ffbde974993d0553a080872f7fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb662772f98f52da96cf3b347f7b4990f4daa4552da61d972ecb8875d74b01c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb662772f98f52da96cf3b347f7b4990f4daa4552da61d972ecb8875d74b01c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:21Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.596321 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:21Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.606455 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.606528 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.606544 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.606592 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.606610 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:21Z","lastTransitionTime":"2026-02-19T15:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.617996 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:21Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.632114 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:21Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.654361 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c2bd7e735fa6950f8384e98fc9f9b2c030892c44b19ca0f13fd0e4bad302e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86c2bd7e735fa6950f8384e98fc9f9b2c030892c44b19ca0f13fd0e4bad302e4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"message\\\":\\\"ns:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 15:10:06.221628 6231 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0219 15:10:06.223394 6231 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 15:10:06.223450 6231 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8k7p5_openshift-ovn-kubernetes(c5a8a15c-53e8-4868-8feb-dcd4e83939a4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:21Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.672747 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:21Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.688417 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:21Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.701951 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:21Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.709186 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.709244 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.709262 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.709291 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.709310 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:21Z","lastTransitionTime":"2026-02-19T15:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.717529 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:21Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.734602 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:21Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.811467 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.811515 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.811526 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.811546 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.811559 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:21Z","lastTransitionTime":"2026-02-19T15:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.914450 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.914519 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.914542 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.914566 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:21 crc kubenswrapper[4810]: I0219 15:10:21.914585 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:21Z","lastTransitionTime":"2026-02-19T15:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.017309 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.017363 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.017374 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.017392 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.017409 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:22Z","lastTransitionTime":"2026-02-19T15:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.120614 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.120677 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.120692 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.120714 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.120729 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:22Z","lastTransitionTime":"2026-02-19T15:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.223199 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.223563 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.223651 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.223745 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.223814 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:22Z","lastTransitionTime":"2026-02-19T15:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.327712 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.327784 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.327802 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.327826 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.327845 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:22Z","lastTransitionTime":"2026-02-19T15:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.395501 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 10:35:08.287432702 +0000 UTC Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.430663 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.430716 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.430734 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.430764 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.430783 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:22Z","lastTransitionTime":"2026-02-19T15:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.438595 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:10:22 crc kubenswrapper[4810]: E0219 15:10:22.438784 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.440228 4810 scope.go:117] "RemoveContainer" containerID="86c2bd7e735fa6950f8384e98fc9f9b2c030892c44b19ca0f13fd0e4bad302e4" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.532868 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.532924 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.532941 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.532968 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.532985 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:22Z","lastTransitionTime":"2026-02-19T15:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.636634 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.636716 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.636741 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.636779 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.636806 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:22Z","lastTransitionTime":"2026-02-19T15:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.740823 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.740879 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.740889 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.740912 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.740926 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:22Z","lastTransitionTime":"2026-02-19T15:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.845022 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.845082 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.845101 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.845135 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.845158 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:22Z","lastTransitionTime":"2026-02-19T15:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.869377 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8k7p5_c5a8a15c-53e8-4868-8feb-dcd4e83939a4/ovnkube-controller/1.log" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.873882 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" event={"ID":"c5a8a15c-53e8-4868-8feb-dcd4e83939a4","Type":"ContainerStarted","Data":"4bb8c0b89e0d91440beb93dc0fe78cebe4ca481940af3c8affe5efd75d589980"} Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.874707 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.890549 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2x9v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b72d3f7a-e418-4a21-af73-6a43ce3358c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2x9v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:22Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.913086 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef740fa90b279363e303fb824188d0a26097b06aa8eaa488c0c800a8caa8f10e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:22Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.929430 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:22Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.947853 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.947900 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.947912 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.947934 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.947948 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:22Z","lastTransitionTime":"2026-02-19T15:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.951933 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7171cb2bba77488fbb60ec533172b2240d1871e89bc7760ae5c9b67ee6924354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:22Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.970819 4810 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac4287c9-a6d9-470d-820f-316c037a5d1a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8b81d39b5b02958a8f7fa7ec86ac9dd530f2c4674042f60b11209e0400433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://034d625f766dc7e2dd97fcedcfc0ca251971bc0be7e0d04ed4b2bcf3939905b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81860198cc657673df0021324ba4fde92aa6ffbde974993d0553a080872f7fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://eeb662772f98f52da96cf3b347f7b4990f4daa4552da61d972ecb8875d74b01c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb662772f98f52da96cf3b347f7b4990f4daa4552da61d972ecb8875d74b01c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:22Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:22 crc kubenswrapper[4810]: I0219 15:10:22.987305 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:22Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.004427 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:22Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.031785 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod 
\"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:23Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.048199 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06adf9a1-ec31-4acc-9864-41549913d3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://618a71f428d132079ed8a6756ea331ec8fd20c32c98bc00691f56fa0b12669aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d6389909245cbf6091dfb1ef53a497dd254228acd53c508c32c4345f86b7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126
.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l2g4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:23Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.050101 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.050128 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.050139 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.050157 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.050168 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:23Z","lastTransitionTime":"2026-02-19T15:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.064590 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:23Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.087482 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:23Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.117466 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:23Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.135416 4810 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-flbx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:23Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.153058 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.153118 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.153136 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.153158 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.153174 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:23Z","lastTransitionTime":"2026-02-19T15:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.159773 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb8c0b89e0d91440beb93dc0fe78cebe4ca481940af3c8affe5efd75d589980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86c2bd7e735fa6950f8384e98fc9f9b2c030892c44b19ca0f13fd0e4bad302e4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"message\\\":\\\"ns:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 15:10:06.221628 6231 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0219 15:10:06.223394 6231 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 15:10:06.223450 6231 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:23Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.173544 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:23Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.186896 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:23Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.201241 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:23Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.224926 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:10:23 crc kubenswrapper[4810]: E0219 15:10:23.225222 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:10:55.22517756 +0000 UTC m=+84.707207684 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.256542 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.256602 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.256620 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.256651 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.256670 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:23Z","lastTransitionTime":"2026-02-19T15:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.326557 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.326635 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.326687 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.326721 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:10:23 crc kubenswrapper[4810]: E0219 15:10:23.326773 4810 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 15:10:23 crc kubenswrapper[4810]: E0219 15:10:23.326853 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 15:10:23 crc kubenswrapper[4810]: E0219 15:10:23.326879 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 15:10:23 crc kubenswrapper[4810]: E0219 15:10:23.326895 4810 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:10:23 crc kubenswrapper[4810]: E0219 15:10:23.326861 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 15:10:55.326835226 +0000 UTC m=+84.808865360 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 15:10:23 crc kubenswrapper[4810]: E0219 15:10:23.326967 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 15:10:55.326935329 +0000 UTC m=+84.808965463 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:10:23 crc kubenswrapper[4810]: E0219 15:10:23.326960 4810 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 15:10:23 crc kubenswrapper[4810]: E0219 15:10:23.327020 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 15:10:23 crc kubenswrapper[4810]: E0219 15:10:23.327060 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 15:10:23 crc kubenswrapper[4810]: E0219 15:10:23.327078 4810 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:10:23 crc kubenswrapper[4810]: E0219 15:10:23.327121 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 15:10:55.327094803 +0000 UTC m=+84.809124927 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 15:10:23 crc kubenswrapper[4810]: E0219 15:10:23.327153 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 15:10:55.327131424 +0000 UTC m=+84.809161548 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.359414 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.359472 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.359484 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.359508 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.359523 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:23Z","lastTransitionTime":"2026-02-19T15:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.396793 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 12:33:00.886298481 +0000 UTC Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.428299 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b72d3f7a-e418-4a21-af73-6a43ce3358c1-metrics-certs\") pod \"network-metrics-daemon-2x9v9\" (UID: \"b72d3f7a-e418-4a21-af73-6a43ce3358c1\") " pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:23 crc kubenswrapper[4810]: E0219 15:10:23.428515 4810 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 15:10:23 crc kubenswrapper[4810]: E0219 15:10:23.428647 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b72d3f7a-e418-4a21-af73-6a43ce3358c1-metrics-certs podName:b72d3f7a-e418-4a21-af73-6a43ce3358c1 nodeName:}" failed. No retries permitted until 2026-02-19 15:10:39.428615756 +0000 UTC m=+68.910645890 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b72d3f7a-e418-4a21-af73-6a43ce3358c1-metrics-certs") pod "network-metrics-daemon-2x9v9" (UID: "b72d3f7a-e418-4a21-af73-6a43ce3358c1") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.439014 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.439037 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.439188 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:10:23 crc kubenswrapper[4810]: E0219 15:10:23.439359 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1" Feb 19 15:10:23 crc kubenswrapper[4810]: E0219 15:10:23.439588 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:10:23 crc kubenswrapper[4810]: E0219 15:10:23.439742 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.462725 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.462791 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.462806 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.462828 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.462845 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:23Z","lastTransitionTime":"2026-02-19T15:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.565737 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.565791 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.565802 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.565823 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.565837 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:23Z","lastTransitionTime":"2026-02-19T15:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.669217 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.669259 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.669269 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.669286 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.669299 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:23Z","lastTransitionTime":"2026-02-19T15:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.771951 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.772015 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.772029 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.772052 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.772067 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:23Z","lastTransitionTime":"2026-02-19T15:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.874710 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.874766 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.874776 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.874797 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.874811 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:23Z","lastTransitionTime":"2026-02-19T15:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.878982 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8k7p5_c5a8a15c-53e8-4868-8feb-dcd4e83939a4/ovnkube-controller/2.log" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.879749 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8k7p5_c5a8a15c-53e8-4868-8feb-dcd4e83939a4/ovnkube-controller/1.log" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.882682 4810 generic.go:334] "Generic (PLEG): container finished" podID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerID="4bb8c0b89e0d91440beb93dc0fe78cebe4ca481940af3c8affe5efd75d589980" exitCode=1 Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.882726 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" event={"ID":"c5a8a15c-53e8-4868-8feb-dcd4e83939a4","Type":"ContainerDied","Data":"4bb8c0b89e0d91440beb93dc0fe78cebe4ca481940af3c8affe5efd75d589980"} Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.882796 4810 scope.go:117] "RemoveContainer" containerID="86c2bd7e735fa6950f8384e98fc9f9b2c030892c44b19ca0f13fd0e4bad302e4" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.884651 4810 scope.go:117] "RemoveContainer" containerID="4bb8c0b89e0d91440beb93dc0fe78cebe4ca481940af3c8affe5efd75d589980" Feb 19 15:10:23 crc kubenswrapper[4810]: E0219 15:10:23.887655 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8k7p5_openshift-ovn-kubernetes(c5a8a15c-53e8-4868-8feb-dcd4e83939a4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.905231 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac4287c9-a6d9-470d-820f-316c037a5d1a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8b81d39b5b02958a8f7fa7ec86ac9dd530f2c4674042f60b11209e0400433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://034d625f766dc7e2dd97fcedcfc0ca251971bc0be7e0d04ed4b2bcf3939905b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81860198cc657673df0021324ba4fde92aa6ffbde974993d0553a080872f7fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb662772f98f52da96cf3b347f7b4990f4daa4552da61d972ecb8875d74b01c\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb662772f98f52da96cf3b347f7b4990f4daa4552da61d972ecb8875d74b01c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:23Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.919737 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:23Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.930806 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:23Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.943975 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod 
\"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:23Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.957253 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06adf9a1-ec31-4acc-9864-41549913d3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://618a71f428d132079ed8a6756ea331ec8fd20c32c98bc00691f56fa0b12669aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d6389909245cbf6091dfb1ef53a497dd254228acd53c508c32c4345f86b7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126
.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l2g4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:23Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.972124 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf9
2edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:23Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.979082 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.979140 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.979152 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.979174 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.979186 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:23Z","lastTransitionTime":"2026-02-19T15:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:23 crc kubenswrapper[4810]: I0219 15:10:23.988638 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:23Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.007107 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:24Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.020431 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:24Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.047830 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb8c0b89e0d91440beb93dc0fe78cebe4ca481940af3c8affe5efd75d589980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86c2bd7e735fa6950f8384e98fc9f9b2c030892c44b19ca0f13fd0e4bad302e4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"message\\\":\\\"ns:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 15:10:06.221628 6231 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0219 15:10:06.223394 6231 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 15:10:06.223450 6231 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bb8c0b89e0d91440beb93dc0fe78cebe4ca481940af3c8affe5efd75d589980\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:23Z\\\",\\\"message\\\":\\\"responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 15:10:23.623215 6458 services_controller.go:443] Built service openshift-machine-config-operator/machine-config-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.16\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9001, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, 
externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0219 15:10:23.623235 6458 services_controller.go:444] Built service openshift-machine-config-operator/machine-config-controller LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0219 15:10:23.623244 6458 services_controller.go:445] Built service openshift-machine-config-operator/machine-config-controller LB template configs for network=default: []services.lbConfig(nil)\\\\nF0219 15:10:23.623259 6458 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{
\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:24Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.065940 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:24Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.081084 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.081122 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.081133 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.081151 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.081164 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:24Z","lastTransitionTime":"2026-02-19T15:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.082526 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:24Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.104811 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:24Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.125315 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef740fa90b279363e303fb824188d0a26097b06aa8eaa488c0c800a8caa8f10e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:24Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.147855 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:24Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.171588 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7171cb2bba77488fbb60ec533172b2240d1871e89bc7760ae5c9b67ee6924354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:24Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.184799 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.184866 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:24 crc 
kubenswrapper[4810]: I0219 15:10:24.184882 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.184915 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.184934 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:24Z","lastTransitionTime":"2026-02-19T15:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.187081 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2x9v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b72d3f7a-e418-4a21-af73-6a43ce3358c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2x9v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:24Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.288197 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.288249 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.288260 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.288282 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.288298 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:24Z","lastTransitionTime":"2026-02-19T15:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.392203 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.392256 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.392267 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.392289 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.392304 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:24Z","lastTransitionTime":"2026-02-19T15:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.397415 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 03:40:48.62759197 +0000 UTC Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.439272 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:10:24 crc kubenswrapper[4810]: E0219 15:10:24.439906 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.496055 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.496100 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.496111 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.496131 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.496145 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:24Z","lastTransitionTime":"2026-02-19T15:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.599494 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.599540 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.599549 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.599570 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.599582 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:24Z","lastTransitionTime":"2026-02-19T15:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.702762 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.702810 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.702818 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.702836 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.702846 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:24Z","lastTransitionTime":"2026-02-19T15:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.806299 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.806393 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.806408 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.806430 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.806442 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:24Z","lastTransitionTime":"2026-02-19T15:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.894085 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8k7p5_c5a8a15c-53e8-4868-8feb-dcd4e83939a4/ovnkube-controller/2.log" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.899109 4810 scope.go:117] "RemoveContainer" containerID="4bb8c0b89e0d91440beb93dc0fe78cebe4ca481940af3c8affe5efd75d589980" Feb 19 15:10:24 crc kubenswrapper[4810]: E0219 15:10:24.899293 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8k7p5_openshift-ovn-kubernetes(c5a8a15c-53e8-4868-8feb-dcd4e83939a4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.909995 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.910063 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.910084 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.910113 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.910130 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:24Z","lastTransitionTime":"2026-02-19T15:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.931268 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb8c0b89e0d91440beb93dc0fe78cebe4ca481940af3c8affe5efd75d589980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bb8c0b89e0d91440beb93dc0fe78cebe4ca481940af3c8affe5efd75d589980\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:23Z\\\",\\\"message\\\":\\\"responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 15:10:23.623215 6458 services_controller.go:443] Built service openshift-machine-config-operator/machine-config-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.16\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9001, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0219 15:10:23.623235 6458 services_controller.go:444] Built service openshift-machine-config-operator/machine-config-controller LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0219 15:10:23.623244 6458 services_controller.go:445] Built service openshift-machine-config-operator/machine-config-controller LB template configs for network=default: []services.lbConfig(nil)\\\\nF0219 15:10:23.623259 6458 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed 
container=ovnkube-controller pod=ovnkube-node-8k7p5_openshift-ovn-kubernetes(c5a8a15c-53e8-4868-8feb-dcd4e83939a4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:24Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.951731 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:24Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.974844 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:24Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:24 crc kubenswrapper[4810]: I0219 15:10:24.997004 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:24Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.013943 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.014031 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.014052 4810 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.014082 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.014099 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:25Z","lastTransitionTime":"2026-02-19T15:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.015005 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.034875 4810 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.055313 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.075755 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.091739 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.101706 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7171cb2bba77488fbb60ec533172b2240d1871e89bc7760ae5c9b67ee6924354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mount
Path\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.117258 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2x9v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b72d3f7a-e418-4a21-af73-6a43ce3358c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2x9v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 
15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.117761 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.117825 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.117845 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.118283 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.118384 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:25Z","lastTransitionTime":"2026-02-19T15:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.137657 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef740fa90b279363e303fb824188d0a26097b06aa8eaa488c0c800a8caa8f10e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.153077 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.169581 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06adf9a1-ec31-4acc-9864-41549913d3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://618a71f428d132079ed8a6756ea331ec8fd20c32c98bc00691f56fa0b12669aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d6389909245cbf6091dfb1ef53a497dd254228acd53c508c32c4345f86b7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l2g4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 
15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.183741 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac4287c9-a6d9-470d-820f-316c037a5d1a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8b81d39b5b02958a8f7fa7ec86ac9dd530f2c4674042f60b11209e0400433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://034d625f766dc7e2dd97fcedcfc0ca251971bc0be7e0d04ed4b2bcf3939905b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81860198cc657673df0021324ba4fde92aa6ffbde974993d0553a080872f7fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb662772f98f52da96cf3b347f7b4990f4daa4552da61d972ecb8875d74b01c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb662772f98f52da96cf3b347f7b4990f4daa4552da61d972ecb8875d74b01c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.198443 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.211475 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.221006 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.221050 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.221061 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.221081 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.221095 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:25Z","lastTransitionTime":"2026-02-19T15:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.224420 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.241274 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac4287c9-a6d9-470d-820f-316c037a5d1a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8b81d39b5b02958a8f7fa7ec86ac9dd530f2c4674042f60b11209e0400433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://034d625f766dc7e2dd97fcedcfc0ca251971bc0be7e0d04ed4b2bcf3939905b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81860198cc657673df0021324ba4fde92aa6ffbde974993d0553a080872f7fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb662772f98f52da96cf3b347f7b4990f4daa4552da61d972ecb8875d74b01c\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb662772f98f52da96cf3b347f7b4990f4daa4552da61d972ecb8875d74b01c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.257212 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.271185 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.286707 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod 
\"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.305385 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06adf9a1-ec31-4acc-9864-41549913d3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://618a71f428d132079ed8a6756ea331ec8fd20c32c98bc00691f56fa0b12669aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d6389909245cbf6091dfb1ef53a497dd254228acd53c508c32c4345f86b7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126
.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l2g4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.324165 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.324214 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.324225 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.324242 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.324252 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:25Z","lastTransitionTime":"2026-02-19T15:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.324798 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.346303 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.364028 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.375718 4810 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-flbx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.397850 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 21:41:47.072050736 +0000 UTC Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.403603 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb8c0b89e0d91440beb93dc0fe78cebe4ca481940af3c8affe5efd75d589980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bb8c0b89e0d91440beb93dc0fe78cebe4ca481940af3c8affe5efd75d589980\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:23Z\\\",\\\"message\\\":\\\"responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 15:10:23.623215 6458 services_controller.go:443] Built service openshift-machine-config-operator/machine-config-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.16\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9001, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0219 15:10:23.623235 6458 services_controller.go:444] Built service openshift-machine-config-operator/machine-config-controller LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0219 15:10:23.623244 6458 services_controller.go:445] Built service openshift-machine-config-operator/machine-config-controller LB template configs for network=default: []services.lbConfig(nil)\\\\nF0219 15:10:23.623259 6458 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8k7p5_openshift-ovn-kubernetes(c5a8a15c-53e8-4868-8feb-dcd4e83939a4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.417842 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.426622 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.426663 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.426673 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.426691 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.426705 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:25Z","lastTransitionTime":"2026-02-19T15:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.430696 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.438359 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.438368 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:10:25 crc kubenswrapper[4810]: E0219 15:10:25.438533 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.438386 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:10:25 crc kubenswrapper[4810]: E0219 15:10:25.438629 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:10:25 crc kubenswrapper[4810]: E0219 15:10:25.438726 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.444183 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.454755 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2x9v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b72d3f7a-e418-4a21-af73-6a43ce3358c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2x9v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.470247 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef740fa90b279363e303fb824188d0a26097b06aa8eaa488c0c800a8caa8f10e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.482802 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.498650 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7171cb2bba77488fbb60ec533172b2240d1871e89bc7760ae5c9b67ee6924354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:25Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.530228 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.530289 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:25 crc 
kubenswrapper[4810]: I0219 15:10:25.530307 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.530358 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.530377 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:25Z","lastTransitionTime":"2026-02-19T15:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.633926 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.633995 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.634013 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.634046 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.634064 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:25Z","lastTransitionTime":"2026-02-19T15:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.736811 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.736876 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.736894 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.736922 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.736943 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:25Z","lastTransitionTime":"2026-02-19T15:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.840207 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.840352 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.840373 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.840400 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.840421 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:25Z","lastTransitionTime":"2026-02-19T15:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.943371 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.943424 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.943442 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.943506 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:25 crc kubenswrapper[4810]: I0219 15:10:25.943526 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:25Z","lastTransitionTime":"2026-02-19T15:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.046445 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.046500 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.046515 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.046534 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.046548 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:26Z","lastTransitionTime":"2026-02-19T15:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.149885 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.149952 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.149971 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.149997 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.150016 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:26Z","lastTransitionTime":"2026-02-19T15:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.252219 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.252253 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.252262 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.252279 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.252288 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:26Z","lastTransitionTime":"2026-02-19T15:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.356519 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.356587 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.356606 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.356626 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.356639 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:26Z","lastTransitionTime":"2026-02-19T15:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.398637 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 05:45:25.644992925 +0000 UTC
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.438901 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 15:10:26 crc kubenswrapper[4810]: E0219 15:10:26.439228 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.459639 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.459679 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.459689 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.459706 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.459718 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:26Z","lastTransitionTime":"2026-02-19T15:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.562620 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.562678 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.562689 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.562713 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.562724 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:26Z","lastTransitionTime":"2026-02-19T15:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.666140 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.666204 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.666225 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.666254 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.666274 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:26Z","lastTransitionTime":"2026-02-19T15:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.770279 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.770343 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.770355 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.770372 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.770381 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:26Z","lastTransitionTime":"2026-02-19T15:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.873375 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.873463 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.873486 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.873520 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.873543 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:26Z","lastTransitionTime":"2026-02-19T15:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.976471 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.976531 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.976549 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.976574 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:26 crc kubenswrapper[4810]: I0219 15:10:26.976594 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:26Z","lastTransitionTime":"2026-02-19T15:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.080137 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.080202 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.080228 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.080261 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.080286 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:27Z","lastTransitionTime":"2026-02-19T15:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.183833 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.183910 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.183928 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.183953 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.183966 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:27Z","lastTransitionTime":"2026-02-19T15:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.287615 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.287679 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.287700 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.287726 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.287748 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:27Z","lastTransitionTime":"2026-02-19T15:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.391051 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.391115 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.391135 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.391166 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.391186 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:27Z","lastTransitionTime":"2026-02-19T15:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.399211 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 17:11:01.269717572 +0000 UTC Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.439101 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.439132 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:10:27 crc kubenswrapper[4810]: E0219 15:10:27.439453 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.439151 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:10:27 crc kubenswrapper[4810]: E0219 15:10:27.439752 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:10:27 crc kubenswrapper[4810]: E0219 15:10:27.439979 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.494420 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.494481 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.494502 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.494533 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.494554 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:27Z","lastTransitionTime":"2026-02-19T15:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.597814 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.597893 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.597907 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.597928 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.597942 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:27Z","lastTransitionTime":"2026-02-19T15:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.700394 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.700449 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.700462 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.700481 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.700497 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:27Z","lastTransitionTime":"2026-02-19T15:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.803697 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.803771 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.803793 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.803820 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.803839 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:27Z","lastTransitionTime":"2026-02-19T15:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.907650 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.907717 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.907729 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.907750 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:27 crc kubenswrapper[4810]: I0219 15:10:27.907763 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:27Z","lastTransitionTime":"2026-02-19T15:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.011084 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.011158 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.011182 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.011211 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.011228 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:28Z","lastTransitionTime":"2026-02-19T15:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.114741 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.114795 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.114813 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.114841 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.114860 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:28Z","lastTransitionTime":"2026-02-19T15:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.218275 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.218351 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.218371 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.218393 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.218408 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:28Z","lastTransitionTime":"2026-02-19T15:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.321213 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.321275 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.321293 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.321321 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.321381 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:28Z","lastTransitionTime":"2026-02-19T15:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.399523 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 21:49:09.552290039 +0000 UTC Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.423883 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.423961 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.423984 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.424011 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.424028 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:28Z","lastTransitionTime":"2026-02-19T15:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.439229 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:10:28 crc kubenswrapper[4810]: E0219 15:10:28.439448 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.527070 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.527146 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.527170 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.527199 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.527220 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:28Z","lastTransitionTime":"2026-02-19T15:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.630081 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.630170 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.630194 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.630225 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.630245 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:28Z","lastTransitionTime":"2026-02-19T15:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.734065 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.734255 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.734278 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.734306 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.734353 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:28Z","lastTransitionTime":"2026-02-19T15:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.838279 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.838372 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.838400 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.838432 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.838452 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:28Z","lastTransitionTime":"2026-02-19T15:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.941805 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.941882 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.941901 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.941963 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:28 crc kubenswrapper[4810]: I0219 15:10:28.941982 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:28Z","lastTransitionTime":"2026-02-19T15:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.045147 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.045207 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.045225 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.045252 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.045270 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:29Z","lastTransitionTime":"2026-02-19T15:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.148641 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.148708 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.148726 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.148753 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.148772 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:29Z","lastTransitionTime":"2026-02-19T15:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.154839 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.154930 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.154956 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.154990 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.155014 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:29Z","lastTransitionTime":"2026-02-19T15:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:29 crc kubenswrapper[4810]: E0219 15:10:29.176835 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:29Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.184281 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.184426 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.184444 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.184468 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.184481 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:29Z","lastTransitionTime":"2026-02-19T15:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:29 crc kubenswrapper[4810]: E0219 15:10:29.200850 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:29Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.207038 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.207086 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.207095 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.207115 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.207127 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:29Z","lastTransitionTime":"2026-02-19T15:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:29 crc kubenswrapper[4810]: E0219 15:10:29.226278 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:29Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.233207 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.233282 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.233303 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.233359 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.233382 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:29Z","lastTransitionTime":"2026-02-19T15:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:29 crc kubenswrapper[4810]: E0219 15:10:29.247826 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:29Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.253470 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.253595 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.253656 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.253692 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.253743 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:29Z","lastTransitionTime":"2026-02-19T15:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:29 crc kubenswrapper[4810]: E0219 15:10:29.269694 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:29Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:29 crc kubenswrapper[4810]: E0219 15:10:29.270159 4810 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.272603 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.272692 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.272710 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.272762 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.272780 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:29Z","lastTransitionTime":"2026-02-19T15:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.376725 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.376813 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.376853 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.376878 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.376892 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:29Z","lastTransitionTime":"2026-02-19T15:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.400481 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 00:14:23.235806766 +0000 UTC Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.438403 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.438525 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:29 crc kubenswrapper[4810]: E0219 15:10:29.438654 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.438415 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:10:29 crc kubenswrapper[4810]: E0219 15:10:29.438754 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1" Feb 19 15:10:29 crc kubenswrapper[4810]: E0219 15:10:29.439029 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.480681 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.480766 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.480837 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.480871 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.480932 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:29Z","lastTransitionTime":"2026-02-19T15:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.584302 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.584408 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.584426 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.584452 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.584468 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:29Z","lastTransitionTime":"2026-02-19T15:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.687280 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.687397 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.687423 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.687452 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.687469 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:29Z","lastTransitionTime":"2026-02-19T15:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.790055 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.790104 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.790114 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.790132 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.790144 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:29Z","lastTransitionTime":"2026-02-19T15:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.892544 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.892641 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.892667 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.892697 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.892719 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:29Z","lastTransitionTime":"2026-02-19T15:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.996276 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.996391 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.996415 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.996447 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:29 crc kubenswrapper[4810]: I0219 15:10:29.996472 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:29Z","lastTransitionTime":"2026-02-19T15:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.100254 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.100304 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.100386 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.100421 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.100439 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:30Z","lastTransitionTime":"2026-02-19T15:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.204231 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.204357 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.204395 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.204426 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.204448 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:30Z","lastTransitionTime":"2026-02-19T15:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.308632 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.308686 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.308703 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.308729 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.308747 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:30Z","lastTransitionTime":"2026-02-19T15:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.401516 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 12:06:02.13748365 +0000 UTC
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.412511 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.412573 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.412595 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.412624 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.412642 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:30Z","lastTransitionTime":"2026-02-19T15:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.439066 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 15:10:30 crc kubenswrapper[4810]: E0219 15:10:30.439236 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.516371 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.516430 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.516452 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.516476 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.516498 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:30Z","lastTransitionTime":"2026-02-19T15:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.620373 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.620418 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.620428 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.620446 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.620460 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:30Z","lastTransitionTime":"2026-02-19T15:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.724261 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.724357 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.724374 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.724398 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.724414 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:30Z","lastTransitionTime":"2026-02-19T15:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.829319 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.829399 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.829414 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.829434 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.829445 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:30Z","lastTransitionTime":"2026-02-19T15:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.932917 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.932965 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.932984 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.933011 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:30 crc kubenswrapper[4810]: I0219 15:10:30.933028 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:30Z","lastTransitionTime":"2026-02-19T15:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.036477 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.036569 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.036598 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.036634 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.036656 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:31Z","lastTransitionTime":"2026-02-19T15:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.139868 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.139908 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.139916 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.139932 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.139945 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:31Z","lastTransitionTime":"2026-02-19T15:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.243777 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.243832 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.243851 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.243879 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.243898 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:31Z","lastTransitionTime":"2026-02-19T15:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.346286 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.346319 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.346373 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.346397 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.346411 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:31Z","lastTransitionTime":"2026-02-19T15:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.402248 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 19:23:38.445231412 +0000 UTC
Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.438865 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.438931 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.439011 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 15:10:31 crc kubenswrapper[4810]: E0219 15:10:31.439055 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 15:10:31 crc kubenswrapper[4810]: E0219 15:10:31.439237 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 15:10:31 crc kubenswrapper[4810]: E0219 15:10:31.439400 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.454412 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.454509 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.454541 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.454582 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.454622 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:31Z","lastTransitionTime":"2026-02-19T15:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.464896 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb8c0b89e0d91440beb93dc0fe78cebe4ca4819
40af3c8affe5efd75d589980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bb8c0b89e0d91440beb93dc0fe78cebe4ca481940af3c8affe5efd75d589980\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:23Z\\\",\\\"message\\\":\\\"responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 15:10:23.623215 6458 services_controller.go:443] Built service openshift-machine-config-operator/machine-config-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.16\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9001, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0219 15:10:23.623235 6458 services_controller.go:444] Built service openshift-machine-config-operator/machine-config-controller LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0219 15:10:23.623244 6458 services_controller.go:445] Built service openshift-machine-config-operator/machine-config-controller LB template configs for network=default: []services.lbConfig(nil)\\\\nF0219 15:10:23.623259 6458 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8k7p5_openshift-ovn-kubernetes(c5a8a15c-53e8-4868-8feb-dcd4e83939a4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:31Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.482794 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:31Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.501636 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:31Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.517151 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:31Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.531317 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:31Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.547435 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:31Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.560092 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.560284 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.560408 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.560510 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.560598 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:31Z","lastTransitionTime":"2026-02-19T15:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.565038 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:31Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.586193 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:31Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.610384 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7171cb2bba77488fbb60ec533172b2240d1871e89bc7760ae5c9b67ee6924354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T15:10:31Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.627156 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2x9v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b72d3f7a-e418-4a21-af73-6a43ce3358c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2x9v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:31Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.647487 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef740fa90b279363e303fb824188d0a26097b06aa8eaa488c0c800a8caa8f10e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:31Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.664145 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.664207 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.664227 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.664254 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.664273 4810 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:31Z","lastTransitionTime":"2026-02-19T15:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.667354 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:31Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.682167 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06adf9a1-ec31-4acc-9864-41549913d3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://618a71f428d132079ed8a6756ea331ec8fd20c32c98bc00691f56fa0b12669aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d6389909245cbf6091dfb1ef53a497dd254228acd53c508c32c4345f86b7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l2g4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:31Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.701021 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac4287c9-a6d9-470d-820f-316c037a5d1a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8b81d39b5b02958a8f7fa7ec86ac9dd530f2c4674042f60b11209e0400433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://034d625f766dc7e2dd97fcedcfc0ca251971bc0be7e0d04ed4b2bcf3939905b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81860198cc657673df0021324ba4fde92aa6ffbde974993d0553a080872f7fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb662772f98f52da96cf3b347f7b4990f4daa4552da61d972ecb8875d74b01c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb662772f98f52da96cf3b347f7b4990f4daa4552da61d972ecb8875d74b01c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:31Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.716795 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:31Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.731965 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:31Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.748924 4810 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:31Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.766922 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.767005 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.767022 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.767048 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.767066 4810 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:31Z","lastTransitionTime":"2026-02-19T15:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.869682 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.869764 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.869780 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.869802 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.869817 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:31Z","lastTransitionTime":"2026-02-19T15:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.972461 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.973024 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.973044 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.973075 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:31 crc kubenswrapper[4810]: I0219 15:10:31.973093 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:31Z","lastTransitionTime":"2026-02-19T15:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.076732 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.076823 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.076881 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.076909 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.076937 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:32Z","lastTransitionTime":"2026-02-19T15:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.181222 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.181531 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.181612 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.181695 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.181797 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:32Z","lastTransitionTime":"2026-02-19T15:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.285715 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.285800 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.285845 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.285886 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.285913 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:32Z","lastTransitionTime":"2026-02-19T15:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.389475 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.389577 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.389616 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.389637 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.389654 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:32Z","lastTransitionTime":"2026-02-19T15:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.403075 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 13:40:42.395395938 +0000 UTC Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.438491 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:10:32 crc kubenswrapper[4810]: E0219 15:10:32.438692 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.492622 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.492674 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.492689 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.492710 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.492724 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:32Z","lastTransitionTime":"2026-02-19T15:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.595180 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.595257 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.595268 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.595292 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:32 crc kubenswrapper[4810]: I0219 15:10:32.595305 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:32Z","lastTransitionTime":"2026-02-19T15:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:33 crc kubenswrapper[4810]: I0219 15:10:33.404011 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 08:29:35.543344479 +0000 UTC
Feb 19 15:10:33 crc kubenswrapper[4810]: I0219 15:10:33.438702 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:10:33 crc kubenswrapper[4810]: I0219 15:10:33.438793 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 15:10:33 crc kubenswrapper[4810]: I0219 15:10:33.438702 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 15:10:33 crc kubenswrapper[4810]: E0219 15:10:33.438952 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1"
Feb 19 15:10:33 crc kubenswrapper[4810]: E0219 15:10:33.439102 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 15:10:33 crc kubenswrapper[4810]: E0219 15:10:33.439379 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 15:10:34 crc kubenswrapper[4810]: I0219 15:10:34.404965 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 18:05:30.523716276 +0000 UTC
Feb 19 15:10:34 crc kubenswrapper[4810]: I0219 15:10:34.439172 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 15:10:34 crc kubenswrapper[4810]: E0219 15:10:34.439432 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
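
[editor's note] The certificate_manager.go:356 records above report the same expiration (2026-02-24 05:53:03 UTC) yet a different rotation deadline on every pass. That is expected behavior: client-go's certificate manager re-draws the deadline at a random point between roughly 70% and 90% of the certificate's lifetime each time it checks, so the value jitters. A minimal Go sketch of that jitter follows; it is illustrative only, not the kubelet source, and the one-year lifetime is an assumption.

    package main

    import (
        "fmt"
        "math/rand"
        "time"
    )

    // nextRotationDeadline mimics the jitter client-go applies: a uniform
    // draw in [0.7, 0.9) of the certificate's total lifetime, measured
    // from notBefore. Re-drawing on each check explains the shifting
    // deadlines in the log above.
    func nextRotationDeadline(notBefore, notAfter time.Time) time.Time {
        total := notAfter.Sub(notBefore)
        jittered := time.Duration(float64(total) * (0.7 + 0.2*rand.Float64()))
        return notBefore.Add(jittered)
    }

    func main() {
        // Expiry taken from the log; the issuance time is assumed.
        notAfter := time.Date(2026, 2, 24, 5, 53, 3, 0, time.UTC)
        notBefore := notAfter.Add(-365 * 24 * time.Hour)
        for i := 0; i < 3; i++ {
            fmt.Println("rotation deadline:", nextRotationDeadline(notBefore, notAfter))
        }
    }
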
Feb 19 15:10:35 crc kubenswrapper[4810]: I0219 15:10:35.405607 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 01:45:55.086092659 +0000 UTC
Feb 19 15:10:35 crc kubenswrapper[4810]: I0219 15:10:35.439289 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 15:10:35 crc kubenswrapper[4810]: I0219 15:10:35.439341 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 15:10:35 crc kubenswrapper[4810]: E0219 15:10:35.439594 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 15:10:35 crc kubenswrapper[4810]: I0219 15:10:35.439684 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:10:35 crc kubenswrapper[4810]: E0219 15:10:35.439845 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 15:10:35 crc kubenswrapper[4810]: E0219 15:10:35.439932 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1"
Feb 19 15:10:36 crc kubenswrapper[4810]: I0219 15:10:36.406153 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 11:48:01.704585739 +0000 UTC
Feb 19 15:10:36 crc kubenswrapper[4810]: I0219 15:10:36.438648 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 15:10:36 crc kubenswrapper[4810]: E0219 15:10:36.438813 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 15:10:37 crc kubenswrapper[4810]: I0219 15:10:37.406664 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 17:25:43.244626457 +0000 UTC
Feb 19 15:10:37 crc kubenswrapper[4810]: I0219 15:10:37.439206 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:10:37 crc kubenswrapper[4810]: I0219 15:10:37.439476 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 15:10:37 crc kubenswrapper[4810]: I0219 15:10:37.439529 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 15:10:37 crc kubenswrapper[4810]: E0219 15:10:37.439741 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1"
Feb 19 15:10:37 crc kubenswrapper[4810]: E0219 15:10:37.439893 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 15:10:37 crc kubenswrapper[4810]: E0219 15:10:37.440199 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 15:10:37 crc kubenswrapper[4810]: I0219 15:10:37.469600 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Has your network provider started?"} Feb 19 15:10:37 crc kubenswrapper[4810]: I0219 15:10:37.753663 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:37 crc kubenswrapper[4810]: I0219 15:10:37.753767 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:37 crc kubenswrapper[4810]: I0219 15:10:37.753792 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:37 crc kubenswrapper[4810]: I0219 15:10:37.753827 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:37 crc kubenswrapper[4810]: I0219 15:10:37.753855 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:37Z","lastTransitionTime":"2026-02-19T15:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:37 crc kubenswrapper[4810]: I0219 15:10:37.856978 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:37 crc kubenswrapper[4810]: I0219 15:10:37.857058 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:37 crc kubenswrapper[4810]: I0219 15:10:37.857083 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:37 crc kubenswrapper[4810]: I0219 15:10:37.857116 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:37 crc kubenswrapper[4810]: I0219 15:10:37.857141 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:37Z","lastTransitionTime":"2026-02-19T15:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:37 crc kubenswrapper[4810]: I0219 15:10:37.959757 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:37 crc kubenswrapper[4810]: I0219 15:10:37.959809 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:37 crc kubenswrapper[4810]: I0219 15:10:37.959822 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:37 crc kubenswrapper[4810]: I0219 15:10:37.959842 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:37 crc kubenswrapper[4810]: I0219 15:10:37.959858 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:37Z","lastTransitionTime":"2026-02-19T15:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.062452 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.062508 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.062525 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.062551 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.062568 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:38Z","lastTransitionTime":"2026-02-19T15:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.164787 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.164838 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.164847 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.164864 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.164876 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:38Z","lastTransitionTime":"2026-02-19T15:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.268206 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.268271 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.268288 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.268318 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.268372 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:38Z","lastTransitionTime":"2026-02-19T15:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.371238 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.371286 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.371300 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.371319 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.371348 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:38Z","lastTransitionTime":"2026-02-19T15:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.406868 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 20:25:20.238390565 +0000 UTC Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.439548 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:10:38 crc kubenswrapper[4810]: E0219 15:10:38.439769 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.474514 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.474574 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.474592 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.474618 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.474639 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:38Z","lastTransitionTime":"2026-02-19T15:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.577772 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.577838 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.577849 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.577873 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.577887 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:38Z","lastTransitionTime":"2026-02-19T15:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.681436 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.681502 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.681530 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.681620 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.681651 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:38Z","lastTransitionTime":"2026-02-19T15:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.784607 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.784671 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.784688 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.784738 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.784760 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:38Z","lastTransitionTime":"2026-02-19T15:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.888154 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.888302 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.888568 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.888616 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.888680 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:38Z","lastTransitionTime":"2026-02-19T15:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.992362 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.992429 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.992458 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.992503 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:38 crc kubenswrapper[4810]: I0219 15:10:38.992528 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:38Z","lastTransitionTime":"2026-02-19T15:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.095259 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.095362 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.095391 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.095419 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.095437 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:39Z","lastTransitionTime":"2026-02-19T15:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.198257 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.198371 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.198404 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.198435 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.198455 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:39Z","lastTransitionTime":"2026-02-19T15:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.301715 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.301759 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.301771 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.301791 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.301809 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:39Z","lastTransitionTime":"2026-02-19T15:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.404721 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.404797 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.404814 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.404838 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.404853 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:39Z","lastTransitionTime":"2026-02-19T15:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.407978 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 00:06:42.182565952 +0000 UTC Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.438394 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.438452 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.438394 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:10:39 crc kubenswrapper[4810]: E0219 15:10:39.438632 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:10:39 crc kubenswrapper[4810]: E0219 15:10:39.438726 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:10:39 crc kubenswrapper[4810]: E0219 15:10:39.438799 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.439647 4810 scope.go:117] "RemoveContainer" containerID="4bb8c0b89e0d91440beb93dc0fe78cebe4ca481940af3c8affe5efd75d589980" Feb 19 15:10:39 crc kubenswrapper[4810]: E0219 15:10:39.439836 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8k7p5_openshift-ovn-kubernetes(c5a8a15c-53e8-4868-8feb-dcd4e83939a4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.490679 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.490757 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.490781 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.490815 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.490836 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:39Z","lastTransitionTime":"2026-02-19T15:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.509579 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b72d3f7a-e418-4a21-af73-6a43ce3358c1-metrics-certs\") pod \"network-metrics-daemon-2x9v9\" (UID: \"b72d3f7a-e418-4a21-af73-6a43ce3358c1\") " pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:39 crc kubenswrapper[4810]: E0219 15:10:39.509380 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:39Z is after 
2025-08-24T17:21:41Z" Feb 19 15:10:39 crc kubenswrapper[4810]: E0219 15:10:39.510300 4810 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 15:10:39 crc kubenswrapper[4810]: E0219 15:10:39.510412 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b72d3f7a-e418-4a21-af73-6a43ce3358c1-metrics-certs podName:b72d3f7a-e418-4a21-af73-6a43ce3358c1 nodeName:}" failed. No retries permitted until 2026-02-19 15:11:11.510376053 +0000 UTC m=+100.992406177 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b72d3f7a-e418-4a21-af73-6a43ce3358c1-metrics-certs") pod "network-metrics-daemon-2x9v9" (UID: "b72d3f7a-e418-4a21-af73-6a43ce3358c1") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.514224 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.514299 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.514357 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.514404 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.514429 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:39Z","lastTransitionTime":"2026-02-19T15:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:39 crc kubenswrapper[4810]: E0219 15:10:39.537232 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:39Z is after 
2025-08-24T17:21:41Z" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.543585 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.543654 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.543677 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.543705 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.543732 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:39Z","lastTransitionTime":"2026-02-19T15:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:39 crc kubenswrapper[4810]: E0219 15:10:39.566601 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:39Z is after 
2025-08-24T17:21:41Z" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.571852 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.571922 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.571944 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.571978 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.572004 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:39Z","lastTransitionTime":"2026-02-19T15:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:39 crc kubenswrapper[4810]: E0219 15:10:39.593684 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:39Z is after 
2025-08-24T17:21:41Z" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.599479 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.599567 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.599592 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.599626 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.599652 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:39Z","lastTransitionTime":"2026-02-19T15:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:39 crc kubenswrapper[4810]: E0219 15:10:39.620291 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:39Z is after 
2025-08-24T17:21:41Z" Feb 19 15:10:39 crc kubenswrapper[4810]: E0219 15:10:39.620476 4810 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.622836 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.622879 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.622892 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.622914 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.622927 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:39Z","lastTransitionTime":"2026-02-19T15:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.725739 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.725806 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.725823 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.725849 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.725872 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:39Z","lastTransitionTime":"2026-02-19T15:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.829072 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.829150 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.829175 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.829209 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.829236 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:39Z","lastTransitionTime":"2026-02-19T15:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.932048 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.932091 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.932107 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.932126 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:39 crc kubenswrapper[4810]: I0219 15:10:39.932142 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:39Z","lastTransitionTime":"2026-02-19T15:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.034890 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.035222 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.035314 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.035435 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.035511 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:40Z","lastTransitionTime":"2026-02-19T15:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.138894 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.138947 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.138967 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.138994 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.139012 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:40Z","lastTransitionTime":"2026-02-19T15:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.242903 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.242965 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.243002 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.243022 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.243035 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:40Z","lastTransitionTime":"2026-02-19T15:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.346826 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.346871 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.346881 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.346899 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.346910 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:40Z","lastTransitionTime":"2026-02-19T15:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.408365 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 10:23:11.065543924 +0000 UTC Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.439060 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:10:40 crc kubenswrapper[4810]: E0219 15:10:40.439284 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.450757 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.450844 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.450870 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.450904 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.450927 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:40Z","lastTransitionTime":"2026-02-19T15:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.554052 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.554129 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.554152 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.554184 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.554207 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:40Z","lastTransitionTime":"2026-02-19T15:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.658088 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.658168 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.658188 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.658216 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.658234 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:40Z","lastTransitionTime":"2026-02-19T15:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.762033 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.762118 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.762146 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.762179 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.762201 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:40Z","lastTransitionTime":"2026-02-19T15:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.864788 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.864848 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.864870 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.864898 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.864918 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:40Z","lastTransitionTime":"2026-02-19T15:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.967656 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.967715 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.967730 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.967754 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:40 crc kubenswrapper[4810]: I0219 15:10:40.967765 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:40Z","lastTransitionTime":"2026-02-19T15:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.070251 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.070355 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.070375 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.070408 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.070426 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:41Z","lastTransitionTime":"2026-02-19T15:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.173756 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.174128 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.174150 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.174171 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.174188 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:41Z","lastTransitionTime":"2026-02-19T15:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.276988 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.277038 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.277051 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.277071 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.277081 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:41Z","lastTransitionTime":"2026-02-19T15:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.379730 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.379798 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.379810 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.379832 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.379845 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:41Z","lastTransitionTime":"2026-02-19T15:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.409278 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 00:26:26.059493718 +0000 UTC Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.439017 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:41 crc kubenswrapper[4810]: E0219 15:10:41.439186 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.439554 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.439577 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:10:41 crc kubenswrapper[4810]: E0219 15:10:41.439757 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:10:41 crc kubenswrapper[4810]: E0219 15:10:41.439794 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.463521 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-19T15:10:41Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.479274 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:41Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.482571 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:41 crc 
kubenswrapper[4810]: I0219 15:10:41.482626 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.482636 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.482698 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.482712 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:41Z","lastTransitionTime":"2026-02-19T15:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.491862 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T15:10:41Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.511452 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb8c0b89e0d91440beb93dc0fe78cebe4ca481940af3c8affe5efd75d589980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bb8c0b89e0d91440beb93dc0fe78cebe4ca481940af3c8affe5efd75d589980\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:23Z\\\",\\\"message\\\":\\\"responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 15:10:23.623215 6458 services_controller.go:443] Built service openshift-machine-config-operator/machine-config-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.16\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9001, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0219 15:10:23.623235 6458 services_controller.go:444] Built service openshift-machine-config-operator/machine-config-controller LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0219 15:10:23.623244 6458 services_controller.go:445] Built service openshift-machine-config-operator/machine-config-controller LB template configs for network=default: []services.lbConfig(nil)\\\\nF0219 15:10:23.623259 6458 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8k7p5_openshift-ovn-kubernetes(c5a8a15c-53e8-4868-8feb-dcd4e83939a4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:41Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.526065 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:41Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.539315 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-02-19T15:10:41Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.552867 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}
],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:41Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.568209 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:41Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.585408 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.585707 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.585804 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.585899 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.585255 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef740fa90b279363e303fb824188d0a26097b06aa8eaa488c0c800a8caa8f10e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:41Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.585970 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:41Z","lastTransitionTime":"2026-02-19T15:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.601849 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:41Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.618239 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7171cb2bba77488fbb60ec533172b2240d1871e89bc7760ae5c9b67ee6924354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:41Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.629224 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2x9v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b72d3f7a-e418-4a21-af73-6a43ce3358c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2x9v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:41Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.643425 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08ecb1c-d94b-4563-8f47-7334d52bc0c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af7cc216f97bd72a2d9a57e54a49676e2828ec7d0a0a035f22261c40a6976a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59e5d74bcb7088e172f31cc207d421cd214b7d4a80382004fb30c167b04e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d59e5d74bcb7088e172f31cc207d421cd214b7d4a80382004fb30c167b04e14a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:41Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.656154 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:41Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.670645 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:41Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.682283 4810 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:41Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.688822 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.688873 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.688886 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.688907 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.688921 4810 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:41Z","lastTransitionTime":"2026-02-19T15:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.694494 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06adf9a1-ec31-4acc-9864-41549913d3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://618a71f428d132079ed8a6756ea331ec8fd20c32c98bc00691f56fa0b12669aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d6389909245cbf6091dfb1ef53a497dd254228acd53c508c32c4345f86b7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l2g4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:41Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.706210 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac4287c9-a6d9-470d-820f-316c037a5d1a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8b81d39b5b02958a8f7fa7ec86ac9dd530f2c4674042f60b11209e0400433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://034d625f766dc7e2dd97fcedcfc0ca251971bc0be7e0d04ed4b2bcf3939905b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81860198cc657673df0021324ba4fde92aa6ffbde974993d0553a080872f7fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb662772f98f52da96cf3b347f7b4990f4daa4552da61d972ecb8875d74b01c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb662772f98f52da96cf3b347f7b4990f4daa4552da61d972ecb8875d74b01c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:41Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.791830 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.791885 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.791900 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.791920 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.791930 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:41Z","lastTransitionTime":"2026-02-19T15:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.894964 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.895021 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.895067 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.895089 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.895103 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:41Z","lastTransitionTime":"2026-02-19T15:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.997850 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.997921 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.997940 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.997966 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:41 crc kubenswrapper[4810]: I0219 15:10:41.997983 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:41Z","lastTransitionTime":"2026-02-19T15:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.100770 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.100829 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.100844 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.100873 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.100890 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:42Z","lastTransitionTime":"2026-02-19T15:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.202905 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.202950 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.202962 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.202979 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.202991 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:42Z","lastTransitionTime":"2026-02-19T15:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.305885 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.305965 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.305989 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.306020 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.306039 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:42Z","lastTransitionTime":"2026-02-19T15:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.409401 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 02:10:05.177735383 +0000 UTC Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.409455 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.409555 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.409578 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.409610 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.409632 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:42Z","lastTransitionTime":"2026-02-19T15:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.438997 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:10:42 crc kubenswrapper[4810]: E0219 15:10:42.439179 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.513634 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.513701 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.513715 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.513738 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.513753 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:42Z","lastTransitionTime":"2026-02-19T15:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.617222 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.617285 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.617310 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.617348 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.617359 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:42Z","lastTransitionTime":"2026-02-19T15:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.720388 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.720438 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.720449 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.720468 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.720482 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:42Z","lastTransitionTime":"2026-02-19T15:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.823817 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.823855 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.823873 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.823891 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.823902 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:42Z","lastTransitionTime":"2026-02-19T15:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.926672 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.926707 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.926716 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.926731 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.926740 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:42Z","lastTransitionTime":"2026-02-19T15:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.964602 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bsztz_2a45a199-beeb-4972-b796-15c958fe99d3/kube-multus/0.log" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.964660 4810 generic.go:334] "Generic (PLEG): container finished" podID="2a45a199-beeb-4972-b796-15c958fe99d3" containerID="a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2" exitCode=1 Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.964691 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bsztz" event={"ID":"2a45a199-beeb-4972-b796-15c958fe99d3","Type":"ContainerDied","Data":"a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2"} Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.965120 4810 scope.go:117] "RemoveContainer" containerID="a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2" Feb 19 15:10:42 crc kubenswrapper[4810]: I0219 15:10:42.985431 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac4287c9-a6d9-470d-820f-316c037a5d1a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8b81d39b5b02958a8f7fa7ec86ac9dd530f2c4674042f60b11209e0400433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://034d625f766dc7e2dd97fcedcfc0ca251971bc0be7e0d04ed4b2bcf3939905b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81860198cc657673df0021324ba4fde92aa6ffbde974993d0553a080872f7fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb662772f98f52da96cf3b347f7b4990f4daa4552da61d972ecb8875d74b01c\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb662772f98f52da96cf3b347f7b4990f4daa4552da61d972ecb8875d74b01c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:42Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.001466 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:42Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.016745 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:43Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.030697 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.030742 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.030751 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.030774 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.030786 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:43Z","lastTransitionTime":"2026-02-19T15:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.031456 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:43Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.049754 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06adf9a1-ec31-4acc-9864-41549913d3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://618a71f428d132079ed8a6756ea331ec8fd20c32c98bc00691f56fa0b12669aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d6389909245cbf6091dfb1ef53a497dd254228acd53c508c32c4345f86b7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l2g4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:43Z is after 2025-08-24T17:21:41Z" Feb 19 
15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.068483 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:43Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.083360 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:43Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.105470 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:43Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.122834 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:43Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.134245 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.134283 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.134294 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.134311 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.134338 4810 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:43Z","lastTransitionTime":"2026-02-19T15:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.148788 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-scri
pt-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb8c0b89e0d91440beb93dc0fe78cebe4ca481940af3c8affe5efd75d589980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bb8c0b89e0d91440beb93dc0fe78cebe4ca481940af3c8affe5efd75d589980\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:23Z\\\",\\\"message\\\":\\\"responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 15:10:23.623215 6458 services_controller.go:443] Built service openshift-machine-config-operator/machine-config-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.16\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9001, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0219 15:10:23.623235 6458 services_controller.go:444] Built service openshift-machine-config-operator/machine-config-controller LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0219 15:10:23.623244 6458 services_controller.go:445] Built service openshift-machine-config-operator/machine-config-controller LB template configs for network=default: []services.lbConfig(nil)\\\\nF0219 15:10:23.623259 6458 ovnkube.go:137] failed to run 
ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8k7p5_openshift-ovn-kubernetes(c5a8a15c-53e8-4868-8feb-dcd4e83939a4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:43Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.168109 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:43Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.184121 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:43Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.200629 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:42Z\\\",\\\"message\\\":\\\"2026-02-19T15:09:56+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e927f559-f3e4-4463-8345-ac34840adcee\\\\n2026-02-19T15:09:56+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e927f559-f3e4-4463-8345-ac34840adcee to /host/opt/cni/bin/\\\\n2026-02-19T15:09:57Z [verbose] multus-daemon started\\\\n2026-02-19T15:09:57Z [verbose] Readiness Indicator file check\\\\n2026-02-19T15:10:42Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:43Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.215820 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2x9v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b72d3f7a-e418-4a21-af73-6a43ce3358c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2x9v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:43Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.226992 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08ecb1c-d94b-4563-8f47-7334d52bc0c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af7cc216f97bd72a2d9a57e54a49676e2828ec7d0a0a035f22261c40a6976a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59e5d74bcb7088e172f31cc207d421cd214b7d4a80382004fb30c167b04e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d59e5d74bcb7088e172f31cc207d421cd214b7d4a80382004fb30c167b04e14a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:43Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.237154 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.237182 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.237192 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.237209 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.237220 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:43Z","lastTransitionTime":"2026-02-19T15:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.245526 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef740fa90b279363e303fb824188d0a26097b06aa8eaa488c0c800a8caa8f10e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:43Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.259786 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:43Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.279486 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7171cb2bba77488fbb60ec533172b2240d1871e89bc7760ae5c9b67ee6924354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:43Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.340489 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.340561 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:43 crc 
kubenswrapper[4810]: I0219 15:10:43.340580 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.341059 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.341116 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:43Z","lastTransitionTime":"2026-02-19T15:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.409630 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 14:58:31.114020175 +0000 UTC
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.439318 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.439454 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.439592 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 15:10:43 crc kubenswrapper[4810]: E0219 15:10:43.440063 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 15:10:43 crc kubenswrapper[4810]: E0219 15:10:43.439791 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1"
Feb 19 15:10:43 crc kubenswrapper[4810]: E0219 15:10:43.440502 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.444977 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.445020 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.445039 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.445065 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.445086 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:43Z","lastTransitionTime":"2026-02-19T15:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.549898 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.550285 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.550297 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.550340 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.550358 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:43Z","lastTransitionTime":"2026-02-19T15:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.653705 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.653762 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.653780 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.653809 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.653829 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:43Z","lastTransitionTime":"2026-02-19T15:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.757074 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.757138 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.757156 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.757185 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.757218 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:43Z","lastTransitionTime":"2026-02-19T15:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.860768 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.860815 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.860834 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.860859 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.860877 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:43Z","lastTransitionTime":"2026-02-19T15:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.966022 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.966081 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.966099 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.966131 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.966151 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:43Z","lastTransitionTime":"2026-02-19T15:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.971993 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bsztz_2a45a199-beeb-4972-b796-15c958fe99d3/kube-multus/0.log"
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.972095 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bsztz" event={"ID":"2a45a199-beeb-4972-b796-15c958fe99d3","Type":"ContainerStarted","Data":"c1f575901c6e6174387489f4f3a7d8cfa46089db51556b6acc9245faaf36a9ca"}
Feb 19 15:10:43 crc kubenswrapper[4810]: I0219 15:10:43.988911 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:43Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.009155 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7171cb2bba77488fbb60ec533172b2240d1871e89bc7760ae5c9b67ee6924354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:44Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.024279 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2x9v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b72d3f7a-e418-4a21-af73-6a43ce3358c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2x9v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:44Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.036500 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08ecb1c-d94b-4563-8f47-7334d52bc0c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af7cc216f97bd72a2d9a57e54a49676e2828ec7d0a0a035f22261c40a6976a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59e5d74bcb7088e172f31cc207d421cd214b7d4a80382004fb30c167b04e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d59e5d74bcb7088e172f31cc207d421cd214b7d4a80382004fb30c167b04e14a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:44Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.052651 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef740fa90b279363e303fb824188d0a26097b06aa8eaa488c0c800a8caa8f10e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:44Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.067208 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:44Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.069491 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.069537 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.069549 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.069569 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.069583 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:44Z","lastTransitionTime":"2026-02-19T15:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.084568 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06adf9a1-ec31-4acc-9864-41549913d3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://618a71f428d132079ed8a6756ea331ec8fd20c32c98bc00691f56fa0b12669aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d6389909245cbf6091dfb1ef53a497dd254228acd53c508c32c4345f86b7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:06Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l2g4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:44Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.100360 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac4287c9-a6d9-470d-820f-316c037a5d1a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8b81d39b5b02958a8f7fa7ec86ac9dd530f2c4674042f60b11209e0400433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://034d625f766dc7e2dd97fcedcfc0ca251971bc0be7e0d04ed4b2bcf3939905b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81860198cc657673df0021324ba4fde92aa6ffbde974993d0553a080872f7fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recove
ry-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb662772f98f52da96cf3b347f7b4990f4daa4552da61d972ecb8875d74b01c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb662772f98f52da96cf3b347f7b4990f4daa4552da61d972ecb8875d74b01c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:44Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.118451 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:44Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.132740 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:44Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.145369 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:44Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.172404 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.172474 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.172489 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.172511 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.172526 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:44Z","lastTransitionTime":"2026-02-19T15:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.176964 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bb8c0b89e0d91440beb93dc0fe78cebe4ca4819
40af3c8affe5efd75d589980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bb8c0b89e0d91440beb93dc0fe78cebe4ca481940af3c8affe5efd75d589980\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:23Z\\\",\\\"message\\\":\\\"responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 15:10:23.623215 6458 services_controller.go:443] Built service openshift-machine-config-operator/machine-config-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.16\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9001, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0219 15:10:23.623235 6458 services_controller.go:444] Built service openshift-machine-config-operator/machine-config-controller LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0219 15:10:23.623244 6458 services_controller.go:445] Built service openshift-machine-config-operator/machine-config-controller LB template configs for network=default: []services.lbConfig(nil)\\\\nF0219 15:10:23.623259 6458 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8k7p5_openshift-ovn-kubernetes(c5a8a15c-53e8-4868-8feb-dcd4e83939a4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:44Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.192562 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:44Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.211629 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:44Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.225940 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:44Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.239028 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:44Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.252400 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:44Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.269289 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f575901c6e6174387489f4f3a7d8cfa46089db51556b6acc9245faaf36a9ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:42Z\\\",\\\"message\\\":\\\"2026-02-19T15:09:56+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e927f559-f3e4-4463-8345-ac34840adcee\\\\n2026-02-19T15:09:56+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e927f559-f3e4-4463-8345-ac34840adcee to /host/opt/cni/bin/\\\\n2026-02-19T15:09:57Z [verbose] multus-daemon started\\\\n2026-02-19T15:09:57Z [verbose] Readiness Indicator file check\\\\n2026-02-19T15:10:42Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:44Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.275253 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.275296 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.275309 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.275341 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.275354 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:44Z","lastTransitionTime":"2026-02-19T15:10:44Z","reason":"KubeletNotReady","message":"container 
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.275354 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:44Z","lastTransitionTime":"2026-02-19T15:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.378060 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.378122 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.378139 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.378167 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.378190 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:44Z","lastTransitionTime":"2026-02-19T15:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.410380 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 01:06:02.100117899 +0000 UTC
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.438990 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 15:10:44 crc kubenswrapper[4810]: E0219 15:10:44.439112 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.480013 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.480075 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.480093 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.480118 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
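[editor's note] From here on the node-status heartbeat repeats the same KubeletNotReady condition roughly every 100ms: the CRI runtime reports NetworkReady=false until a CNI configuration file appears in /etc/kubernetes/cni/net.d/ (which the network provider, OVN-Kubernetes here, writes once it is up). A rough stand-in for that readiness check, in Go; the directory is taken from the log, the helper itself is hypothetical:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// hasCNIConfig mimics the check behind "no CNI configuration file": the
// node counts as network-ready once a .conf/.conflist/.json file exists.
func hasCNIConfig(dir string) bool {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false
	}
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true
		}
	}
	return false
}

func main() {
	if !hasCNIConfig("/etc/kubernetes/cni/net.d") {
		fmt.Println("network not ready: no CNI configuration file")
		os.Exit(1)
	}
	fmt.Println("CNI configuration present")
}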
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.480138 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:44Z","lastTransitionTime":"2026-02-19T15:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.582963 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.583014 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.583028 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.583049 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.583061 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:44Z","lastTransitionTime":"2026-02-19T15:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.686301 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.686401 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.686420 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.686451 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.686470 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:44Z","lastTransitionTime":"2026-02-19T15:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.790152 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.790230 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.790249 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.790277 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.790299 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:44Z","lastTransitionTime":"2026-02-19T15:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.893129 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.893235 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.893260 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.893295 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.893318 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:44Z","lastTransitionTime":"2026-02-19T15:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.996826 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.996886 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.996902 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.996923 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:44 crc kubenswrapper[4810]: I0219 15:10:44.996940 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:44Z","lastTransitionTime":"2026-02-19T15:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.101128 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.101182 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.101199 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.101223 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.101237 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:45Z","lastTransitionTime":"2026-02-19T15:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.211488 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.211567 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.211591 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.211623 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.211647 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:45Z","lastTransitionTime":"2026-02-19T15:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.315096 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.315166 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.315184 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.315213 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.315234 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:45Z","lastTransitionTime":"2026-02-19T15:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.410968 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 21:52:45.771015556 +0000 UTC
Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.419576 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.419693 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.419714 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.419743 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.419763 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:45Z","lastTransitionTime":"2026-02-19T15:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.438947 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 15:10:45 crc kubenswrapper[4810]: E0219 15:10:45.439092 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.439118 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.438944 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:10:45 crc kubenswrapper[4810]: E0219 15:10:45.439310 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
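[editor's note] Each heartbeat also logs a different rotation deadline for the same kubelet-serving certificate (2025-11-25 here, 2025-12-21 a second earlier): client-go's certificate manager re-derives the deadline on every pass as a jittered point between roughly 70% and 90% of the certificate's lifetime, so each computation differs. A sketch of that calculation; NotAfter is from the log, while NotBefore and the one-year lifetime are assumed for illustration:

package main

import (
	"fmt"
	"math/rand"
	"time"
)

// rotationDeadline mirrors the jitter used by client-go's certificate
// manager: rotate at a random point in [70%, 90%) of the cert lifetime.
func rotationDeadline(notBefore, notAfter time.Time) time.Time {
	total := float64(notAfter.Sub(notBefore))
	return notBefore.Add(time.Duration(total * (0.7 + 0.2*rand.Float64())))
}

func main() {
	notAfter, _ := time.Parse(time.RFC3339, "2026-02-24T05:53:03Z")
	notBefore := notAfter.AddDate(-1, 0, 0) // assumed issue date, for illustration
	for i := 0; i < 3; i++ {
		// Prints a different deadline on every call, like the log above.
		fmt.Println("rotation deadline:", rotationDeadline(notBefore, notAfter))
	}
}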
pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1" Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.523426 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.523466 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.523475 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.523492 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.523502 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:45Z","lastTransitionTime":"2026-02-19T15:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.626655 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.626710 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.626721 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.626739 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.626750 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:45Z","lastTransitionTime":"2026-02-19T15:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.626750 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:45Z","lastTransitionTime":"2026-02-19T15:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.729880 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.729951 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.729968 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.729997 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.730015 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:45Z","lastTransitionTime":"2026-02-19T15:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.834489 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.834551 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.834569 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.834598 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.834616 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:45Z","lastTransitionTime":"2026-02-19T15:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.938425 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.938517 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.938540 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.938572 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:45 crc kubenswrapper[4810]: I0219 15:10:45.938592 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:45Z","lastTransitionTime":"2026-02-19T15:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.041815 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.041872 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.041889 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.041914 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.041932 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:46Z","lastTransitionTime":"2026-02-19T15:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.145856 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.145933 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.145956 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.145985 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.146010 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:46Z","lastTransitionTime":"2026-02-19T15:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.248351 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.248434 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.248456 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.248485 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.248505 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:46Z","lastTransitionTime":"2026-02-19T15:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.351668 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.351711 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.351749 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.351772 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.351786 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:46Z","lastTransitionTime":"2026-02-19T15:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.411557 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 08:22:30.987652663 +0000 UTC Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.439079 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:10:46 crc kubenswrapper[4810]: E0219 15:10:46.439383 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.455353 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.455408 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.455424 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.455447 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.455461 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:46Z","lastTransitionTime":"2026-02-19T15:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.558977 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.559081 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.559115 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.559154 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.559184 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:46Z","lastTransitionTime":"2026-02-19T15:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.662486 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.662561 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.662574 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.662600 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.662615 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:46Z","lastTransitionTime":"2026-02-19T15:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.875818 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.875922 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.875943 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.875974 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.875994 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:46Z","lastTransitionTime":"2026-02-19T15:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.979456 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.979510 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.979528 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.979553 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:46 crc kubenswrapper[4810]: I0219 15:10:46.979570 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:46Z","lastTransitionTime":"2026-02-19T15:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.083127 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.083192 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.083211 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.083238 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.083256 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:47Z","lastTransitionTime":"2026-02-19T15:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.186724 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.186794 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.186813 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.186841 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.186861 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:47Z","lastTransitionTime":"2026-02-19T15:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.290823 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.290890 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.290907 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.290937 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.290955 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:47Z","lastTransitionTime":"2026-02-19T15:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.394911 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.394978 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.394994 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.395023 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.395043 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:47Z","lastTransitionTime":"2026-02-19T15:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.412410 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 10:53:57.969622518 +0000 UTC Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.438985 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.439086 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.439014 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:10:47 crc kubenswrapper[4810]: E0219 15:10:47.439228 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:10:47 crc kubenswrapper[4810]: E0219 15:10:47.439550 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:10:47 crc kubenswrapper[4810]: E0219 15:10:47.439637 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.498743 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.498812 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.498833 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.498855 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.498874 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:47Z","lastTransitionTime":"2026-02-19T15:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.601584 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.601650 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.601726 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.601753 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.601773 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:47Z","lastTransitionTime":"2026-02-19T15:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.705101 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.705174 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.705200 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.705235 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.705262 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:47Z","lastTransitionTime":"2026-02-19T15:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.808791 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.808856 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.808870 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.808891 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.808906 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:47Z","lastTransitionTime":"2026-02-19T15:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.911521 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.911592 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.911607 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.911630 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:47 crc kubenswrapper[4810]: I0219 15:10:47.911644 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:47Z","lastTransitionTime":"2026-02-19T15:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.014307 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.014368 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.014378 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.014400 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.014410 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:48Z","lastTransitionTime":"2026-02-19T15:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.117449 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.117515 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.117531 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.117552 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.117585 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:48Z","lastTransitionTime":"2026-02-19T15:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.220941 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.221019 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.221031 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.221054 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.221067 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:48Z","lastTransitionTime":"2026-02-19T15:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.324716 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.324834 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.324853 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.324879 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.324896 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:48Z","lastTransitionTime":"2026-02-19T15:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.412987 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 19:54:47.930522842 +0000 UTC Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.427498 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.427544 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.427558 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.427584 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.427598 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:48Z","lastTransitionTime":"2026-02-19T15:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.438864 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:10:48 crc kubenswrapper[4810]: E0219 15:10:48.439042 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.531394 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.531467 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.531486 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.531520 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.531542 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:48Z","lastTransitionTime":"2026-02-19T15:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.637555 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.637622 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.637641 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.637670 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.637689 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:48Z","lastTransitionTime":"2026-02-19T15:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.740269 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.740378 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.740398 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.740426 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.740447 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:48Z","lastTransitionTime":"2026-02-19T15:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.844283 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.844371 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.844386 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.844410 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.844428 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:48Z","lastTransitionTime":"2026-02-19T15:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.947568 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.947661 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.947682 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.947709 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:48 crc kubenswrapper[4810]: I0219 15:10:48.947727 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:48Z","lastTransitionTime":"2026-02-19T15:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.051072 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.051137 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.051159 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.051214 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.051239 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:49Z","lastTransitionTime":"2026-02-19T15:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.154007 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.154135 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.154155 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.154181 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.154197 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:49Z","lastTransitionTime":"2026-02-19T15:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.257933 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.258030 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.258048 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.258077 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.258096 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:49Z","lastTransitionTime":"2026-02-19T15:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.361281 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.361393 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.361420 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.361446 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.361462 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:49Z","lastTransitionTime":"2026-02-19T15:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.413499 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 14:45:58.183236963 +0000 UTC Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.439004 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:49 crc kubenswrapper[4810]: E0219 15:10:49.439257 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.439043 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:10:49 crc kubenswrapper[4810]: E0219 15:10:49.439429 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.439035 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:10:49 crc kubenswrapper[4810]: E0219 15:10:49.439819 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.464460 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.464505 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.464521 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.464545 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.464561 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:49Z","lastTransitionTime":"2026-02-19T15:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.567655 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.567732 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.567750 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.567779 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.567798 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:49Z","lastTransitionTime":"2026-02-19T15:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.647346 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.647412 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.647425 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.647445 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.647460 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:49Z","lastTransitionTime":"2026-02-19T15:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:49 crc kubenswrapper[4810]: E0219 15:10:49.665419 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:49Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.669611 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.669645 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.669654 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.669668 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.669678 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:49Z","lastTransitionTime":"2026-02-19T15:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:49 crc kubenswrapper[4810]: E0219 15:10:49.684458 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:49Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.689816 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.689884 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.689907 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.689938 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.689959 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:49Z","lastTransitionTime":"2026-02-19T15:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:49 crc kubenswrapper[4810]: E0219 15:10:49.707739 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:49Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.712601 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.712686 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.712709 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.712737 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.712760 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:49Z","lastTransitionTime":"2026-02-19T15:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:49 crc kubenswrapper[4810]: E0219 15:10:49.728290 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:49Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.740071 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.740176 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.740192 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.740233 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.740252 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:49Z","lastTransitionTime":"2026-02-19T15:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:49 crc kubenswrapper[4810]: E0219 15:10:49.759704 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7961bd2b-8ad7-4e25-b313-8f82bef01c62\\\",\\\"systemUUID\\\":\\\"60bcb373-142f-4da9-846e-4d055863e63a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:49Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:49 crc kubenswrapper[4810]: E0219 15:10:49.759962 4810 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.762516 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.762564 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.762584 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.762612 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.762632 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:49Z","lastTransitionTime":"2026-02-19T15:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.866020 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.866088 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.866099 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.866120 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.866134 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:49Z","lastTransitionTime":"2026-02-19T15:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.969531 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.969643 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.969666 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.969697 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:49 crc kubenswrapper[4810]: I0219 15:10:49.969722 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:49Z","lastTransitionTime":"2026-02-19T15:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.072959 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.073015 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.073027 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.073049 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.073065 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:50Z","lastTransitionTime":"2026-02-19T15:10:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.176609 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.176693 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.176716 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.176752 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.176774 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:50Z","lastTransitionTime":"2026-02-19T15:10:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.279599 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.279651 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.279664 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.279685 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.279699 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:50Z","lastTransitionTime":"2026-02-19T15:10:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.383370 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.383425 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.383444 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.383473 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.383492 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:50Z","lastTransitionTime":"2026-02-19T15:10:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.414221 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 16:43:34.600184443 +0000 UTC Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.438687 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:10:50 crc kubenswrapper[4810]: E0219 15:10:50.439228 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.439572 4810 scope.go:117] "RemoveContainer" containerID="4bb8c0b89e0d91440beb93dc0fe78cebe4ca481940af3c8affe5efd75d589980" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.487541 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.488033 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.488047 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.488067 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.488080 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:50Z","lastTransitionTime":"2026-02-19T15:10:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.591129 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.591198 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.591217 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.591245 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.591264 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:50Z","lastTransitionTime":"2026-02-19T15:10:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.694610 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.694656 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.694673 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.694698 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.694715 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:50Z","lastTransitionTime":"2026-02-19T15:10:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.798140 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.798203 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.798221 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.798251 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.798272 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:50Z","lastTransitionTime":"2026-02-19T15:10:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.901277 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.901317 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.901339 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.901356 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:50 crc kubenswrapper[4810]: I0219 15:10:50.901367 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:50Z","lastTransitionTime":"2026-02-19T15:10:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.000488 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8k7p5_c5a8a15c-53e8-4868-8feb-dcd4e83939a4/ovnkube-controller/2.log" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.003133 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.003176 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.003191 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.003214 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.003231 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:51Z","lastTransitionTime":"2026-02-19T15:10:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.004290 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" event={"ID":"c5a8a15c-53e8-4868-8feb-dcd4e83939a4","Type":"ContainerStarted","Data":"824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4"} Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.005106 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.026231 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef740fa90b279363e303fb824188d0a26097b06aa8eaa488c0c800a8caa8f10e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.045169 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.066468 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7171cb2bba77488fbb60ec533172b2240d1871e89bc7760ae5c9b67ee6924354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.082134 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2x9v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b72d3f7a-e418-4a21-af73-6a43ce3358c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2x9v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.097137 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08ecb1c-d94b-4563-8f47-7334d52bc0c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af7cc216f97bd72a2d9a57e54a49676e2828ec7d0a0a035f22261c40a6976a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59e5d74bcb7088e172f31cc207d421cd214b7d4a80382004fb30c167b04e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d59e5d74bcb7088e172f31cc207d421cd214b7d4a80382004fb30c167b04e14a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.105903 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.105959 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.105981 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.106011 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.106031 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:51Z","lastTransitionTime":"2026-02-19T15:10:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.112545 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.128064 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.140907 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06adf9a1-ec31-4acc-9864-41549913d3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://618a71f428d132079ed8a6756ea331ec8fd20c32c98bc00691f56fa0b12669aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d6389909245cbf6091dfb1ef53a497dd254228acd53c508c32c4345f86b7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:
06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l2g4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.156017 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac4287c9-a6d9-470d-820f-316c037a5d1a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8b81d39b5b02958a8f7fa7ec86ac9dd530f2c4674042f60b11209e0400433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://034d625f766dc7e2dd97fcedcfc0ca251971bc0be7e0d04ed4b2bcf3939905b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81860198cc657673df0021324ba4fde92aa6ffbde974993d0553a080872f7fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-sche
duler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb662772f98f52da96cf3b347f7b4990f4daa4552da61d972ecb8875d74b01c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb662772f98f52da96cf3b347f7b4990f4daa4552da61d972ecb8875d74b01c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.175065 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.199834 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.209513 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.209550 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.209568 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.209588 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.209603 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:51Z","lastTransitionTime":"2026-02-19T15:10:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.236375 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.262712 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bb8c0b89e0d91440beb93dc0fe78cebe4ca481940af3c8affe5efd75d589980\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:23Z\\\",\\\"message\\\":\\\"responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 15:10:23.623215 6458 services_controller.go:443] Built service openshift-machine-config-operator/machine-config-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.16\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9001, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0219 15:10:23.623235 6458 services_controller.go:444] Built service openshift-machine-config-operator/machine-config-controller LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0219 15:10:23.623244 6458 services_controller.go:445] Built service openshift-machine-config-operator/machine-config-controller LB template configs for network=default: []services.lbConfig(nil)\\\\nF0219 15:10:23.623259 6458 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.280312 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.294370 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.312479 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.312532 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.312543 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.312561 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.312573 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:51Z","lastTransitionTime":"2026-02-19T15:10:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.316124 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f575901c6e6174387489f4f3a7d8cfa46089db51556b6acc9245faaf36a9ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:42Z\\\",\\\"message\\\":\\\"2026-02-19T15:09:56+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e927f559-f3e4-4463-8345-ac34840adcee\\\\n2026-02-19T15:09:56+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e927f559-f3e4-4463-8345-ac34840adcee to /host/opt/cni/bin/\\\\n2026-02-19T15:09:57Z [verbose] multus-daemon started\\\\n2026-02-19T15:09:57Z [verbose] Readiness Indicator file 
check\\\\n2026-02-19T15:10:42Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.332818 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.403107 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.414473 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 16:57:16.645382422 +0000 UTC Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.414792 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.414814 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.414822 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.414837 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.414847 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:51Z","lastTransitionTime":"2026-02-19T15:10:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.438624 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.438699 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.438624 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:51 crc kubenswrapper[4810]: E0219 15:10:51.438793 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:10:51 crc kubenswrapper[4810]: E0219 15:10:51.438900 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1" Feb 19 15:10:51 crc kubenswrapper[4810]: E0219 15:10:51.439018 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.451069 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.470823 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bb8c0b89e0d91440beb93dc0fe78cebe4ca481940af3c8affe5efd75d589980\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:23Z\\\",\\\"message\\\":\\\"responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 15:10:23.623215 6458 services_controller.go:443] Built service openshift-machine-config-operator/machine-config-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.16\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9001, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0219 15:10:23.623235 6458 services_controller.go:444] Built service openshift-machine-config-operator/machine-config-controller LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0219 15:10:23.623244 6458 services_controller.go:445] Built service openshift-machine-config-operator/machine-config-controller LB template configs for network=default: []services.lbConfig(nil)\\\\nF0219 15:10:23.623259 6458 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.487010 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.501795 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.514432 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.517700 4810 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.517739 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.517751 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.517770 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.517784 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:51Z","lastTransitionTime":"2026-02-19T15:10:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.529024 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.539670 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.552734 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f575901c6e6174387489f4f3a7d8cfa46089db51556b6acc9245faaf36a9ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:42Z\\\",\\\"message\\\":\\\"2026-02-19T15:09:56+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e927f559-f3e4-4463-8345-ac34840adcee\\\\n2026-02-19T15:09:56+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e927f559-f3e4-4463-8345-ac34840adcee to /host/opt/cni/bin/\\\\n2026-02-19T15:09:57Z [verbose] multus-daemon started\\\\n2026-02-19T15:09:57Z [verbose] Readiness Indicator file check\\\\n2026-02-19T15:10:42Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.567568 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.585209 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7171cb2bba77488fbb60ec533172b2240d1871e89bc7760ae5c9b67ee6924354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.597650 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2x9v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b72d3f7a-e418-4a21-af73-6a43ce3358c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2x9v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.610627 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08ecb1c-d94b-4563-8f47-7334d52bc0c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af7cc216f97bd72a2d9a57e54a49676e2828ec7d0a0a035f22261c40a6976a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59e5d74bcb7088e172f31cc207d421cd214b7d4a80382004fb30c167b04e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d59e5d74bcb7088e172f31cc207d421cd214b7d4a80382004fb30c167b04e14a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.620030 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.620066 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.620079 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.620098 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.620111 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:51Z","lastTransitionTime":"2026-02-19T15:10:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.629772 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef740fa90b279363e303fb824188d0a26097b06aa8eaa488c0c800a8caa8f10e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.640598 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.654578 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06adf9a1-ec31-4acc-9864-41549913d3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://618a71f428d132079ed8a6756ea331ec8fd20c32c98bc00691f56fa0b12669aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d6389909245cbf6091dfb1ef53a497dd254228acd53c508c32c4345f86b7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l2g4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 
15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.667432 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac4287c9-a6d9-470d-820f-316c037a5d1a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8b81d39b5b02958a8f7fa7ec86ac9dd530f2c4674042f60b11209e0400433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://034d625f766dc7e2dd97fcedcfc0ca251971bc0be7e0d04ed4b2bcf3939905b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81860198cc657673df0021324ba4fde92aa6ffbde974993d0553a080872f7fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb662772f98f52da96cf3b347f7b4990f4daa4552da61d972ecb8875d74b01c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb662772f98f52da96cf3b347f7b4990f4daa4552da61d972ecb8875d74b01c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.690149 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.722661 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.722698 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.722709 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.722727 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.722738 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:51Z","lastTransitionTime":"2026-02-19T15:10:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.733626 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:51Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.825717 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.825761 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.825774 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.825793 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.825806 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:51Z","lastTransitionTime":"2026-02-19T15:10:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.928452 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.928524 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.928547 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.928575 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:51 crc kubenswrapper[4810]: I0219 15:10:51.928594 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:51Z","lastTransitionTime":"2026-02-19T15:10:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.012278 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8k7p5_c5a8a15c-53e8-4868-8feb-dcd4e83939a4/ovnkube-controller/3.log" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.014490 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8k7p5_c5a8a15c-53e8-4868-8feb-dcd4e83939a4/ovnkube-controller/2.log" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.020006 4810 generic.go:334] "Generic (PLEG): container finished" podID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerID="824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4" exitCode=1 Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.020105 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" event={"ID":"c5a8a15c-53e8-4868-8feb-dcd4e83939a4","Type":"ContainerDied","Data":"824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4"} Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.020712 4810 scope.go:117] "RemoveContainer" containerID="4bb8c0b89e0d91440beb93dc0fe78cebe4ca481940af3c8affe5efd75d589980" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.020998 4810 scope.go:117] "RemoveContainer" containerID="824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4" Feb 19 15:10:52 crc kubenswrapper[4810]: E0219 15:10:52.021198 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8k7p5_openshift-ovn-kubernetes(c5a8a15c-53e8-4868-8feb-dcd4e83939a4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.031825 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.031870 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.031885 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.031909 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.031923 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:52Z","lastTransitionTime":"2026-02-19T15:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.040510 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:52Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.059842 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7171cb2bba77488fbb60ec533172b2240d1871e89bc7760ae5c9b67ee6924354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:52Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.071977 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2x9v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b72d3f7a-e418-4a21-af73-6a43ce3358c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2x9v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:52Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.084941 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08ecb1c-d94b-4563-8f47-7334d52bc0c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af7cc216f97bd72a2d9a57e54a49676e2828ec7d0a0a035f22261c40a6976a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59e5d74bcb7088e172f31cc207d421cd214b7d4a80382004fb30c167b04e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d59e5d74bcb7088e172f31cc207d421cd214b7d4a80382004fb30c167b04e14a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:52Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.103391 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef740fa90b279363e303fb824188d0a26097b06aa8eaa488c0c800a8caa8f10e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:52Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.117475 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:52Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.131590 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06adf9a1-ec31-4acc-9864-41549913d3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://618a71f428d132079ed8a6756ea331ec8fd20c32c98bc00691f56fa0b12669aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d6389909245cbf6091dfb1ef53a497dd254228acd53c508c32c4345f86b7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l2g4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:52Z is after 2025-08-24T17:21:41Z" Feb 19 
15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.136104 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.136140 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.136154 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.136175 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.136186 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:52Z","lastTransitionTime":"2026-02-19T15:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.148053 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac4287c9-a6d9-470d-820f-316c037a5d1a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8b81d39b5b02958a8f7fa7ec86ac9dd530f2c4674042f60b11209e0400433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://034d625f766dc7e2dd97fcedcfc0ca251971bc0be7e0d04ed4b2bcf3939905b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81860198cc657673df0021324ba4fde92aa6ffbde974993d0553a080872f7fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb662772f98f52da96cf3b347f7b4990f4daa4552da61d972ecb8875d74b01c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb662772f98f52da96cf3b347f7b4990f4daa4552da61d972ecb8875d74b01c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:52Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.165935 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:52Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.181414 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:52Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.195947 4810 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:52Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.216551 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bb8c0b89e0d91440beb93dc0fe78cebe4ca481940af3c8affe5efd75d589980\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:23Z\\\",\\\"message\\\":\\\"responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 15:10:23.623215 6458 services_controller.go:443] Built service openshift-machine-config-operator/machine-config-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.16\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9001, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0219 15:10:23.623235 6458 services_controller.go:444] Built service openshift-machine-config-operator/machine-config-controller LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0219 15:10:23.623244 6458 services_controller.go:445] Built service openshift-machine-config-operator/machine-config-controller LB template configs for network=default: []services.lbConfig(nil)\\\\nF0219 15:10:23.623259 6458 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:51Z\\\",\\\"message\\\":\\\"from k8s.io/client-go/informers/factory.go:160\\\\nI0219 15:10:51.487438 6860 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 15:10:51.487453 6860 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 15:10:51.487498 6860 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 15:10:51.487670 6860 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 15:10:51.487715 6860 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0219 15:10:51.487769 6860 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 15:10:51.487918 6860 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 15:10:51.487982 6860 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 15:10:51.489213 6860 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\
\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:52Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.239796 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.239871 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.239896 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.239933 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.239957 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:52Z","lastTransitionTime":"2026-02-19T15:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.242544 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:52Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.264962 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:52Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.280159 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:52Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.295460 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:52Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.317107 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:52Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.339661 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f575901c6e6174387489f4f3a7d8cfa46089db51556b6acc9245faaf36a9ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:42Z\\\",\\\"message\\\":\\\"2026-02-19T15:09:56+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e927f559-f3e4-4463-8345-ac34840adcee\\\\n2026-02-19T15:09:56+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e927f559-f3e4-4463-8345-ac34840adcee to /host/opt/cni/bin/\\\\n2026-02-19T15:09:57Z [verbose] multus-daemon started\\\\n2026-02-19T15:09:57Z [verbose] Readiness Indicator file check\\\\n2026-02-19T15:10:42Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:52Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.343108 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.343161 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.343173 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.343195 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.343207 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:52Z","lastTransitionTime":"2026-02-19T15:10:52Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.414791 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 21:53:42.729029403 +0000 UTC Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.438802 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:10:52 crc kubenswrapper[4810]: E0219 15:10:52.439136 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.445945 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.445987 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.445997 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.446015 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.446028 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:52Z","lastTransitionTime":"2026-02-19T15:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.549527 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.549880 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.550019 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.550208 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.550374 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:52Z","lastTransitionTime":"2026-02-19T15:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.654303 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.654404 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.654424 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.654450 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.654467 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:52Z","lastTransitionTime":"2026-02-19T15:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.757521 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.757584 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.757603 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.757630 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.757649 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:52Z","lastTransitionTime":"2026-02-19T15:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.860853 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.861197 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.861406 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.861576 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.861716 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:52Z","lastTransitionTime":"2026-02-19T15:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.965316 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.965407 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.965423 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.965449 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:52 crc kubenswrapper[4810]: I0219 15:10:52.965466 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:52Z","lastTransitionTime":"2026-02-19T15:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.028594 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8k7p5_c5a8a15c-53e8-4868-8feb-dcd4e83939a4/ovnkube-controller/3.log" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.034931 4810 scope.go:117] "RemoveContainer" containerID="824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4" Feb 19 15:10:53 crc kubenswrapper[4810]: E0219 15:10:53.035714 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8k7p5_openshift-ovn-kubernetes(c5a8a15c-53e8-4868-8feb-dcd4e83939a4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.059067 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bsztz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a45a199-beeb-4972-b796-15c958fe99d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f575901c6e6174387489f4f3a7d8cfa46089db51556b6acc9245faaf36a9ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:42Z\\\",\\\"message\\\":\\\"2026-02-19T15:09:56+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e927f559-f3e4-4463-8345-ac34840adcee\\\\n2026-02-19T15:09:56+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e927f559-f3e4-4463-8345-ac34840adcee to /host/opt/cni/bin/\\\\n2026-02-19T15:09:57Z [verbose] multus-daemon started\\\\n2026-02-19T15:09:57Z [verbose] Readiness Indicator file check\\\\n2026-02-19T15:10:42Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcg8h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bsztz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.068716 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.068776 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.068800 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.068828 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.068851 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:53Z","lastTransitionTime":"2026-02-19T15:10:53Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.082599 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.104750 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ba23405a6182e281d7d423f13e84a03c0f68cf69169f59fe1cd0d5881103c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.125805 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6694b0d-3264-43eb-bd52-5088c7d2bf15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef740fa90b279363e303fb824188d0a26097b06aa8eaa488c0c800a8caa8f10e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T15:09:51Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 15:09:51.508430 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 15:09:51.508603 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 15:09:51.509986 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1367038872/tls.crt::/tmp/serving-cert-1367038872/tls.key\\\\\\\"\\\\nI0219 15:09:51.722990 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 15:09:51.726068 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 15:09:51.726086 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 15:09:51.726104 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 15:09:51.726110 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 15:09:51.731095 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 15:09:51.731120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0219 15:09:51.731119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0219 15:09:51.731126 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 15:09:51.731136 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 15:09:51.731141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 15:09:51.731145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 15:09:51.731149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 15:09:51.734010 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.146701 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.171682 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.171778 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.171801 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.171832 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.171854 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:53Z","lastTransitionTime":"2026-02-19T15:10:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.174912 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"247e948b-3c17-4675-bd1c-f894b02d2817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7171cb2bba77488fbb60ec533172b2240d1871e89bc7760ae5c9b67ee6924354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cbe5a969ecab49494c22bccec30a5b7050699729a8ae6244a4bb016ed9cbe64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395e0b7a03b78f71b2823c72ea754e1909bde653a7107597bf07b6e664ba40dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a0126c8bea39a89fe488d49dcb1e47063abc9839d148f348a7ce592038a450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9864da53b0851b0c4eadd5c11669af9b9e539660908ca14f44618824b5e4158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301c743749a2bcaee82a70d3c34a6cf13f880ef9986c390e279a7c056f9a99ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075e06e40beb99d2338c5850692a5e89314b6342cc72007d75b602ac21a90d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmbsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.193028 4810 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-2x9v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b72d3f7a-e418-4a21-af73-6a43ce3358c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2x9v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.212008 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08ecb1c-d94b-4563-8f47-7334d52bc0c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af7cc216f97bd72a2d9a57e54a49676e2828ec7d0a0a035f22261c40a6976a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59e5d74bcb7088e172f31cc207d421cd214b7d4a80382004fb30c167b04e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d59e5d74bcb7088e172f31cc207d421cd214b7d4a80382004fb30c167b04e14a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.235023 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfbf74ef-1e94-4826-8583-42b2e246ccf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f26baf8d6384c9133f7da338479eac54f1aa88609e4b0854078d4e85e8bf05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjrkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t499d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.258268 4810 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-t9jnq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60d7228b-14bd-4988-8dca-cb89f487ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb03eee4bad3a037ff0d61a204c9c86a839e9d684dbb89a433a50afa9cfcfcf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gzvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9jnq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.275418 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.275493 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.275517 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.275551 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.275575 4810 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:53Z","lastTransitionTime":"2026-02-19T15:10:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.282099 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06adf9a1-ec31-4acc-9864-41549913d3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://618a71f428d132079ed8a6756ea331ec8fd20c32c98bc00691f56fa0b12669aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d6389909245cbf6091dfb1ef53a497dd254228acd53c508c32c4345f86b7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:10:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh6qr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:10:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l2g4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.304773 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac4287c9-a6d9-470d-820f-316c037a5d1a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8b81d39b5b02958a8f7fa7ec86ac9dd530f2c4674042f60b11209e0400433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://034d625f766dc7e2dd97fcedcfc0ca251971bc0be7e0d04ed4b2bcf3939905b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81860198cc657673df0021324ba4fde92aa6ffbde974993d0553a080872f7fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb662772f98f52da96cf3b347f7b4990f4daa4552da61d972ecb8875d74b01c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb662772f98f52da96cf3b347f7b4990f4daa4552da61d972ecb8875d74b01c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.328236 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.353611 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db08d13283889d8c43ab89157e2c33f300359ecf66b548100b30ddea3da2be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8472b62f0ac04884cd8a9d342c063860188b0f17e1ed2201c7ea7288fb44d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.370702 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-flbx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cc3ea69-b881-4fd4-ad4d-42803f27865b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b26bfbc15981836877e6ddc6aea3fb26a4df1be38ac67c76f8a7a8f6b84b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqph9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-flbx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.377931 4810 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.377965 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.377980 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.378000 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.378016 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:53Z","lastTransitionTime":"2026-02-19T15:10:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.404558 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://824c8634103fa360f2c3a6cb4c22c0185f87e2de
6475beadb412027255de2cd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T15:10:51Z\\\",\\\"message\\\":\\\"from k8s.io/client-go/informers/factory.go:160\\\\nI0219 15:10:51.487438 6860 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 15:10:51.487453 6860 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 15:10:51.487498 6860 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 15:10:51.487670 6860 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 15:10:51.487715 6860 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 15:10:51.487769 6860 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 15:10:51.487918 6860 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 15:10:51.487982 6860 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 15:10:51.489213 6860 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T15:10:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8k7p5_openshift-ovn-kubernetes(c5a8a15c-53e8-4868-8feb-dcd4e83939a4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T15:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T15:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7xqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8k7p5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.415778 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 07:38:13.206014172 +0000 UTC Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.425315 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d74a04f-ffb6-47fb-8823-2fd7f3df4f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2070ce19c777ad791dfd3e5d99eded43e479f4cbbaefe96c5ed8525aa86e5ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5559fbe43f7e18617e2a46182d621a6310ae4d0b10548bae2bb098588285e081\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fbf8815e560483199cc8f453952895d3e5d831e7c52671b204187dc686fa4ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T15:09:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.439500 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.439524 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:53 crc kubenswrapper[4810]: E0219 15:10:53.439705 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.439809 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:10:53 crc kubenswrapper[4810]: E0219 15:10:53.439965 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:10:53 crc kubenswrapper[4810]: E0219 15:10:53.440124 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.449321 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T15:09:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60fe11e60f880bde640f63640fdc6a1b66e3044dcd975c711c45a15aed3fb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T15:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T15:10:53Z is after 2025-08-24T17:21:41Z" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.481516 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.481576 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.481589 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.481609 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.481623 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:53Z","lastTransitionTime":"2026-02-19T15:10:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.585249 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.585318 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.585385 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.585442 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.585461 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:53Z","lastTransitionTime":"2026-02-19T15:10:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.688476 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.688555 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.688574 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.688601 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.688621 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:53Z","lastTransitionTime":"2026-02-19T15:10:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.792345 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.792453 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.792469 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.792490 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.792504 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:53Z","lastTransitionTime":"2026-02-19T15:10:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.895163 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.895240 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.895299 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.895344 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.895361 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:53Z","lastTransitionTime":"2026-02-19T15:10:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.998144 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.998218 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.998238 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.998271 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:53 crc kubenswrapper[4810]: I0219 15:10:53.998291 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:53Z","lastTransitionTime":"2026-02-19T15:10:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.101638 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.101704 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.101722 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.101751 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.101769 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:54Z","lastTransitionTime":"2026-02-19T15:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.204928 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.204966 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.204977 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.204994 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.205006 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:54Z","lastTransitionTime":"2026-02-19T15:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.308396 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.308481 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.308513 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.308541 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.308554 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:54Z","lastTransitionTime":"2026-02-19T15:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.412210 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.412290 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.412309 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.412381 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.412405 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:54Z","lastTransitionTime":"2026-02-19T15:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.416407 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 16:03:54.26117333 +0000 UTC Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.438947 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:10:54 crc kubenswrapper[4810]: E0219 15:10:54.439280 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.515488 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.515557 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.515579 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.515609 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.515631 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:54Z","lastTransitionTime":"2026-02-19T15:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.619396 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.619466 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.619485 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.619513 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.619533 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:54Z","lastTransitionTime":"2026-02-19T15:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.722615 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.722683 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.722708 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.722745 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.722774 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:54Z","lastTransitionTime":"2026-02-19T15:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.826441 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.826502 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.826519 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.826547 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.826565 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:54Z","lastTransitionTime":"2026-02-19T15:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.929779 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.929854 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.929871 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.929903 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:54 crc kubenswrapper[4810]: I0219 15:10:54.929924 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:54Z","lastTransitionTime":"2026-02-19T15:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.033191 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.033268 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.033286 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.033315 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.033371 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:55Z","lastTransitionTime":"2026-02-19T15:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.136630 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.136679 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.136695 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.136720 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.136737 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:55Z","lastTransitionTime":"2026-02-19T15:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.239692 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.240269 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.240455 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.240615 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.240782 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:55Z","lastTransitionTime":"2026-02-19T15:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.268448 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:10:55 crc kubenswrapper[4810]: E0219 15:10:55.268639 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:59.268600186 +0000 UTC m=+148.750630350 (durationBeforeRetry 1m4s). 
Feb 19 15:10:55 crc kubenswrapper[4810]: E0219 15:10:55.268639 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:59.268600186 +0000 UTC m=+148.750630350 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.343934 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.343987 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.344003 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.344028 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.344045 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:55Z","lastTransitionTime":"2026-02-19T15:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.369728 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 15:10:55 crc kubenswrapper[4810]: E0219 15:10:55.370000 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 19 15:10:55 crc kubenswrapper[4810]: E0219 15:10:55.370047 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 19 15:10:55 crc kubenswrapper[4810]: E0219 15:10:55.370072 4810 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 19 15:10:55 crc kubenswrapper[4810]: E0219 15:10:55.370159 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 15:11:59.370125447 +0000 UTC m=+148.852155621 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:10:55 crc kubenswrapper[4810]: E0219 15:10:55.370300 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 15:10:55 crc kubenswrapper[4810]: E0219 15:10:55.370386 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 15:10:55 crc kubenswrapper[4810]: E0219 15:10:55.370410 4810 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:10:55 crc kubenswrapper[4810]: E0219 15:10:55.370542 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 15:11:59.370513946 +0000 UTC m=+148.852544100 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.370732 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.370931 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:10:55 crc kubenswrapper[4810]: E0219 15:10:55.371061 4810 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 15:10:55 crc kubenswrapper[4810]: E0219 15:10:55.371142 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-19 15:11:59.371120931 +0000 UTC m=+148.853151095 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.371515 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:10:55 crc kubenswrapper[4810]: E0219 15:10:55.371612 4810 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 15:10:55 crc kubenswrapper[4810]: E0219 15:10:55.371904 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 15:11:59.37187856 +0000 UTC m=+148.853908714 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.417058 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 19:21:41.604414512 +0000 UTC Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.438404 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.438484 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.438404 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:10:55 crc kubenswrapper[4810]: E0219 15:10:55.438587 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:10:55 crc kubenswrapper[4810]: E0219 15:10:55.438717 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1" Feb 19 15:10:55 crc kubenswrapper[4810]: E0219 15:10:55.438785 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.446051 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.446254 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.446388 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.446556 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.446689 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:55Z","lastTransitionTime":"2026-02-19T15:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.550238 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.550295 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.550318 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.550409 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.550435 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:55Z","lastTransitionTime":"2026-02-19T15:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.654291 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.654455 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.654480 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.654518 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.654545 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:55Z","lastTransitionTime":"2026-02-19T15:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.757538 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.757625 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.757651 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.757685 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.757708 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:55Z","lastTransitionTime":"2026-02-19T15:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.861741 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.862108 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.862233 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.862352 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.862451 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:55Z","lastTransitionTime":"2026-02-19T15:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.964558 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.964628 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.964647 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.964676 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:55 crc kubenswrapper[4810]: I0219 15:10:55.964695 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:55Z","lastTransitionTime":"2026-02-19T15:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.068073 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.068141 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.068154 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.068189 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.068204 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:56Z","lastTransitionTime":"2026-02-19T15:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.171989 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.172048 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.172065 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.172089 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.172104 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:56Z","lastTransitionTime":"2026-02-19T15:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.275436 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.275509 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.275527 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.275558 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.275578 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:56Z","lastTransitionTime":"2026-02-19T15:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.379080 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.379187 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.379208 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.379236 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.379257 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:56Z","lastTransitionTime":"2026-02-19T15:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.417862 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 19:53:57.638221094 +0000 UTC Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.438598 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:10:56 crc kubenswrapper[4810]: E0219 15:10:56.439059 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.481934 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.481991 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.482006 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.482031 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.482048 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:56Z","lastTransitionTime":"2026-02-19T15:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.585772 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.585881 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.585907 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.585939 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.585965 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:56Z","lastTransitionTime":"2026-02-19T15:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.688426 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.688500 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.688521 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.688549 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.688567 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:56Z","lastTransitionTime":"2026-02-19T15:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.791208 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.791282 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.791303 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.791345 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.791358 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:56Z","lastTransitionTime":"2026-02-19T15:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.894122 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.894205 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.894224 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.894255 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.894280 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:56Z","lastTransitionTime":"2026-02-19T15:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.997814 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.997895 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.997916 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.997946 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:56 crc kubenswrapper[4810]: I0219 15:10:56.997965 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:56Z","lastTransitionTime":"2026-02-19T15:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:57 crc kubenswrapper[4810]: I0219 15:10:57.100463 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:57 crc kubenswrapper[4810]: I0219 15:10:57.100553 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:57 crc kubenswrapper[4810]: I0219 15:10:57.100585 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:57 crc kubenswrapper[4810]: I0219 15:10:57.100621 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:57 crc kubenswrapper[4810]: I0219 15:10:57.100644 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:57Z","lastTransitionTime":"2026-02-19T15:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 15:10:57 crc kubenswrapper[4810]: I0219 15:10:57.203725 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 15:10:57 crc kubenswrapper[4810]: I0219 15:10:57.203812 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 15:10:57 crc kubenswrapper[4810]: I0219 15:10:57.203834 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 15:10:57 crc kubenswrapper[4810]: I0219 15:10:57.203868 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 15:10:57 crc kubenswrapper[4810]: I0219 15:10:57.203896 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:57Z","lastTransitionTime":"2026-02-19T15:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Feb 19 15:10:57 crc kubenswrapper[4810]: I0219 15:10:57.418505 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 03:12:12.360329026 +0000 UTC
Feb 19 15:10:57 crc kubenswrapper[4810]: I0219 15:10:57.439758 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:10:57 crc kubenswrapper[4810]: I0219 15:10:57.439879 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 15:10:57 crc kubenswrapper[4810]: E0219 15:10:57.439977 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1"
Feb 19 15:10:57 crc kubenswrapper[4810]: I0219 15:10:57.440104 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 15:10:57 crc kubenswrapper[4810]: E0219 15:10:57.440094 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 15:10:57 crc kubenswrapper[4810]: E0219 15:10:57.440216 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 15:10:57 crc kubenswrapper[4810]: I0219 15:10:57.464781 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"]
Feb 19 15:10:58 crc kubenswrapper[4810]: I0219 15:10:58.419915 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 17:04:07.677122347 +0000 UTC
Feb 19 15:10:58 crc kubenswrapper[4810]: I0219 15:10:58.438385 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 15:10:58 crc kubenswrapper[4810]: E0219 15:10:58.438562 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 15:10:59 crc kubenswrapper[4810]: I0219 15:10:59.420817 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 15:23:05.403484802 +0000 UTC
Feb 19 15:10:59 crc kubenswrapper[4810]: I0219 15:10:59.439008 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 15:10:59 crc kubenswrapper[4810]: E0219 15:10:59.439569 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 15:10:59 crc kubenswrapper[4810]: I0219 15:10:59.439635 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:10:59 crc kubenswrapper[4810]: I0219 15:10:59.439661 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 15:10:59 crc kubenswrapper[4810]: E0219 15:10:59.440061 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 15:10:59 crc kubenswrapper[4810]: E0219 15:10:59.440076 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1"
Feb 19 15:10:59 crc kubenswrapper[4810]: I0219 15:10:59.914161 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 15:10:59 crc kubenswrapper[4810]: I0219 15:10:59.914241 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 15:10:59 crc kubenswrapper[4810]: I0219 15:10:59.914260 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 15:10:59 crc kubenswrapper[4810]: I0219 15:10:59.914292 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 15:10:59 crc kubenswrapper[4810]: I0219 15:10:59.914313 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T15:10:59Z","lastTransitionTime":"2026-02-19T15:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 15:10:59 crc kubenswrapper[4810]: I0219 15:10:59.981212 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8wfc"]
Feb 19 15:10:59 crc kubenswrapper[4810]: I0219 15:10:59.981996 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8wfc"
Feb 19 15:10:59 crc kubenswrapper[4810]: I0219 15:10:59.984699 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 19 15:10:59 crc kubenswrapper[4810]: I0219 15:10:59.984813 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 19 15:10:59 crc kubenswrapper[4810]: I0219 15:10:59.984873 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Feb 19 15:10:59 crc kubenswrapper[4810]: I0219 15:10:59.986456 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.028304 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e566cb6c-c157-4186-91e6-a0474949d42e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-r8wfc\" (UID: \"e566cb6c-c157-4186-91e6-a0474949d42e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8wfc"
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.028483 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e566cb6c-c157-4186-91e6-a0474949d42e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-r8wfc\" (UID: \"e566cb6c-c157-4186-91e6-a0474949d42e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8wfc"
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.028521 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e566cb6c-c157-4186-91e6-a0474949d42e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-r8wfc\" (UID: \"e566cb6c-c157-4186-91e6-a0474949d42e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8wfc"
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.028593 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e566cb6c-c157-4186-91e6-a0474949d42e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-r8wfc\" (UID: \"e566cb6c-c157-4186-91e6-a0474949d42e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8wfc"
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.028627 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e566cb6c-c157-4186-91e6-a0474949d42e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-r8wfc\" (UID: \"e566cb6c-c157-4186-91e6-a0474949d42e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8wfc"
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.040739 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=68.040698574 podStartE2EDuration="1m8.040698574s" podCreationTimestamp="2026-02-19 15:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:00.011478881 +0000 UTC m=+89.493509045" watchObservedRunningTime="2026-02-19 15:11:00.040698574 +0000 UTC m=+89.522728748"
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.123652 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-flbx5" podStartSLOduration=68.123626605 podStartE2EDuration="1m8.123626605s" podCreationTimestamp="2026-02-19 15:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:00.08540758 +0000 UTC m=+89.567437704" watchObservedRunningTime="2026-02-19 15:11:00.123626605 +0000 UTC m=+89.605656739"
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.129402 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e566cb6c-c157-4186-91e6-a0474949d42e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-r8wfc\" (UID: \"e566cb6c-c157-4186-91e6-a0474949d42e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8wfc"
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.129469 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e566cb6c-c157-4186-91e6-a0474949d42e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-r8wfc\" (UID: \"e566cb6c-c157-4186-91e6-a0474949d42e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8wfc"
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.129541 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e566cb6c-c157-4186-91e6-a0474949d42e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-r8wfc\" (UID: \"e566cb6c-c157-4186-91e6-a0474949d42e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8wfc"
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.129572 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e566cb6c-c157-4186-91e6-a0474949d42e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-r8wfc\" (UID: \"e566cb6c-c157-4186-91e6-a0474949d42e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8wfc"
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.129614 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e566cb6c-c157-4186-91e6-a0474949d42e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-r8wfc\" (UID: \"e566cb6c-c157-4186-91e6-a0474949d42e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8wfc"
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.129662 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e566cb6c-c157-4186-91e6-a0474949d42e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-r8wfc\" (UID: \"e566cb6c-c157-4186-91e6-a0474949d42e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8wfc"
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.129737 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e566cb6c-c157-4186-91e6-a0474949d42e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-r8wfc\" (UID: \"e566cb6c-c157-4186-91e6-a0474949d42e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8wfc"
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.130456 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e566cb6c-c157-4186-91e6-a0474949d42e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-r8wfc\" (UID: \"e566cb6c-c157-4186-91e6-a0474949d42e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8wfc"
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.139717 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e566cb6c-c157-4186-91e6-a0474949d42e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-r8wfc\" (UID: \"e566cb6c-c157-4186-91e6-a0474949d42e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8wfc"
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.150749 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e566cb6c-c157-4186-91e6-a0474949d42e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-r8wfc\" (UID: \"e566cb6c-c157-4186-91e6-a0474949d42e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8wfc"
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.190789 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bsztz" podStartSLOduration=68.190757505 podStartE2EDuration="1m8.190757505s" podCreationTimestamp="2026-02-19 15:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:00.177469127 +0000 UTC m=+89.659499281" watchObservedRunningTime="2026-02-19 15:11:00.190757505 +0000 UTC m=+89.672787659"
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.222895 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=3.222868839 podStartE2EDuration="3.222868839s" podCreationTimestamp="2026-02-19 15:10:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:00.220638694 +0000 UTC m=+89.702668838" watchObservedRunningTime="2026-02-19 15:11:00.222868839 +0000 UTC m=+89.704898973"
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.223589 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=23.223578717 podStartE2EDuration="23.223578717s" podCreationTimestamp="2026-02-19 15:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:00.191279928 +0000 UTC m=+89.673310092" watchObservedRunningTime="2026-02-19 15:11:00.223578717 +0000 UTC m=+89.705608851"
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.250289 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=68.250263327 podStartE2EDuration="1m8.250263327s" podCreationTimestamp="2026-02-19 15:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:00.249298223 +0000 UTC m=+89.731328377" watchObservedRunningTime="2026-02-19 15:11:00.250263327 +0000 UTC m=+89.732293461"
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.295586 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-nmbsx" podStartSLOduration=67.295553877 podStartE2EDuration="1m7.295553877s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:00.293957338 +0000 UTC m=+89.775987472" watchObservedRunningTime="2026-02-19 15:11:00.295553877 +0000 UTC m=+89.777584021"
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.299698 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8wfc"
Feb 19 15:11:00 crc kubenswrapper[4810]: W0219 15:11:00.329864 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode566cb6c_c157_4186_91e6_a0474949d42e.slice/crio-7dae6c7b79931e41fae8b7e491d59a8e9eeeaa3cfdbd5c988f0ccc51e00792fc WatchSource:0}: Error finding container 7dae6c7b79931e41fae8b7e491d59a8e9eeeaa3cfdbd5c988f0ccc51e00792fc: Status 404 returned error can't find the container with id 7dae6c7b79931e41fae8b7e491d59a8e9eeeaa3cfdbd5c988f0ccc51e00792fc
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.347181 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=42.347154383 podStartE2EDuration="42.347154383s" podCreationTimestamp="2026-02-19 15:10:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:00.346131258 +0000 UTC m=+89.828161392" watchObservedRunningTime="2026-02-19 15:11:00.347154383 +0000 UTC m=+89.829184517"
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.382881 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podStartSLOduration=68.382861066 podStartE2EDuration="1m8.382861066s" podCreationTimestamp="2026-02-19 15:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:00.382344283 +0000 UTC m=+89.864374437" watchObservedRunningTime="2026-02-19 15:11:00.382861066 +0000 UTC m=+89.864891200"
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.398718 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-t9jnq" podStartSLOduration=68.398698248 podStartE2EDuration="1m8.398698248s" podCreationTimestamp="2026-02-19 15:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:00.397852627 +0000 UTC m=+89.879882751" watchObservedRunningTime="2026-02-19 15:11:00.398698248 +0000 UTC m=+89.880728372"
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.421035 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 12:54:59.615872567 +0000 UTC
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.421112 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.427817 4810 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Feb 19 15:11:00 crc kubenswrapper[4810]: I0219 15:11:00.438426 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 15:11:00 crc kubenswrapper[4810]: E0219 15:11:00.438675 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 15:11:01 crc kubenswrapper[4810]: I0219 15:11:01.063577 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8wfc" event={"ID":"e566cb6c-c157-4186-91e6-a0474949d42e","Type":"ContainerStarted","Data":"482b0b1df225989bc6d0b54aa328c18be90fb10c12be5eb9f4dc2a9771aa5312"}
Feb 19 15:11:01 crc kubenswrapper[4810]: I0219 15:11:01.063964 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8wfc" event={"ID":"e566cb6c-c157-4186-91e6-a0474949d42e","Type":"ContainerStarted","Data":"7dae6c7b79931e41fae8b7e491d59a8e9eeeaa3cfdbd5c988f0ccc51e00792fc"}
Feb 19 15:11:01 crc kubenswrapper[4810]: I0219 15:11:01.083906 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l2g4t" podStartSLOduration=68.083867492 podStartE2EDuration="1m8.083867492s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:00.414859628 +0000 UTC m=+89.896889752" watchObservedRunningTime="2026-02-19 15:11:01.083867492 +0000 UTC m=+90.565897626"
Feb 19 15:11:01 crc kubenswrapper[4810]: I0219 15:11:01.084098 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r8wfc" podStartSLOduration=69.084092447 podStartE2EDuration="1m9.084092447s" podCreationTimestamp="2026-02-19 15:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:01.083618975 +0000 UTC m=+90.565649139" watchObservedRunningTime="2026-02-19 15:11:01.084092447 +0000 UTC m=+90.566122581"
Feb 19 15:11:01 crc kubenswrapper[4810]: I0219 15:11:01.438901 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:11:01 crc kubenswrapper[4810]: I0219 15:11:01.439031 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 15:11:01 crc kubenswrapper[4810]: I0219 15:11:01.438912 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 15:11:01 crc kubenswrapper[4810]: E0219 15:11:01.441175 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1"
Feb 19 15:11:01 crc kubenswrapper[4810]: E0219 15:11:01.441397 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 15:11:01 crc kubenswrapper[4810]: E0219 15:11:01.441571 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 15:11:02 crc kubenswrapper[4810]: I0219 15:11:02.438456 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 15:11:02 crc kubenswrapper[4810]: E0219 15:11:02.439010 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 15:11:03 crc kubenswrapper[4810]: I0219 15:11:03.438733 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:11:03 crc kubenswrapper[4810]: I0219 15:11:03.438832 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 15:11:03 crc kubenswrapper[4810]: I0219 15:11:03.438844 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 15:11:03 crc kubenswrapper[4810]: E0219 15:11:03.439173 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1"
Feb 19 15:11:03 crc kubenswrapper[4810]: E0219 15:11:03.439472 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 15:11:03 crc kubenswrapper[4810]: E0219 15:11:03.439734 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 15:11:04 crc kubenswrapper[4810]: I0219 15:11:04.438948 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 15:11:04 crc kubenswrapper[4810]: E0219 15:11:04.439115 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 15:11:05 crc kubenswrapper[4810]: I0219 15:11:05.438970 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 15:11:05 crc kubenswrapper[4810]: I0219 15:11:05.438996 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:11:05 crc kubenswrapper[4810]: I0219 15:11:05.439376 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 15:11:05 crc kubenswrapper[4810]: E0219 15:11:05.439678 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 15:11:05 crc kubenswrapper[4810]: E0219 15:11:05.439839 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1"
Feb 19 15:11:05 crc kubenswrapper[4810]: E0219 15:11:05.440100 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 15:11:06 crc kubenswrapper[4810]: I0219 15:11:06.438270 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 15:11:06 crc kubenswrapper[4810]: E0219 15:11:06.438496 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 15:11:07 crc kubenswrapper[4810]: I0219 15:11:07.438999 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 15:11:07 crc kubenswrapper[4810]: I0219 15:11:07.439051 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 15:11:07 crc kubenswrapper[4810]: I0219 15:11:07.438990 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:11:07 crc kubenswrapper[4810]: E0219 15:11:07.439194 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 15:11:07 crc kubenswrapper[4810]: E0219 15:11:07.439791 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1"
Feb 19 15:11:07 crc kubenswrapper[4810]: E0219 15:11:07.439901 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:11:07 crc kubenswrapper[4810]: I0219 15:11:07.440341 4810 scope.go:117] "RemoveContainer" containerID="824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4" Feb 19 15:11:07 crc kubenswrapper[4810]: E0219 15:11:07.440580 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8k7p5_openshift-ovn-kubernetes(c5a8a15c-53e8-4868-8feb-dcd4e83939a4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" Feb 19 15:11:08 crc kubenswrapper[4810]: I0219 15:11:08.438814 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:11:08 crc kubenswrapper[4810]: E0219 15:11:08.439107 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:11:09 crc kubenswrapper[4810]: I0219 15:11:09.438599 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:11:09 crc kubenswrapper[4810]: I0219 15:11:09.438708 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:11:09 crc kubenswrapper[4810]: I0219 15:11:09.438612 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:11:09 crc kubenswrapper[4810]: E0219 15:11:09.438795 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:11:09 crc kubenswrapper[4810]: E0219 15:11:09.438947 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:11:09 crc kubenswrapper[4810]: E0219 15:11:09.439143 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1" Feb 19 15:11:10 crc kubenswrapper[4810]: I0219 15:11:10.439484 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:11:10 crc kubenswrapper[4810]: E0219 15:11:10.439786 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:11:11 crc kubenswrapper[4810]: I0219 15:11:11.439286 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:11:11 crc kubenswrapper[4810]: I0219 15:11:11.439300 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:11:11 crc kubenswrapper[4810]: I0219 15:11:11.439113 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:11:11 crc kubenswrapper[4810]: E0219 15:11:11.441800 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:11:11 crc kubenswrapper[4810]: E0219 15:11:11.442169 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:11:11 crc kubenswrapper[4810]: E0219 15:11:11.442218 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1" Feb 19 15:11:11 crc kubenswrapper[4810]: I0219 15:11:11.564576 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b72d3f7a-e418-4a21-af73-6a43ce3358c1-metrics-certs\") pod \"network-metrics-daemon-2x9v9\" (UID: \"b72d3f7a-e418-4a21-af73-6a43ce3358c1\") " pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:11:11 crc kubenswrapper[4810]: E0219 15:11:11.564739 4810 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 15:11:11 crc kubenswrapper[4810]: E0219 15:11:11.564794 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b72d3f7a-e418-4a21-af73-6a43ce3358c1-metrics-certs podName:b72d3f7a-e418-4a21-af73-6a43ce3358c1 nodeName:}" failed. No retries permitted until 2026-02-19 15:12:15.564775924 +0000 UTC m=+165.046806048 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b72d3f7a-e418-4a21-af73-6a43ce3358c1-metrics-certs") pod "network-metrics-daemon-2x9v9" (UID: "b72d3f7a-e418-4a21-af73-6a43ce3358c1") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 15:11:12 crc kubenswrapper[4810]: I0219 15:11:12.438615 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:11:12 crc kubenswrapper[4810]: E0219 15:11:12.438812 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:11:13 crc kubenswrapper[4810]: I0219 15:11:13.438877 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:11:13 crc kubenswrapper[4810]: I0219 15:11:13.439073 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:11:13 crc kubenswrapper[4810]: I0219 15:11:13.439119 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:11:13 crc kubenswrapper[4810]: E0219 15:11:13.439409 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1" Feb 19 15:11:13 crc kubenswrapper[4810]: E0219 15:11:13.439591 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:11:13 crc kubenswrapper[4810]: E0219 15:11:13.439864 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:11:14 crc kubenswrapper[4810]: I0219 15:11:14.438685 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:11:14 crc kubenswrapper[4810]: E0219 15:11:14.438895 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:11:15 crc kubenswrapper[4810]: I0219 15:11:15.438758 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:11:15 crc kubenswrapper[4810]: I0219 15:11:15.438869 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:11:15 crc kubenswrapper[4810]: I0219 15:11:15.438990 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:11:15 crc kubenswrapper[4810]: E0219 15:11:15.439053 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1" Feb 19 15:11:15 crc kubenswrapper[4810]: E0219 15:11:15.439251 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:11:15 crc kubenswrapper[4810]: E0219 15:11:15.439441 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:11:16 crc kubenswrapper[4810]: I0219 15:11:16.438598 4810 util.go:30] "No sandbox for pod can be found. 
Feb 19 15:11:16 crc kubenswrapper[4810]: E0219 15:11:16.438817 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 15:11:17 crc kubenswrapper[4810]: I0219 15:11:17.438879 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:11:17 crc kubenswrapper[4810]: I0219 15:11:17.438936 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 15:11:17 crc kubenswrapper[4810]: E0219 15:11:17.439157 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1"
Feb 19 15:11:17 crc kubenswrapper[4810]: E0219 15:11:17.439251 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 15:11:17 crc kubenswrapper[4810]: I0219 15:11:17.439630 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 15:11:17 crc kubenswrapper[4810]: E0219 15:11:17.439886 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 15:11:18 crc kubenswrapper[4810]: I0219 15:11:18.438447 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 15:11:18 crc kubenswrapper[4810]: E0219 15:11:18.438673 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 15:11:19 crc kubenswrapper[4810]: I0219 15:11:19.439383 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 15:11:19 crc kubenswrapper[4810]: I0219 15:11:19.439426 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 15:11:19 crc kubenswrapper[4810]: E0219 15:11:19.439607 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 15:11:19 crc kubenswrapper[4810]: I0219 15:11:19.439646 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:11:19 crc kubenswrapper[4810]: E0219 15:11:19.439959 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1"
Feb 19 15:11:19 crc kubenswrapper[4810]: E0219 15:11:19.440461 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 15:11:20 crc kubenswrapper[4810]: I0219 15:11:20.438606 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 15:11:20 crc kubenswrapper[4810]: E0219 15:11:20.438784 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 15:11:21 crc kubenswrapper[4810]: I0219 15:11:21.438526 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 15:11:21 crc kubenswrapper[4810]: I0219 15:11:21.438666 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:11:21 crc kubenswrapper[4810]: I0219 15:11:21.438542 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 15:11:21 crc kubenswrapper[4810]: E0219 15:11:21.438817 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 15:11:21 crc kubenswrapper[4810]: E0219 15:11:21.438995 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 15:11:21 crc kubenswrapper[4810]: E0219 15:11:21.439159 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1"
Feb 19 15:11:22 crc kubenswrapper[4810]: I0219 15:11:22.438859 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 15:11:22 crc kubenswrapper[4810]: E0219 15:11:22.439521 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 15:11:22 crc kubenswrapper[4810]: I0219 15:11:22.439919 4810 scope.go:117] "RemoveContainer" containerID="824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4"
Feb 19 15:11:22 crc kubenswrapper[4810]: E0219 15:11:22.440153 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8k7p5_openshift-ovn-kubernetes(c5a8a15c-53e8-4868-8feb-dcd4e83939a4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4"
Feb 19 15:11:23 crc kubenswrapper[4810]: I0219 15:11:23.439123 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 15:11:23 crc kubenswrapper[4810]: I0219 15:11:23.439208 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 15:11:23 crc kubenswrapper[4810]: E0219 15:11:23.439350 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 15:11:23 crc kubenswrapper[4810]: I0219 15:11:23.439413 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:11:23 crc kubenswrapper[4810]: E0219 15:11:23.439500 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 15:11:23 crc kubenswrapper[4810]: E0219 15:11:23.439586 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1"
Feb 19 15:11:24 crc kubenswrapper[4810]: I0219 15:11:24.439212 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 15:11:24 crc kubenswrapper[4810]: E0219 15:11:24.439973 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 15:11:25 crc kubenswrapper[4810]: I0219 15:11:25.438941 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 15:11:25 crc kubenswrapper[4810]: I0219 15:11:25.438982 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 15:11:25 crc kubenswrapper[4810]: I0219 15:11:25.439209 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:11:25 crc kubenswrapper[4810]: E0219 15:11:25.439362 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 15:11:25 crc kubenswrapper[4810]: E0219 15:11:25.439752 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1"
pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1" Feb 19 15:11:25 crc kubenswrapper[4810]: E0219 15:11:25.439979 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:11:26 crc kubenswrapper[4810]: I0219 15:11:26.438818 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:11:26 crc kubenswrapper[4810]: E0219 15:11:26.439027 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:11:27 crc kubenswrapper[4810]: I0219 15:11:27.439093 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:11:27 crc kubenswrapper[4810]: I0219 15:11:27.439180 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:11:27 crc kubenswrapper[4810]: I0219 15:11:27.439111 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:11:27 crc kubenswrapper[4810]: E0219 15:11:27.439374 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1" Feb 19 15:11:27 crc kubenswrapper[4810]: E0219 15:11:27.439510 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:11:27 crc kubenswrapper[4810]: E0219 15:11:27.439714 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:11:28 crc kubenswrapper[4810]: I0219 15:11:28.438695 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:11:28 crc kubenswrapper[4810]: E0219 15:11:28.439018 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:11:29 crc kubenswrapper[4810]: I0219 15:11:29.177770 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bsztz_2a45a199-beeb-4972-b796-15c958fe99d3/kube-multus/1.log" Feb 19 15:11:29 crc kubenswrapper[4810]: I0219 15:11:29.178433 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bsztz_2a45a199-beeb-4972-b796-15c958fe99d3/kube-multus/0.log" Feb 19 15:11:29 crc kubenswrapper[4810]: I0219 15:11:29.178492 4810 generic.go:334] "Generic (PLEG): container finished" podID="2a45a199-beeb-4972-b796-15c958fe99d3" containerID="c1f575901c6e6174387489f4f3a7d8cfa46089db51556b6acc9245faaf36a9ca" exitCode=1 Feb 19 15:11:29 crc kubenswrapper[4810]: I0219 15:11:29.178533 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bsztz" event={"ID":"2a45a199-beeb-4972-b796-15c958fe99d3","Type":"ContainerDied","Data":"c1f575901c6e6174387489f4f3a7d8cfa46089db51556b6acc9245faaf36a9ca"} Feb 19 15:11:29 crc kubenswrapper[4810]: I0219 15:11:29.178586 4810 scope.go:117] "RemoveContainer" containerID="a674e283274f1e77f6ac89a8170137bf5852153bd4a35a80101a3cd0090b85d2" Feb 19 15:11:29 crc kubenswrapper[4810]: I0219 15:11:29.179115 4810 scope.go:117] "RemoveContainer" containerID="c1f575901c6e6174387489f4f3a7d8cfa46089db51556b6acc9245faaf36a9ca" Feb 19 15:11:29 crc kubenswrapper[4810]: E0219 15:11:29.179355 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-bsztz_openshift-multus(2a45a199-beeb-4972-b796-15c958fe99d3)\"" pod="openshift-multus/multus-bsztz" podUID="2a45a199-beeb-4972-b796-15c958fe99d3" Feb 19 15:11:29 crc kubenswrapper[4810]: I0219 15:11:29.438832 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:11:29 crc kubenswrapper[4810]: I0219 15:11:29.438869 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:11:29 crc kubenswrapper[4810]: I0219 15:11:29.438933 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:11:29 crc kubenswrapper[4810]: E0219 15:11:29.439012 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:11:29 crc kubenswrapper[4810]: E0219 15:11:29.439195 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 15:11:29 crc kubenswrapper[4810]: E0219 15:11:29.439403 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1" Feb 19 15:11:30 crc kubenswrapper[4810]: I0219 15:11:30.184456 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bsztz_2a45a199-beeb-4972-b796-15c958fe99d3/kube-multus/1.log" Feb 19 15:11:30 crc kubenswrapper[4810]: I0219 15:11:30.439021 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:11:30 crc kubenswrapper[4810]: E0219 15:11:30.439204 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:11:31 crc kubenswrapper[4810]: E0219 15:11:31.389370 4810 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 19 15:11:31 crc kubenswrapper[4810]: I0219 15:11:31.439199 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:11:31 crc kubenswrapper[4810]: I0219 15:11:31.439238 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:11:31 crc kubenswrapper[4810]: I0219 15:11:31.439312 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:11:31 crc kubenswrapper[4810]: E0219 15:11:31.441269 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 15:11:31 crc kubenswrapper[4810]: E0219 15:11:31.441470 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Feb 19 15:11:31 crc kubenswrapper[4810]: E0219 15:11:31.441619 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 15:11:31 crc kubenswrapper[4810]: E0219 15:11:31.587533 4810 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 19 15:11:32 crc kubenswrapper[4810]: I0219 15:11:32.439187 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 15:11:32 crc kubenswrapper[4810]: E0219 15:11:32.439485 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 15:11:33 crc kubenswrapper[4810]: I0219 15:11:33.439247 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:11:33 crc kubenswrapper[4810]: I0219 15:11:33.439393 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 15:11:33 crc kubenswrapper[4810]: E0219 15:11:33.439632 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1"
Feb 19 15:11:33 crc kubenswrapper[4810]: E0219 15:11:33.439761 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 15:11:33 crc kubenswrapper[4810]: I0219 15:11:33.440213 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 15:11:33 crc kubenswrapper[4810]: E0219 15:11:33.440450 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 15:11:34 crc kubenswrapper[4810]: I0219 15:11:34.438743 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 15:11:34 crc kubenswrapper[4810]: E0219 15:11:34.439271 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 15:11:35 crc kubenswrapper[4810]: I0219 15:11:35.439109 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:11:35 crc kubenswrapper[4810]: E0219 15:11:35.439316 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1"
Feb 19 15:11:35 crc kubenswrapper[4810]: I0219 15:11:35.440516 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 15:11:35 crc kubenswrapper[4810]: E0219 15:11:35.440628 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 15:11:35 crc kubenswrapper[4810]: I0219 15:11:35.440809 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 15:11:35 crc kubenswrapper[4810]: E0219 15:11:35.440912 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 15:11:35 crc kubenswrapper[4810]: I0219 15:11:35.442473 4810 scope.go:117] "RemoveContainer" containerID="824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4"
Feb 19 15:11:36 crc kubenswrapper[4810]: I0219 15:11:36.211632 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8k7p5_c5a8a15c-53e8-4868-8feb-dcd4e83939a4/ovnkube-controller/3.log"
Feb 19 15:11:36 crc kubenswrapper[4810]: I0219 15:11:36.214920 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" event={"ID":"c5a8a15c-53e8-4868-8feb-dcd4e83939a4","Type":"ContainerStarted","Data":"07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b"}
Feb 19 15:11:36 crc kubenswrapper[4810]: I0219 15:11:36.215401 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5"
Feb 19 15:11:36 crc kubenswrapper[4810]: I0219 15:11:36.257678 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" podStartSLOduration=103.257658128 podStartE2EDuration="1m43.257658128s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:36.256317625 +0000 UTC m=+125.738347749" watchObservedRunningTime="2026-02-19 15:11:36.257658128 +0000 UTC m=+125.739688252"
Feb 19 15:11:36 crc kubenswrapper[4810]: I0219 15:11:36.427415 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2x9v9"]
Feb 19 15:11:36 crc kubenswrapper[4810]: I0219 15:11:36.427567 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:11:36 crc kubenswrapper[4810]: E0219 15:11:36.427692 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1"
Feb 19 15:11:36 crc kubenswrapper[4810]: I0219 15:11:36.438802 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 15:11:36 crc kubenswrapper[4810]: E0219 15:11:36.439934 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 15:11:36 crc kubenswrapper[4810]: E0219 15:11:36.589248 4810 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 19 15:11:37 crc kubenswrapper[4810]: I0219 15:11:37.439406 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 15:11:37 crc kubenswrapper[4810]: I0219 15:11:37.439559 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 15:11:37 crc kubenswrapper[4810]: E0219 15:11:37.439620 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 15:11:37 crc kubenswrapper[4810]: E0219 15:11:37.439786 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 15:11:38 crc kubenswrapper[4810]: I0219 15:11:38.438597 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:11:38 crc kubenswrapper[4810]: I0219 15:11:38.438622 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 15:11:38 crc kubenswrapper[4810]: E0219 15:11:38.438863 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1"
Feb 19 15:11:38 crc kubenswrapper[4810]: E0219 15:11:38.439064 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 15:11:39 crc kubenswrapper[4810]: I0219 15:11:39.438603 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 15:11:39 crc kubenswrapper[4810]: E0219 15:11:39.438778 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 15:11:39 crc kubenswrapper[4810]: I0219 15:11:39.438801 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 15:11:39 crc kubenswrapper[4810]: E0219 15:11:39.439250 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 15:11:40 crc kubenswrapper[4810]: I0219 15:11:40.439033 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 15:11:40 crc kubenswrapper[4810]: I0219 15:11:40.439124 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:11:40 crc kubenswrapper[4810]: E0219 15:11:40.439470 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1"
Feb 19 15:11:40 crc kubenswrapper[4810]: I0219 15:11:40.439593 4810 scope.go:117] "RemoveContainer" containerID="c1f575901c6e6174387489f4f3a7d8cfa46089db51556b6acc9245faaf36a9ca"
Feb 19 15:11:40 crc kubenswrapper[4810]: E0219 15:11:40.439654 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 15:11:41 crc kubenswrapper[4810]: I0219 15:11:41.239126 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bsztz_2a45a199-beeb-4972-b796-15c958fe99d3/kube-multus/1.log"
Feb 19 15:11:41 crc kubenswrapper[4810]: I0219 15:11:41.239636 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bsztz" event={"ID":"2a45a199-beeb-4972-b796-15c958fe99d3","Type":"ContainerStarted","Data":"92e30886e49380d7d876397116f7db4e85388275b36d2f8ee0ab84b9167f3dde"}
Feb 19 15:11:41 crc kubenswrapper[4810]: I0219 15:11:41.438878 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 15:11:41 crc kubenswrapper[4810]: E0219 15:11:41.440472 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 15:11:41 crc kubenswrapper[4810]: I0219 15:11:41.440557 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 15:11:41 crc kubenswrapper[4810]: E0219 15:11:41.440708 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 15:11:41 crc kubenswrapper[4810]: E0219 15:11:41.589905 4810 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 19 15:11:42 crc kubenswrapper[4810]: I0219 15:11:42.438587 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:11:42 crc kubenswrapper[4810]: I0219 15:11:42.438645 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 15:11:42 crc kubenswrapper[4810]: E0219 15:11:42.438790 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1"
Feb 19 15:11:42 crc kubenswrapper[4810]: E0219 15:11:42.438914 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 15:11:43 crc kubenswrapper[4810]: I0219 15:11:43.439048 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 15:11:43 crc kubenswrapper[4810]: E0219 15:11:43.439415 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 15:11:43 crc kubenswrapper[4810]: I0219 15:11:43.439501 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 15:11:43 crc kubenswrapper[4810]: E0219 15:11:43.439657 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 15:11:44 crc kubenswrapper[4810]: I0219 15:11:44.438300 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:11:44 crc kubenswrapper[4810]: I0219 15:11:44.438368 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 15:11:44 crc kubenswrapper[4810]: E0219 15:11:44.438506 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1"
Feb 19 15:11:44 crc kubenswrapper[4810]: E0219 15:11:44.438684 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 15:11:45 crc kubenswrapper[4810]: I0219 15:11:45.438502 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 15:11:45 crc kubenswrapper[4810]: I0219 15:11:45.438512 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 15:11:45 crc kubenswrapper[4810]: E0219 15:11:45.438744 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 15:11:45 crc kubenswrapper[4810]: E0219 15:11:45.438813 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 15:11:46 crc kubenswrapper[4810]: I0219 15:11:46.439013 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 15:11:46 crc kubenswrapper[4810]: I0219 15:11:46.439092 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:11:46 crc kubenswrapper[4810]: E0219 15:11:46.439240 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 15:11:46 crc kubenswrapper[4810]: E0219 15:11:46.439663 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x9v9" podUID="b72d3f7a-e418-4a21-af73-6a43ce3358c1" Feb 19 15:11:47 crc kubenswrapper[4810]: I0219 15:11:47.438846 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:11:47 crc kubenswrapper[4810]: I0219 15:11:47.439064 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:11:47 crc kubenswrapper[4810]: I0219 15:11:47.443966 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 19 15:11:47 crc kubenswrapper[4810]: I0219 15:11:47.443991 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 19 15:11:47 crc kubenswrapper[4810]: I0219 15:11:47.444075 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 19 15:11:47 crc kubenswrapper[4810]: I0219 15:11:47.444176 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 19 15:11:48 crc kubenswrapper[4810]: I0219 15:11:48.438602 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:11:48 crc kubenswrapper[4810]: I0219 15:11:48.438671 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9" Feb 19 15:11:48 crc kubenswrapper[4810]: I0219 15:11:48.442464 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 19 15:11:48 crc kubenswrapper[4810]: I0219 15:11:48.443119 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.124943 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.172626 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5d4rp"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.173261 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-5d4rp" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.175615 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.175797 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.176563 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.176672 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.176929 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.177069 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.178264 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.178528 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.183881 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.183894 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.196178 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-l66cb"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.196802 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-l66cb" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.198596 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.201490 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.201909 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.202144 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.202432 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.202560 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sl5p9"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.202928 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.203576 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.203901 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.204737 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-8mxdc"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.205121 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8mxdc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.206463 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.206637 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.206858 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.207763 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.207919 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.208231 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.209440 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55x9c"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.209765 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55x9c" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.211309 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jm2nk"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.211674 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jm2nk" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.214250 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.216297 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.216493 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.216678 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.216808 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.216855 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.217028 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.217067 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhzlb"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.217335 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wpzzq"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.217636 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-wpzzq" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.217878 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.218068 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhzlb" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.218517 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.218744 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kclsk"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.218835 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.219639 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kclsk" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.220774 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xhg7l"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.220831 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.221072 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xhg7l" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.221173 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.221458 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.221492 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.221558 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.221648 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.221653 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.221919 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.222260 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.222383 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.222496 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.222579 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.222706 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.222745 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.222857 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.222902 4810 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.222979 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.230917 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-d8pqg"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.231358 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-d8pqg" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.249974 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.267029 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.267647 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.267958 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.268252 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.268599 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.268831 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.268964 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-audit-dir\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.269000 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-node-pullsecrets\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.269019 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-config\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.269034 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-image-import-ca\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp" 
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.269078 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-etcd-client\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.269095 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-serving-cert\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.269108 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.269121 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-encryption-config\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.269144 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-etcd-serving-ca\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.269158 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-audit\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.269170 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2zrj\" (UniqueName: \"kubernetes.io/projected/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-kube-api-access-j2zrj\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.270759 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.270908 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.271115 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.271307 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.271422 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.271811 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.271827 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.273411 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.273636 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.273787 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.274064 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.276458 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.276499 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.276770 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.276883 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.277064 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.278713 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.279620 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.294730 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-dkppn"]
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.295076 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-4hddt"]
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.295341 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.295403 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-hvw7f"]
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.295495 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4hddt"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.295551 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-dkppn"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.297344 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l924n"]
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.297581 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8j2gh"]
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.297806 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-zrsn2"]
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.298135 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kmhmh"]
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.298486 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r74mv"]
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.298716 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-f4qrf"]
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.299038 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2phqk"]
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.299294 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2nn8b"]
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.299696 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2nn8b"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.299960 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-hvw7f"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.300118 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lqqd"]
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.300201 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8j2gh"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.300361 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.300880 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lqqd"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.300180 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l924n"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.302068 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-zrsn2"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.302247 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.302284 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kmhmh"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.302559 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.302629 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r74mv"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.302989 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.303062 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f4qrf"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.303096 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.303139 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.303239 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2phqk"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.303271 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.303291 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.303461 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.303518 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.303525 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.303541 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.303649 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.303697 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.303721 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.303727 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.303741 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.303812 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.303820 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.303902 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.303834 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.303914 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.331680 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hmxtq"]
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.332560 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-25cmt"]
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.333249 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-25cmt"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.333580 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hmxtq"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.334200 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-65ltw"]
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.344679 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.345685 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.345913 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.346180 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.346293 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.388690 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.389590 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-65ltw"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.390258 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.396823 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.397958 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c696fe96-0485-44d0-b4fb-161503c334e8-serving-cert\") pod \"apiserver-7bbb656c7d-q6zwc\" (UID: \"c696fe96-0485-44d0-b4fb-161503c334e8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.397400 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-895xv"]
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.398771 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.399096 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.399345 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/77544608-a940-4c0c-9a1a-a5a98f480134-auth-proxy-config\") pod \"machine-approver-56656f9798-8mxdc\" (UID: \"77544608-a940-4c0c-9a1a-a5a98f480134\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8mxdc"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.399437 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-895xv"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.399583 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4sn5\" (UniqueName: \"kubernetes.io/projected/fee373cf-50b8-42f4-b30d-4a3d230ca27e-kube-api-access-v4sn5\") pod \"etcd-operator-b45778765-d8pqg\" (UID: \"fee373cf-50b8-42f4-b30d-4a3d230ca27e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d8pqg"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.399743 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-etcd-serving-ca\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.399851 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-encryption-config\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.399935 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2ff8\" (UniqueName: \"kubernetes.io/projected/e8c9b478-4884-4c32-acf1-5fdec0cfac06-kube-api-access-v2ff8\") pod \"dns-operator-744455d44c-wpzzq\" (UID: \"e8c9b478-4884-4c32-acf1-5fdec0cfac06\") " pod="openshift-dns-operator/dns-operator-744455d44c-wpzzq"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.400206 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-audit\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.400247 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.400272 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/362cd55c-b576-44bd-843c-078bf26b3b1e-console-config\") pod \"console-f9d7485db-4hddt\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " pod="openshift-console/console-f9d7485db-4hddt"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.400368 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0e4ea5aa-2074-4100-a916-6bdfb3331d43-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xhg7l\" (UID: \"0e4ea5aa-2074-4100-a916-6bdfb3331d43\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xhg7l"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.400393 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l478x\" (UniqueName: \"kubernetes.io/projected/ad43df2c-4944-45e2-919f-0c297f4092d4-kube-api-access-l478x\") pod \"route-controller-manager-6576b87f9c-n8zth\" (UID: \"ad43df2c-4944-45e2-919f-0c297f4092d4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.400801 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.401356 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-etcd-serving-ca\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.402379 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-audit\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.402489 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mkt5x"]
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.402736 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.402986 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sl5p9\" (UID: \"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.403045 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/fee373cf-50b8-42f4-b30d-4a3d230ca27e-etcd-service-ca\") pod \"etcd-operator-b45778765-d8pqg\" (UID: \"fee373cf-50b8-42f4-b30d-4a3d230ca27e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d8pqg"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.403066 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f4b9328-0efe-42a8-9a73-a80eb6a26151-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-kclsk\" (UID: \"8f4b9328-0efe-42a8-9a73-a80eb6a26151\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kclsk"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.403100 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hxq2\" (UniqueName: \"kubernetes.io/projected/c696fe96-0485-44d0-b4fb-161503c334e8-kube-api-access-9hxq2\") pod \"apiserver-7bbb656c7d-q6zwc\" (UID: \"c696fe96-0485-44d0-b4fb-161503c334e8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.403441 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mkt5x"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.403537 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7bnq2"]
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.403570 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/362cd55c-b576-44bd-843c-078bf26b3b1e-console-oauth-config\") pod \"console-f9d7485db-4hddt\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " pod="openshift-console/console-f9d7485db-4hddt"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.403837 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/239280e2-b335-4f87-89a8-00cb6f8e3c69-config\") pod \"authentication-operator-69f744f599-jm2nk\" (UID: \"239280e2-b335-4f87-89a8-00cb6f8e3c69\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm2nk"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.403886 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/239280e2-b335-4f87-89a8-00cb6f8e3c69-service-ca-bundle\") pod \"authentication-operator-69f744f599-jm2nk\" (UID: \"239280e2-b335-4f87-89a8-00cb6f8e3c69\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm2nk"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.403909 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c9k2\" (UniqueName: \"kubernetes.io/projected/0f46b00f-f770-4539-92f9-60e1146308ab-kube-api-access-7c9k2\") pod \"machine-config-controller-84d6567774-f4qrf\" (UID: \"0f46b00f-f770-4539-92f9-60e1146308ab\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f4qrf"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.403959 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-config\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.404001 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-image-import-ca\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.404019 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c696fe96-0485-44d0-b4fb-161503c334e8-etcd-client\") pod \"apiserver-7bbb656c7d-q6zwc\" (UID: \"c696fe96-0485-44d0-b4fb-161503c334e8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.404046 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x4fb\" (UniqueName: \"kubernetes.io/projected/9a7776ca-1a56-4eca-9e44-ba1b7b15510f-kube-api-access-7x4fb\") pod \"machine-api-operator-5694c8668f-l66cb\" (UID: \"9a7776ca-1a56-4eca-9e44-ba1b7b15510f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l66cb"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.404065 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e4ea5aa-2074-4100-a916-6bdfb3331d43-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xhg7l\" (UID: \"0e4ea5aa-2074-4100-a916-6bdfb3331d43\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xhg7l"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.404084 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87t8g\" (UniqueName: \"kubernetes.io/projected/0e4ea5aa-2074-4100-a916-6bdfb3331d43-kube-api-access-87t8g\") pod \"cluster-image-registry-operator-dc59b4c8b-xhg7l\" (UID: \"0e4ea5aa-2074-4100-a916-6bdfb3331d43\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xhg7l"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.404109 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-config\") pod \"controller-manager-879f6c89f-sl5p9\" (UID: \"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.404128 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0f46b00f-f770-4539-92f9-60e1146308ab-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-f4qrf\" (UID: \"0f46b00f-f770-4539-92f9-60e1146308ab\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f4qrf"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.404153 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad43df2c-4944-45e2-919f-0c297f4092d4-client-ca\") pod \"route-controller-manager-6576b87f9c-n8zth\" (UID: \"ad43df2c-4944-45e2-919f-0c297f4092d4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.404170 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/239280e2-b335-4f87-89a8-00cb6f8e3c69-serving-cert\") pod \"authentication-operator-69f744f599-jm2nk\" (UID: \"239280e2-b335-4f87-89a8-00cb6f8e3c69\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm2nk"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.404205 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a7776ca-1a56-4eca-9e44-ba1b7b15510f-config\") pod \"machine-api-operator-5694c8668f-l66cb\" (UID: \"9a7776ca-1a56-4eca-9e44-ba1b7b15510f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l66cb"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.404227 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fee373cf-50b8-42f4-b30d-4a3d230ca27e-config\") pod \"etcd-operator-b45778765-d8pqg\" (UID: \"fee373cf-50b8-42f4-b30d-4a3d230ca27e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d8pqg"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.404249 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f4b9328-0efe-42a8-9a73-a80eb6a26151-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-kclsk\" (UID: \"8f4b9328-0efe-42a8-9a73-a80eb6a26151\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kclsk"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.404269 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad43df2c-4944-45e2-919f-0c297f4092d4-serving-cert\") pod \"route-controller-manager-6576b87f9c-n8zth\" (UID: \"ad43df2c-4944-45e2-919f-0c297f4092d4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.404302 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c696fe96-0485-44d0-b4fb-161503c334e8-audit-dir\") pod \"apiserver-7bbb656c7d-q6zwc\" (UID: \"c696fe96-0485-44d0-b4fb-161503c334e8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.404319 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fee373cf-50b8-42f4-b30d-4a3d230ca27e-serving-cert\") pod \"etcd-operator-b45778765-d8pqg\" (UID: \"fee373cf-50b8-42f4-b30d-4a3d230ca27e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d8pqg"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.404355 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mbjx\" (UniqueName: \"kubernetes.io/projected/7a29951a-027e-49b4-a7ea-a8e363942414-kube-api-access-2mbjx\") pod \"downloads-7954f5f757-dkppn\" (UID: \"7a29951a-027e-49b4-a7ea-a8e363942414\") " pod="openshift-console/downloads-7954f5f757-dkppn"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.404583 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-serving-cert\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.404633 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.405053 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rwvxk"]
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.405147 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-config\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.405737 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rwvxk"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.406049 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.406528 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sl5p9"]
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.407168 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad43df2c-4944-45e2-919f-0c297f4092d4-config\") pod \"route-controller-manager-6576b87f9c-n8zth\" (UID: \"ad43df2c-4944-45e2-919f-0c297f4092d4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.407290 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2zrj\" (UniqueName: \"kubernetes.io/projected/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-kube-api-access-j2zrj\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.407397 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c696fe96-0485-44d0-b4fb-161503c334e8-audit-policies\") pod \"apiserver-7bbb656c7d-q6zwc\" (UID: \"c696fe96-0485-44d0-b4fb-161503c334e8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.407500 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525220-wqnml"]
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.407592 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/362cd55c-b576-44bd-843c-078bf26b3b1e-trusted-ca-bundle\") pod \"console-f9d7485db-4hddt\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " pod="openshift-console/console-f9d7485db-4hddt"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.407632 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnqb7\" (UniqueName: \"kubernetes.io/projected/362cd55c-b576-44bd-843c-078bf26b3b1e-kube-api-access-pnqb7\") pod \"console-f9d7485db-4hddt\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " pod="openshift-console/console-f9d7485db-4hddt"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.407671 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-serving-cert\") pod \"controller-manager-879f6c89f-sl5p9\" (UID: \"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.407709 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh895\" (UniqueName: \"kubernetes.io/projected/25e05c8a-335b-405c-9033-f689c21c5ecc-kube-api-access-jh895\") pod \"cluster-samples-operator-665b6dd947-xhzlb\" (UID: \"25e05c8a-335b-405c-9033-f689c21c5ecc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhzlb" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.407734 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr7hk\" (UniqueName: \"kubernetes.io/projected/f7aa90a6-963b-4ff0-b6de-ad46fa896e18-kube-api-access-kr7hk\") pod \"openshift-apiserver-operator-796bbdcf4f-55x9c\" (UID: \"f7aa90a6-963b-4ff0-b6de-ad46fa896e18\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55x9c" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.407830 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c696fe96-0485-44d0-b4fb-161503c334e8-encryption-config\") pod \"apiserver-7bbb656c7d-q6zwc\" (UID: \"c696fe96-0485-44d0-b4fb-161503c334e8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.407856 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh2sf\" (UniqueName: \"kubernetes.io/projected/77544608-a940-4c0c-9a1a-a5a98f480134-kube-api-access-jh2sf\") pod \"machine-approver-56656f9798-8mxdc\" (UID: \"77544608-a940-4c0c-9a1a-a5a98f480134\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8mxdc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.407895 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/fee373cf-50b8-42f4-b30d-4a3d230ca27e-etcd-ca\") pod \"etcd-operator-b45778765-d8pqg\" (UID: \"fee373cf-50b8-42f4-b30d-4a3d230ca27e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d8pqg" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.407931 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-client-ca\") pod \"controller-manager-879f6c89f-sl5p9\" (UID: \"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.407962 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/362cd55c-b576-44bd-843c-078bf26b3b1e-console-serving-cert\") pod \"console-f9d7485db-4hddt\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " pod="openshift-console/console-f9d7485db-4hddt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.407985 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c696fe96-0485-44d0-b4fb-161503c334e8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-q6zwc\" (UID: \"c696fe96-0485-44d0-b4fb-161503c334e8\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.408008 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a7776ca-1a56-4eca-9e44-ba1b7b15510f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-l66cb\" (UID: \"9a7776ca-1a56-4eca-9e44-ba1b7b15510f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l66cb" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.408015 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-encryption-config\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.408034 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7aa90a6-963b-4ff0-b6de-ad46fa896e18-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-55x9c\" (UID: \"f7aa90a6-963b-4ff0-b6de-ad46fa896e18\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55x9c" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.408076 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgcbs\" (UniqueName: \"kubernetes.io/projected/8f4b9328-0efe-42a8-9a73-a80eb6a26151-kube-api-access-hgcbs\") pod \"openshift-controller-manager-operator-756b6f6bc6-kclsk\" (UID: \"8f4b9328-0efe-42a8-9a73-a80eb6a26151\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kclsk" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.408106 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0f46b00f-f770-4539-92f9-60e1146308ab-proxy-tls\") pod \"machine-config-controller-84d6567774-f4qrf\" (UID: \"0f46b00f-f770-4539-92f9-60e1146308ab\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f4qrf" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.408145 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-audit-dir\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.408179 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fee373cf-50b8-42f4-b30d-4a3d230ca27e-etcd-client\") pod \"etcd-operator-b45778765-d8pqg\" (UID: \"fee373cf-50b8-42f4-b30d-4a3d230ca27e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d8pqg" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.408214 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-audit-dir\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.408257 4810 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-node-pullsecrets\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.408292 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/362cd55c-b576-44bd-843c-078bf26b3b1e-service-ca\") pod \"console-f9d7485db-4hddt\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " pod="openshift-console/console-f9d7485db-4hddt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.408317 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k7k5\" (UniqueName: \"kubernetes.io/projected/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-kube-api-access-8k7k5\") pod \"controller-manager-879f6c89f-sl5p9\" (UID: \"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.408362 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9a7776ca-1a56-4eca-9e44-ba1b7b15510f-images\") pod \"machine-api-operator-5694c8668f-l66cb\" (UID: \"9a7776ca-1a56-4eca-9e44-ba1b7b15510f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l66cb" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.408380 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7aa90a6-963b-4ff0-b6de-ad46fa896e18-config\") pod \"openshift-apiserver-operator-796bbdcf4f-55x9c\" (UID: \"f7aa90a6-963b-4ff0-b6de-ad46fa896e18\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55x9c" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.408450 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-node-pullsecrets\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.408500 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/362cd55c-b576-44bd-843c-078bf26b3b1e-oauth-serving-cert\") pod \"console-f9d7485db-4hddt\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " pod="openshift-console/console-f9d7485db-4hddt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.408530 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e8c9b478-4884-4c32-acf1-5fdec0cfac06-metrics-tls\") pod \"dns-operator-744455d44c-wpzzq\" (UID: \"e8c9b478-4884-4c32-acf1-5fdec0cfac06\") " pod="openshift-dns-operator/dns-operator-744455d44c-wpzzq" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.408557 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/25e05c8a-335b-405c-9033-f689c21c5ecc-samples-operator-tls\") 
pod \"cluster-samples-operator-665b6dd947-xhzlb\" (UID: \"25e05c8a-335b-405c-9033-f689c21c5ecc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhzlb" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.408604 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c696fe96-0485-44d0-b4fb-161503c334e8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-q6zwc\" (UID: \"c696fe96-0485-44d0-b4fb-161503c334e8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.408637 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/77544608-a940-4c0c-9a1a-a5a98f480134-machine-approver-tls\") pod \"machine-approver-56656f9798-8mxdc\" (UID: \"77544608-a940-4c0c-9a1a-a5a98f480134\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8mxdc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.408668 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gzjq\" (UniqueName: \"kubernetes.io/projected/239280e2-b335-4f87-89a8-00cb6f8e3c69-kube-api-access-8gzjq\") pod \"authentication-operator-69f744f599-jm2nk\" (UID: \"239280e2-b335-4f87-89a8-00cb6f8e3c69\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm2nk" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.408765 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77544608-a940-4c0c-9a1a-a5a98f480134-config\") pod \"machine-approver-56656f9798-8mxdc\" (UID: \"77544608-a940-4c0c-9a1a-a5a98f480134\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8mxdc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.408800 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/239280e2-b335-4f87-89a8-00cb6f8e3c69-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jm2nk\" (UID: \"239280e2-b335-4f87-89a8-00cb6f8e3c69\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm2nk" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.408842 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-etcd-client\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.408883 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0e4ea5aa-2074-4100-a916-6bdfb3331d43-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xhg7l\" (UID: \"0e4ea5aa-2074-4100-a916-6bdfb3331d43\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xhg7l" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.413638 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-w6k69"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.413808 4810 util.go:30] "No sandbox for pod 
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.414460 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r2tqm"]
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.414468 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-image-import-ca\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.414570 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-w6k69"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.414962 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r2tqm"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.415605 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-etcd-client\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.415873 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.417032 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-serving-cert\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.419386 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-frmnw"]
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.420269 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-frmnw"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.420354 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-n6fl9"]
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.421045 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n6fl9"
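Each volume in this stream leaves a three-step trail that can be followed by its UniqueName: VerifyControllerAttachedVolume started (reconciler_common.go:245), MountVolume started (reconciler_common.go:218), and finally MountVolume.SetUp succeeded (operation_generator.go:637). A small sketch for pairing the second and third steps when reading a log like this one; the regular expressions are derived from the record format above (one record per line, with literal \" escapes around the UniqueName), not from any kubelet API:

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Patterns matching the "started" and "succeeded" mount records as they
// are printed in this log.
var (
	started   = regexp.MustCompile(`MountVolume started for volume .*?UniqueName: \\"([^\\"]+)`)
	succeeded = regexp.MustCompile(`MountVolume\.SetUp succeeded for volume .*?UniqueName: \\"([^\\"]+)`)
)

func main() {
	pending := map[string]bool{} // UniqueNames with a start but no success yet
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1<<20), 1<<20) // records can be very long
	for sc.Scan() {
		if m := started.FindStringSubmatch(sc.Text()); m != nil {
			pending[m[1]] = true
		}
		if m := succeeded.FindStringSubmatch(sc.Text()); m != nil {
			delete(pending, m[1])
		}
	}
	for v := range pending {
		fmt.Println("mount started but no SetUp success seen:", v)
	}
}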
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n6fl9" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.424568 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5d4rp"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.425501 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55x9c"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.427828 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.427915 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-h68pj"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.428608 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-h68pj" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.430685 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jm2nk"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.435516 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.435628 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-l66cb"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.447066 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhzlb"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.447106 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kclsk"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.459388 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zgcrd"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.461348 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.461552 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.461833 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-zgcrd" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.462584 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-4hddt"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.463346 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xhg7l"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.464926 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.465058 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8j2gh"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.469841 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kmhmh"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.476465 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.482471 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l924n"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.485344 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dkppn"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.489407 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lqqd"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.491954 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zgcrd"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.493338 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-d8pqg"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.496707 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-zrsn2"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.496956 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.501007 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-25cmt"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.503005 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2nn8b"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.506704 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2phqk"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.510224 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wpzzq"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.511087 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/76ffdcba-57d6-4636-8373-f088926a716d-stats-auth\") 
pod \"router-default-5444994796-hvw7f\" (UID: \"76ffdcba-57d6-4636-8373-f088926a716d\") " pod="openshift-ingress/router-default-5444994796-hvw7f" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.511275 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/77544608-a940-4c0c-9a1a-a5a98f480134-auth-proxy-config\") pod \"machine-approver-56656f9798-8mxdc\" (UID: \"77544608-a940-4c0c-9a1a-a5a98f480134\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8mxdc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.511441 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2ff8\" (UniqueName: \"kubernetes.io/projected/e8c9b478-4884-4c32-acf1-5fdec0cfac06-kube-api-access-v2ff8\") pod \"dns-operator-744455d44c-wpzzq\" (UID: \"e8c9b478-4884-4c32-acf1-5fdec0cfac06\") " pod="openshift-dns-operator/dns-operator-744455d44c-wpzzq" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.511585 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/362cd55c-b576-44bd-843c-078bf26b3b1e-console-config\") pod \"console-f9d7485db-4hddt\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " pod="openshift-console/console-f9d7485db-4hddt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.511765 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0e4ea5aa-2074-4100-a916-6bdfb3331d43-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xhg7l\" (UID: \"0e4ea5aa-2074-4100-a916-6bdfb3331d43\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xhg7l" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.511955 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76ffdcba-57d6-4636-8373-f088926a716d-service-ca-bundle\") pod \"router-default-5444994796-hvw7f\" (UID: \"76ffdcba-57d6-4636-8373-f088926a716d\") " pod="openshift-ingress/router-default-5444994796-hvw7f" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.512231 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmvtp\" (UniqueName: \"kubernetes.io/projected/6a43ce91-6673-4641-a2d6-551afe72688d-kube-api-access-xmvtp\") pod \"packageserver-d55dfcdfc-hmxtq\" (UID: \"6a43ce91-6673-4641-a2d6-551afe72688d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hmxtq" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.512941 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/362cd55c-b576-44bd-843c-078bf26b3b1e-console-oauth-config\") pod \"console-f9d7485db-4hddt\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " pod="openshift-console/console-f9d7485db-4hddt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.513099 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/239280e2-b335-4f87-89a8-00cb6f8e3c69-config\") pod \"authentication-operator-69f744f599-jm2nk\" (UID: \"239280e2-b335-4f87-89a8-00cb6f8e3c69\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm2nk" Feb 19 
15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.512812 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/362cd55c-b576-44bd-843c-078bf26b3b1e-console-config\") pod \"console-f9d7485db-4hddt\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " pod="openshift-console/console-f9d7485db-4hddt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.512298 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/77544608-a940-4c0c-9a1a-a5a98f480134-auth-proxy-config\") pod \"machine-approver-56656f9798-8mxdc\" (UID: \"77544608-a940-4c0c-9a1a-a5a98f480134\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8mxdc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.513715 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5cc06c55-7085-4cc0-8399-833b4243b51e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-65ltw\" (UID: \"5cc06c55-7085-4cc0-8399-833b4243b51e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-65ltw" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.514082 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c9k2\" (UniqueName: \"kubernetes.io/projected/0f46b00f-f770-4539-92f9-60e1146308ab-kube-api-access-7c9k2\") pod \"machine-config-controller-84d6567774-f4qrf\" (UID: \"0f46b00f-f770-4539-92f9-60e1146308ab\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f4qrf" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.514242 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/026a2812-dc11-4b60-b911-bb41a0d39d7d-config\") pod \"kube-apiserver-operator-766d6c64bb-9lqqd\" (UID: \"026a2812-dc11-4b60-b911-bb41a0d39d7d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lqqd" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.514517 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.514697 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ef2828b3-f501-4105-abc8-6b1ce9658301-srv-cert\") pod \"catalog-operator-68c6474976-2phqk\" (UID: \"ef2828b3-f501-4105-abc8-6b1ce9658301\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2phqk" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.514814 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87t8g\" (UniqueName: \"kubernetes.io/projected/0e4ea5aa-2074-4100-a916-6bdfb3331d43-kube-api-access-87t8g\") pod \"cluster-image-registry-operator-dc59b4c8b-xhg7l\" (UID: \"0e4ea5aa-2074-4100-a916-6bdfb3331d43\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xhg7l" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 
15:11:51.514966 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-config\") pod \"controller-manager-879f6c89f-sl5p9\" (UID: \"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.515081 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0f46b00f-f770-4539-92f9-60e1146308ab-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-f4qrf\" (UID: \"0f46b00f-f770-4539-92f9-60e1146308ab\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f4qrf" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.515407 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-audit-policies\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.515523 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad43df2c-4944-45e2-919f-0c297f4092d4-client-ca\") pod \"route-controller-manager-6576b87f9c-n8zth\" (UID: \"ad43df2c-4944-45e2-919f-0c297f4092d4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.515635 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/239280e2-b335-4f87-89a8-00cb6f8e3c69-serving-cert\") pod \"authentication-operator-69f744f599-jm2nk\" (UID: \"239280e2-b335-4f87-89a8-00cb6f8e3c69\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm2nk" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.515739 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a7776ca-1a56-4eca-9e44-ba1b7b15510f-config\") pod \"machine-api-operator-5694c8668f-l66cb\" (UID: \"9a7776ca-1a56-4eca-9e44-ba1b7b15510f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l66cb" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.516066 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/239280e2-b335-4f87-89a8-00cb6f8e3c69-config\") pod \"authentication-operator-69f744f599-jm2nk\" (UID: \"239280e2-b335-4f87-89a8-00cb6f8e3c69\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm2nk" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.516382 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-config\") pod \"controller-manager-879f6c89f-sl5p9\" (UID: \"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.516549 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fee373cf-50b8-42f4-b30d-4a3d230ca27e-config\") 
pod \"etcd-operator-b45778765-d8pqg\" (UID: \"fee373cf-50b8-42f4-b30d-4a3d230ca27e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d8pqg" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.516680 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f4b9328-0efe-42a8-9a73-a80eb6a26151-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-kclsk\" (UID: \"8f4b9328-0efe-42a8-9a73-a80eb6a26151\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kclsk" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.516801 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/63674760-f499-49b8-a575-a8ae954eada4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kmhmh\" (UID: \"63674760-f499-49b8-a575-a8ae954eada4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kmhmh" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.517000 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6a43ce91-6673-4641-a2d6-551afe72688d-tmpfs\") pod \"packageserver-d55dfcdfc-hmxtq\" (UID: \"6a43ce91-6673-4641-a2d6-551afe72688d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hmxtq" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.517093 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a43ce91-6673-4641-a2d6-551afe72688d-apiservice-cert\") pod \"packageserver-d55dfcdfc-hmxtq\" (UID: \"6a43ce91-6673-4641-a2d6-551afe72688d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hmxtq" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.517177 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad43df2c-4944-45e2-919f-0c297f4092d4-config\") pod \"route-controller-manager-6576b87f9c-n8zth\" (UID: \"ad43df2c-4944-45e2-919f-0c297f4092d4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.517271 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1fae478-5e9a-4da1-b2e3-35c1ee7f8fa3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2nn8b\" (UID: \"b1fae478-5e9a-4da1-b2e3-35c1ee7f8fa3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2nn8b" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.517390 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr7hk\" (UniqueName: \"kubernetes.io/projected/f7aa90a6-963b-4ff0-b6de-ad46fa896e18-kube-api-access-kr7hk\") pod \"openshift-apiserver-operator-796bbdcf4f-55x9c\" (UID: \"f7aa90a6-963b-4ff0-b6de-ad46fa896e18\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55x9c" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.517479 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/fee373cf-50b8-42f4-b30d-4a3d230ca27e-etcd-ca\") pod \"etcd-operator-b45778765-d8pqg\" (UID: 
\"fee373cf-50b8-42f4-b30d-4a3d230ca27e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d8pqg" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.517563 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f4b9328-0efe-42a8-9a73-a80eb6a26151-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-kclsk\" (UID: \"8f4b9328-0efe-42a8-9a73-a80eb6a26151\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kclsk" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.517578 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-client-ca\") pod \"controller-manager-879f6c89f-sl5p9\" (UID: \"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.517615 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad43df2c-4944-45e2-919f-0c297f4092d4-client-ca\") pod \"route-controller-manager-6576b87f9c-n8zth\" (UID: \"ad43df2c-4944-45e2-919f-0c297f4092d4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.517680 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hmxtq"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.517685 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a7776ca-1a56-4eca-9e44-ba1b7b15510f-config\") pod \"machine-api-operator-5694c8668f-l66cb\" (UID: \"9a7776ca-1a56-4eca-9e44-ba1b7b15510f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l66cb" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.517759 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fee373cf-50b8-42f4-b30d-4a3d230ca27e-config\") pod \"etcd-operator-b45778765-d8pqg\" (UID: \"fee373cf-50b8-42f4-b30d-4a3d230ca27e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d8pqg" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.517763 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0f46b00f-f770-4539-92f9-60e1146308ab-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-f4qrf\" (UID: \"0f46b00f-f770-4539-92f9-60e1146308ab\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f4qrf" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.517756 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b1e6fd37-d069-441e-9c0f-caf60d8f8d4f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8j2gh\" (UID: \"b1e6fd37-d069-441e-9c0f-caf60d8f8d4f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8j2gh" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.518274 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/fee373cf-50b8-42f4-b30d-4a3d230ca27e-etcd-ca\") pod \"etcd-operator-b45778765-d8pqg\" (UID: 
\"fee373cf-50b8-42f4-b30d-4a3d230ca27e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d8pqg" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.518293 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a7776ca-1a56-4eca-9e44-ba1b7b15510f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-l66cb\" (UID: \"9a7776ca-1a56-4eca-9e44-ba1b7b15510f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l66cb" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.518339 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7aa90a6-963b-4ff0-b6de-ad46fa896e18-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-55x9c\" (UID: \"f7aa90a6-963b-4ff0-b6de-ad46fa896e18\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55x9c" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.518364 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76ffdcba-57d6-4636-8373-f088926a716d-metrics-certs\") pod \"router-default-5444994796-hvw7f\" (UID: \"76ffdcba-57d6-4636-8373-f088926a716d\") " pod="openshift-ingress/router-default-5444994796-hvw7f" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.518387 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.518389 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0f46b00f-f770-4539-92f9-60e1146308ab-proxy-tls\") pod \"machine-config-controller-84d6567774-f4qrf\" (UID: \"0f46b00f-f770-4539-92f9-60e1146308ab\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f4qrf" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.519757 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fee373cf-50b8-42f4-b30d-4a3d230ca27e-etcd-client\") pod \"etcd-operator-b45778765-d8pqg\" (UID: \"fee373cf-50b8-42f4-b30d-4a3d230ca27e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d8pqg" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.519789 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqq2f\" (UniqueName: \"kubernetes.io/projected/13f4a10d-50aa-41ec-9931-cb835ba1f54c-kube-api-access-zqq2f\") pod \"multus-admission-controller-857f4d67dd-zrsn2\" (UID: \"13f4a10d-50aa-41ec-9931-cb835ba1f54c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zrsn2" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.519812 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/362cd55c-b576-44bd-843c-078bf26b3b1e-service-ca\") pod \"console-f9d7485db-4hddt\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " pod="openshift-console/console-f9d7485db-4hddt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.519834 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.519867 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/362cd55c-b576-44bd-843c-078bf26b3b1e-oauth-serving-cert\") pod \"console-f9d7485db-4hddt\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " pod="openshift-console/console-f9d7485db-4hddt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.519882 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-client-ca\") pod \"controller-manager-879f6c89f-sl5p9\" (UID: \"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.519886 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e8c9b478-4884-4c32-acf1-5fdec0cfac06-metrics-tls\") pod \"dns-operator-744455d44c-wpzzq\" (UID: \"e8c9b478-4884-4c32-acf1-5fdec0cfac06\") " pod="openshift-dns-operator/dns-operator-744455d44c-wpzzq" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.519929 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c696fe96-0485-44d0-b4fb-161503c334e8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-q6zwc\" (UID: \"c696fe96-0485-44d0-b4fb-161503c334e8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.519947 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/77544608-a940-4c0c-9a1a-a5a98f480134-machine-approver-tls\") pod \"machine-approver-56656f9798-8mxdc\" (UID: \"77544608-a940-4c0c-9a1a-a5a98f480134\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8mxdc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.519967 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gzjq\" (UniqueName: \"kubernetes.io/projected/239280e2-b335-4f87-89a8-00cb6f8e3c69-kube-api-access-8gzjq\") pod \"authentication-operator-69f744f599-jm2nk\" (UID: \"239280e2-b335-4f87-89a8-00cb6f8e3c69\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm2nk" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.519993 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2f48\" (UniqueName: \"kubernetes.io/projected/63674760-f499-49b8-a575-a8ae954eada4-kube-api-access-d2f48\") pod \"ingress-operator-5b745b69d9-kmhmh\" (UID: \"63674760-f499-49b8-a575-a8ae954eada4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kmhmh" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.520017 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: 
\"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.520050 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77544608-a940-4c0c-9a1a-a5a98f480134-config\") pod \"machine-approver-56656f9798-8mxdc\" (UID: \"77544608-a940-4c0c-9a1a-a5a98f480134\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8mxdc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.520595 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad43df2c-4944-45e2-919f-0c297f4092d4-config\") pod \"route-controller-manager-6576b87f9c-n8zth\" (UID: \"ad43df2c-4944-45e2-919f-0c297f4092d4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.521827 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/362cd55c-b576-44bd-843c-078bf26b3b1e-console-oauth-config\") pod \"console-f9d7485db-4hddt\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " pod="openshift-console/console-f9d7485db-4hddt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.524163 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-f4qrf"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.525660 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rwvxk"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.526560 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c696fe96-0485-44d0-b4fb-161503c334e8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-q6zwc\" (UID: \"c696fe96-0485-44d0-b4fb-161503c334e8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.526574 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/239280e2-b335-4f87-89a8-00cb6f8e3c69-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jm2nk\" (UID: \"239280e2-b335-4f87-89a8-00cb6f8e3c69\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm2nk" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.526673 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.526734 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0e4ea5aa-2074-4100-a916-6bdfb3331d43-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xhg7l\" (UID: \"0e4ea5aa-2074-4100-a916-6bdfb3331d43\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xhg7l" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.526765 
4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c696fe96-0485-44d0-b4fb-161503c334e8-serving-cert\") pod \"apiserver-7bbb656c7d-q6zwc\" (UID: \"c696fe96-0485-44d0-b4fb-161503c334e8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.526802 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/362cd55c-b576-44bd-843c-078bf26b3b1e-service-ca\") pod \"console-f9d7485db-4hddt\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " pod="openshift-console/console-f9d7485db-4hddt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.526607 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/239280e2-b335-4f87-89a8-00cb6f8e3c69-serving-cert\") pod \"authentication-operator-69f744f599-jm2nk\" (UID: \"239280e2-b335-4f87-89a8-00cb6f8e3c69\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm2nk" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.526811 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4sn5\" (UniqueName: \"kubernetes.io/projected/fee373cf-50b8-42f4-b30d-4a3d230ca27e-kube-api-access-v4sn5\") pod \"etcd-operator-b45778765-d8pqg\" (UID: \"fee373cf-50b8-42f4-b30d-4a3d230ca27e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d8pqg" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.527191 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ef2828b3-f501-4105-abc8-6b1ce9658301-profile-collector-cert\") pod \"catalog-operator-68c6474976-2phqk\" (UID: \"ef2828b3-f501-4105-abc8-6b1ce9658301\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2phqk" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.527219 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbnnp\" (UniqueName: \"kubernetes.io/projected/bd317655-38ea-4fdb-95d0-82adc08456a8-kube-api-access-dbnnp\") pod \"openshift-config-operator-7777fb866f-25cmt\" (UID: \"bd317655-38ea-4fdb-95d0-82adc08456a8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-25cmt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.527278 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l478x\" (UniqueName: \"kubernetes.io/projected/ad43df2c-4944-45e2-919f-0c297f4092d4-kube-api-access-l478x\") pod \"route-controller-manager-6576b87f9c-n8zth\" (UID: \"ad43df2c-4944-45e2-919f-0c297f4092d4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.527299 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/026a2812-dc11-4b60-b911-bb41a0d39d7d-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9lqqd\" (UID: \"026a2812-dc11-4b60-b911-bb41a0d39d7d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lqqd" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.527357 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/b7a0b611-cb8e-431b-b527-b6164471c85f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-l924n\" (UID: \"b7a0b611-cb8e-431b-b527-b6164471c85f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l924n" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.527380 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sl5p9\" (UID: \"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.527403 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/fee373cf-50b8-42f4-b30d-4a3d230ca27e-etcd-service-ca\") pod \"etcd-operator-b45778765-d8pqg\" (UID: \"fee373cf-50b8-42f4-b30d-4a3d230ca27e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d8pqg" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.527422 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f4b9328-0efe-42a8-9a73-a80eb6a26151-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-kclsk\" (UID: \"8f4b9328-0efe-42a8-9a73-a80eb6a26151\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kclsk" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.527444 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1e6fd37-d069-441e-9c0f-caf60d8f8d4f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8j2gh\" (UID: \"b1e6fd37-d069-441e-9c0f-caf60d8f8d4f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8j2gh" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.527466 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.527487 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.527509 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hxq2\" (UniqueName: \"kubernetes.io/projected/c696fe96-0485-44d0-b4fb-161503c334e8-kube-api-access-9hxq2\") pod \"apiserver-7bbb656c7d-q6zwc\" (UID: \"c696fe96-0485-44d0-b4fb-161503c334e8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.527531 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/63674760-f499-49b8-a575-a8ae954eada4-metrics-tls\") pod \"ingress-operator-5b745b69d9-kmhmh\" (UID: \"63674760-f499-49b8-a575-a8ae954eada4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kmhmh" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.527552 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/13f4a10d-50aa-41ec-9931-cb835ba1f54c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-zrsn2\" (UID: \"13f4a10d-50aa-41ec-9931-cb835ba1f54c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zrsn2" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.527569 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.527590 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/239280e2-b335-4f87-89a8-00cb6f8e3c69-service-ca-bundle\") pod \"authentication-operator-69f744f599-jm2nk\" (UID: \"239280e2-b335-4f87-89a8-00cb6f8e3c69\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm2nk" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.527610 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5cc06c55-7085-4cc0-8399-833b4243b51e-images\") pod \"machine-config-operator-74547568cd-65ltw\" (UID: \"5cc06c55-7085-4cc0-8399-833b4243b51e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-65ltw" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.527632 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ff4j\" (UniqueName: \"kubernetes.io/projected/b1fae478-5e9a-4da1-b2e3-35c1ee7f8fa3-kube-api-access-8ff4j\") pod \"package-server-manager-789f6589d5-2nn8b\" (UID: \"b1fae478-5e9a-4da1-b2e3-35c1ee7f8fa3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2nn8b" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.527656 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd317655-38ea-4fdb-95d0-82adc08456a8-serving-cert\") pod \"openshift-config-operator-7777fb866f-25cmt\" (UID: \"bd317655-38ea-4fdb-95d0-82adc08456a8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-25cmt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.527676 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.527784 
4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77544608-a940-4c0c-9a1a-a5a98f480134-config\") pod \"machine-approver-56656f9798-8mxdc\" (UID: \"77544608-a940-4c0c-9a1a-a5a98f480134\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8mxdc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.527944 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e8c9b478-4884-4c32-acf1-5fdec0cfac06-metrics-tls\") pod \"dns-operator-744455d44c-wpzzq\" (UID: \"e8c9b478-4884-4c32-acf1-5fdec0cfac06\") " pod="openshift-dns-operator/dns-operator-744455d44c-wpzzq" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.528130 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7aa90a6-963b-4ff0-b6de-ad46fa896e18-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-55x9c\" (UID: \"f7aa90a6-963b-4ff0-b6de-ad46fa896e18\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55x9c" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.528188 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mkt5x"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.528269 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/239280e2-b335-4f87-89a8-00cb6f8e3c69-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jm2nk\" (UID: \"239280e2-b335-4f87-89a8-00cb6f8e3c69\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm2nk" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.528358 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525220-wqnml"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.528891 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/239280e2-b335-4f87-89a8-00cb6f8e3c69-service-ca-bundle\") pod \"authentication-operator-69f744f599-jm2nk\" (UID: \"239280e2-b335-4f87-89a8-00cb6f8e3c69\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm2nk" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.528933 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sl5p9\" (UID: \"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.527697 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c696fe96-0485-44d0-b4fb-161503c334e8-etcd-client\") pod \"apiserver-7bbb656c7d-q6zwc\" (UID: \"c696fe96-0485-44d0-b4fb-161503c334e8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.529570 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x4fb\" (UniqueName: \"kubernetes.io/projected/9a7776ca-1a56-4eca-9e44-ba1b7b15510f-kube-api-access-7x4fb\") pod \"machine-api-operator-5694c8668f-l66cb\" (UID: 
\"9a7776ca-1a56-4eca-9e44-ba1b7b15510f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l66cb" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.529254 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/362cd55c-b576-44bd-843c-078bf26b3b1e-oauth-serving-cert\") pod \"console-f9d7485db-4hddt\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " pod="openshift-console/console-f9d7485db-4hddt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.529528 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/fee373cf-50b8-42f4-b30d-4a3d230ca27e-etcd-service-ca\") pod \"etcd-operator-b45778765-d8pqg\" (UID: \"fee373cf-50b8-42f4-b30d-4a3d230ca27e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d8pqg" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.529262 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fee373cf-50b8-42f4-b30d-4a3d230ca27e-etcd-client\") pod \"etcd-operator-b45778765-d8pqg\" (UID: \"fee373cf-50b8-42f4-b30d-4a3d230ca27e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d8pqg" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.529822 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e4ea5aa-2074-4100-a916-6bdfb3331d43-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xhg7l\" (UID: \"0e4ea5aa-2074-4100-a916-6bdfb3331d43\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xhg7l" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.530277 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a7776ca-1a56-4eca-9e44-ba1b7b15510f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-l66cb\" (UID: \"9a7776ca-1a56-4eca-9e44-ba1b7b15510f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l66cb" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.530621 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0e4ea5aa-2074-4100-a916-6bdfb3331d43-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xhg7l\" (UID: \"0e4ea5aa-2074-4100-a916-6bdfb3331d43\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xhg7l" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.530999 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r74mv"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.531664 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e4ea5aa-2074-4100-a916-6bdfb3331d43-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xhg7l\" (UID: \"0e4ea5aa-2074-4100-a916-6bdfb3331d43\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xhg7l" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.531769 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f4b9328-0efe-42a8-9a73-a80eb6a26151-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-kclsk\" (UID: 
\"8f4b9328-0efe-42a8-9a73-a80eb6a26151\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kclsk" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.531797 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/026a2812-dc11-4b60-b911-bb41a0d39d7d-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9lqqd\" (UID: \"026a2812-dc11-4b60-b911-bb41a0d39d7d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lqqd" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.531823 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad43df2c-4944-45e2-919f-0c297f4092d4-serving-cert\") pod \"route-controller-manager-6576b87f9c-n8zth\" (UID: \"ad43df2c-4944-45e2-919f-0c297f4092d4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.531841 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c696fe96-0485-44d0-b4fb-161503c334e8-audit-dir\") pod \"apiserver-7bbb656c7d-q6zwc\" (UID: \"c696fe96-0485-44d0-b4fb-161503c334e8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.531862 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fee373cf-50b8-42f4-b30d-4a3d230ca27e-serving-cert\") pod \"etcd-operator-b45778765-d8pqg\" (UID: \"fee373cf-50b8-42f4-b30d-4a3d230ca27e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d8pqg" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.531894 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mbjx\" (UniqueName: \"kubernetes.io/projected/7a29951a-027e-49b4-a7ea-a8e363942414-kube-api-access-2mbjx\") pod \"downloads-7954f5f757-dkppn\" (UID: \"7a29951a-027e-49b4-a7ea-a8e363942414\") " pod="openshift-console/downloads-7954f5f757-dkppn" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.531928 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1e6fd37-d069-441e-9c0f-caf60d8f8d4f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8j2gh\" (UID: \"b1e6fd37-d069-441e-9c0f-caf60d8f8d4f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8j2gh" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.531949 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a43ce91-6673-4641-a2d6-551afe72688d-webhook-cert\") pod \"packageserver-d55dfcdfc-hmxtq\" (UID: \"6a43ce91-6673-4641-a2d6-551afe72688d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hmxtq" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.531983 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63674760-f499-49b8-a575-a8ae954eada4-trusted-ca\") pod \"ingress-operator-5b745b69d9-kmhmh\" (UID: \"63674760-f499-49b8-a575-a8ae954eada4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kmhmh" Feb 19 
15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532003 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532032 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c696fe96-0485-44d0-b4fb-161503c334e8-audit-policies\") pod \"apiserver-7bbb656c7d-q6zwc\" (UID: \"c696fe96-0485-44d0-b4fb-161503c334e8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532052 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/362cd55c-b576-44bd-843c-078bf26b3b1e-trusted-ca-bundle\") pod \"console-f9d7485db-4hddt\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " pod="openshift-console/console-f9d7485db-4hddt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532073 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnqb7\" (UniqueName: \"kubernetes.io/projected/362cd55c-b576-44bd-843c-078bf26b3b1e-kube-api-access-pnqb7\") pod \"console-f9d7485db-4hddt\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " pod="openshift-console/console-f9d7485db-4hddt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532091 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-serving-cert\") pod \"controller-manager-879f6c89f-sl5p9\" (UID: \"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532115 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh895\" (UniqueName: \"kubernetes.io/projected/25e05c8a-335b-405c-9033-f689c21c5ecc-kube-api-access-jh895\") pod \"cluster-samples-operator-665b6dd947-xhzlb\" (UID: \"25e05c8a-335b-405c-9033-f689c21c5ecc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhzlb" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532135 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7a0b611-cb8e-431b-b527-b6164471c85f-config\") pod \"kube-controller-manager-operator-78b949d7b-l924n\" (UID: \"b7a0b611-cb8e-431b-b527-b6164471c85f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l924n" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532176 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bd317655-38ea-4fdb-95d0-82adc08456a8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-25cmt\" (UID: \"bd317655-38ea-4fdb-95d0-82adc08456a8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-25cmt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532194 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cc06c55-7085-4cc0-8399-833b4243b51e-proxy-tls\") pod \"machine-config-operator-74547568cd-65ltw\" (UID: \"5cc06c55-7085-4cc0-8399-833b4243b51e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-65ltw" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532216 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c696fe96-0485-44d0-b4fb-161503c334e8-encryption-config\") pod \"apiserver-7bbb656c7d-q6zwc\" (UID: \"c696fe96-0485-44d0-b4fb-161503c334e8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532237 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh2sf\" (UniqueName: \"kubernetes.io/projected/77544608-a940-4c0c-9a1a-a5a98f480134-kube-api-access-jh2sf\") pod \"machine-approver-56656f9798-8mxdc\" (UID: \"77544608-a940-4c0c-9a1a-a5a98f480134\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8mxdc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532256 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7a0b611-cb8e-431b-b527-b6164471c85f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-l924n\" (UID: \"b7a0b611-cb8e-431b-b527-b6164471c85f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l924n" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532279 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lhl5\" (UniqueName: \"kubernetes.io/projected/76ffdcba-57d6-4636-8373-f088926a716d-kube-api-access-7lhl5\") pod \"router-default-5444994796-hvw7f\" (UID: \"76ffdcba-57d6-4636-8373-f088926a716d\") " pod="openshift-ingress/router-default-5444994796-hvw7f" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532297 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532315 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532350 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/362cd55c-b576-44bd-843c-078bf26b3b1e-console-serving-cert\") pod \"console-f9d7485db-4hddt\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " pod="openshift-console/console-f9d7485db-4hddt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532367 4810 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c696fe96-0485-44d0-b4fb-161503c334e8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-q6zwc\" (UID: \"c696fe96-0485-44d0-b4fb-161503c334e8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532386 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgcbs\" (UniqueName: \"kubernetes.io/projected/8f4b9328-0efe-42a8-9a73-a80eb6a26151-kube-api-access-hgcbs\") pod \"openshift-controller-manager-operator-756b6f6bc6-kclsk\" (UID: \"8f4b9328-0efe-42a8-9a73-a80eb6a26151\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kclsk" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532408 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k7k5\" (UniqueName: \"kubernetes.io/projected/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-kube-api-access-8k7k5\") pod \"controller-manager-879f6c89f-sl5p9\" (UID: \"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532425 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9a7776ca-1a56-4eca-9e44-ba1b7b15510f-images\") pod \"machine-api-operator-5694c8668f-l66cb\" (UID: \"9a7776ca-1a56-4eca-9e44-ba1b7b15510f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l66cb" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532441 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7aa90a6-963b-4ff0-b6de-ad46fa896e18-config\") pod \"openshift-apiserver-operator-796bbdcf4f-55x9c\" (UID: \"f7aa90a6-963b-4ff0-b6de-ad46fa896e18\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55x9c" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532459 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/76ffdcba-57d6-4636-8373-f088926a716d-default-certificate\") pod \"router-default-5444994796-hvw7f\" (UID: \"76ffdcba-57d6-4636-8373-f088926a716d\") " pod="openshift-ingress/router-default-5444994796-hvw7f" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532477 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/25e05c8a-335b-405c-9033-f689c21c5ecc-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xhzlb\" (UID: \"25e05c8a-335b-405c-9033-f689c21c5ecc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhzlb" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532494 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-audit-dir\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532517 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxw8d\" (UniqueName: 
\"kubernetes.io/projected/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-kube-api-access-zxw8d\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532538 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9hrx\" (UniqueName: \"kubernetes.io/projected/ef2828b3-f501-4105-abc8-6b1ce9658301-kube-api-access-x9hrx\") pod \"catalog-operator-68c6474976-2phqk\" (UID: \"ef2828b3-f501-4105-abc8-6b1ce9658301\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2phqk" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532556 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9xt4\" (UniqueName: \"kubernetes.io/projected/5cc06c55-7085-4cc0-8399-833b4243b51e-kube-api-access-x9xt4\") pod \"machine-config-operator-74547568cd-65ltw\" (UID: \"5cc06c55-7085-4cc0-8399-833b4243b51e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-65ltw" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532885 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c696fe96-0485-44d0-b4fb-161503c334e8-audit-policies\") pod \"apiserver-7bbb656c7d-q6zwc\" (UID: \"c696fe96-0485-44d0-b4fb-161503c334e8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.532949 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c696fe96-0485-44d0-b4fb-161503c334e8-audit-dir\") pod \"apiserver-7bbb656c7d-q6zwc\" (UID: \"c696fe96-0485-44d0-b4fb-161503c334e8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.533083 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4nds4"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.533580 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7aa90a6-963b-4ff0-b6de-ad46fa896e18-config\") pod \"openshift-apiserver-operator-796bbdcf4f-55x9c\" (UID: \"f7aa90a6-963b-4ff0-b6de-ad46fa896e18\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55x9c" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.534456 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c696fe96-0485-44d0-b4fb-161503c334e8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-q6zwc\" (UID: \"c696fe96-0485-44d0-b4fb-161503c334e8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.534541 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c696fe96-0485-44d0-b4fb-161503c334e8-etcd-client\") pod \"apiserver-7bbb656c7d-q6zwc\" (UID: \"c696fe96-0485-44d0-b4fb-161503c334e8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.534650 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/362cd55c-b576-44bd-843c-078bf26b3b1e-trusted-ca-bundle\") pod \"console-f9d7485db-4hddt\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " pod="openshift-console/console-f9d7485db-4hddt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.535450 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.535725 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c696fe96-0485-44d0-b4fb-161503c334e8-serving-cert\") pod \"apiserver-7bbb656c7d-q6zwc\" (UID: \"c696fe96-0485-44d0-b4fb-161503c334e8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.536598 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9a7776ca-1a56-4eca-9e44-ba1b7b15510f-images\") pod \"machine-api-operator-5694c8668f-l66cb\" (UID: \"9a7776ca-1a56-4eca-9e44-ba1b7b15510f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l66cb" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.537650 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fee373cf-50b8-42f4-b30d-4a3d230ca27e-serving-cert\") pod \"etcd-operator-b45778765-d8pqg\" (UID: \"fee373cf-50b8-42f4-b30d-4a3d230ca27e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d8pqg" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.537792 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/362cd55c-b576-44bd-843c-078bf26b3b1e-console-serving-cert\") pod \"console-f9d7485db-4hddt\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " pod="openshift-console/console-f9d7485db-4hddt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.537879 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-hlw9s"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.538161 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/77544608-a940-4c0c-9a1a-a5a98f480134-machine-approver-tls\") pod \"machine-approver-56656f9798-8mxdc\" (UID: \"77544608-a940-4c0c-9a1a-a5a98f480134\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8mxdc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.538401 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4nds4" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.538682 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hlw9s" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.539114 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/25e05c8a-335b-405c-9033-f689c21c5ecc-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xhzlb\" (UID: \"25e05c8a-335b-405c-9033-f689c21c5ecc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhzlb" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.539461 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-serving-cert\") pod \"controller-manager-879f6c89f-sl5p9\" (UID: \"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.539861 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-895xv"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.540237 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c696fe96-0485-44d0-b4fb-161503c334e8-encryption-config\") pod \"apiserver-7bbb656c7d-q6zwc\" (UID: \"c696fe96-0485-44d0-b4fb-161503c334e8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.541817 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-w6k69"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.543586 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-65ltw"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.544134 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad43df2c-4944-45e2-919f-0c297f4092d4-serving-cert\") pod \"route-controller-manager-6576b87f9c-n8zth\" (UID: \"ad43df2c-4944-45e2-919f-0c297f4092d4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.544844 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-frmnw"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.547228 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-n6fl9"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.548714 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7bnq2"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.549916 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4nds4"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.551422 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-h68pj"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.553033 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r2tqm"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.554519 4810 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-ingress-canary/ingress-canary-cdfxh"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.555270 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cdfxh" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.556245 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cdfxh"] Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.576146 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.596753 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.616570 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.633493 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2f48\" (UniqueName: \"kubernetes.io/projected/63674760-f499-49b8-a575-a8ae954eada4-kube-api-access-d2f48\") pod \"ingress-operator-5b745b69d9-kmhmh\" (UID: \"63674760-f499-49b8-a575-a8ae954eada4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kmhmh" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.633540 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.633564 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.633609 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ef2828b3-f501-4105-abc8-6b1ce9658301-profile-collector-cert\") pod \"catalog-operator-68c6474976-2phqk\" (UID: \"ef2828b3-f501-4105-abc8-6b1ce9658301\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2phqk" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.633632 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbnnp\" (UniqueName: \"kubernetes.io/projected/bd317655-38ea-4fdb-95d0-82adc08456a8-kube-api-access-dbnnp\") pod \"openshift-config-operator-7777fb866f-25cmt\" (UID: \"bd317655-38ea-4fdb-95d0-82adc08456a8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-25cmt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.633660 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/026a2812-dc11-4b60-b911-bb41a0d39d7d-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9lqqd\" (UID: \"026a2812-dc11-4b60-b911-bb41a0d39d7d\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lqqd" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.633681 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7a0b611-cb8e-431b-b527-b6164471c85f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-l924n\" (UID: \"b7a0b611-cb8e-431b-b527-b6164471c85f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l924n" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.633713 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1e6fd37-d069-441e-9c0f-caf60d8f8d4f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8j2gh\" (UID: \"b1e6fd37-d069-441e-9c0f-caf60d8f8d4f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8j2gh" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.633732 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.633751 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.633770 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/63674760-f499-49b8-a575-a8ae954eada4-metrics-tls\") pod \"ingress-operator-5b745b69d9-kmhmh\" (UID: \"63674760-f499-49b8-a575-a8ae954eada4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kmhmh" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.633801 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/13f4a10d-50aa-41ec-9931-cb835ba1f54c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-zrsn2\" (UID: \"13f4a10d-50aa-41ec-9931-cb835ba1f54c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zrsn2" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.633820 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.633841 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ff4j\" (UniqueName: \"kubernetes.io/projected/b1fae478-5e9a-4da1-b2e3-35c1ee7f8fa3-kube-api-access-8ff4j\") pod \"package-server-manager-789f6589d5-2nn8b\" (UID: \"b1fae478-5e9a-4da1-b2e3-35c1ee7f8fa3\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2nn8b" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.633868 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5cc06c55-7085-4cc0-8399-833b4243b51e-images\") pod \"machine-config-operator-74547568cd-65ltw\" (UID: \"5cc06c55-7085-4cc0-8399-833b4243b51e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-65ltw" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.633888 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.633908 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd317655-38ea-4fdb-95d0-82adc08456a8-serving-cert\") pod \"openshift-config-operator-7777fb866f-25cmt\" (UID: \"bd317655-38ea-4fdb-95d0-82adc08456a8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-25cmt" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.633936 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/026a2812-dc11-4b60-b911-bb41a0d39d7d-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9lqqd\" (UID: \"026a2812-dc11-4b60-b911-bb41a0d39d7d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lqqd" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.633961 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1e6fd37-d069-441e-9c0f-caf60d8f8d4f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8j2gh\" (UID: \"b1e6fd37-d069-441e-9c0f-caf60d8f8d4f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8j2gh" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.633994 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a43ce91-6673-4641-a2d6-551afe72688d-webhook-cert\") pod \"packageserver-d55dfcdfc-hmxtq\" (UID: \"6a43ce91-6673-4641-a2d6-551afe72688d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hmxtq" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634022 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63674760-f499-49b8-a575-a8ae954eada4-trusted-ca\") pod \"ingress-operator-5b745b69d9-kmhmh\" (UID: \"63674760-f499-49b8-a575-a8ae954eada4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kmhmh" Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634040 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" Feb 19 15:11:51 crc 
kubenswrapper[4810]: I0219 15:11:51.634088 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cc06c55-7085-4cc0-8399-833b4243b51e-proxy-tls\") pod \"machine-config-operator-74547568cd-65ltw\" (UID: \"5cc06c55-7085-4cc0-8399-833b4243b51e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-65ltw"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634112 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7a0b611-cb8e-431b-b527-b6164471c85f-config\") pod \"kube-controller-manager-operator-78b949d7b-l924n\" (UID: \"b7a0b611-cb8e-431b-b527-b6164471c85f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l924n"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634136 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bd317655-38ea-4fdb-95d0-82adc08456a8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-25cmt\" (UID: \"bd317655-38ea-4fdb-95d0-82adc08456a8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-25cmt"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634170 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7a0b611-cb8e-431b-b527-b6164471c85f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-l924n\" (UID: \"b7a0b611-cb8e-431b-b527-b6164471c85f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l924n"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634194 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lhl5\" (UniqueName: \"kubernetes.io/projected/76ffdcba-57d6-4636-8373-f088926a716d-kube-api-access-7lhl5\") pod \"router-default-5444994796-hvw7f\" (UID: \"76ffdcba-57d6-4636-8373-f088926a716d\") " pod="openshift-ingress/router-default-5444994796-hvw7f"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634224 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634250 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634293 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/76ffdcba-57d6-4636-8373-f088926a716d-default-certificate\") pod \"router-default-5444994796-hvw7f\" (UID: \"76ffdcba-57d6-4636-8373-f088926a716d\") " pod="openshift-ingress/router-default-5444994796-hvw7f"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634313 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-audit-dir\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634358 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxw8d\" (UniqueName: \"kubernetes.io/projected/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-kube-api-access-zxw8d\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634388 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9hrx\" (UniqueName: \"kubernetes.io/projected/ef2828b3-f501-4105-abc8-6b1ce9658301-kube-api-access-x9hrx\") pod \"catalog-operator-68c6474976-2phqk\" (UID: \"ef2828b3-f501-4105-abc8-6b1ce9658301\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2phqk"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634406 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9xt4\" (UniqueName: \"kubernetes.io/projected/5cc06c55-7085-4cc0-8399-833b4243b51e-kube-api-access-x9xt4\") pod \"machine-config-operator-74547568cd-65ltw\" (UID: \"5cc06c55-7085-4cc0-8399-833b4243b51e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-65ltw"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634445 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/76ffdcba-57d6-4636-8373-f088926a716d-stats-auth\") pod \"router-default-5444994796-hvw7f\" (UID: \"76ffdcba-57d6-4636-8373-f088926a716d\") " pod="openshift-ingress/router-default-5444994796-hvw7f"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634475 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmvtp\" (UniqueName: \"kubernetes.io/projected/6a43ce91-6673-4641-a2d6-551afe72688d-kube-api-access-xmvtp\") pod \"packageserver-d55dfcdfc-hmxtq\" (UID: \"6a43ce91-6673-4641-a2d6-551afe72688d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hmxtq"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634496 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76ffdcba-57d6-4636-8373-f088926a716d-service-ca-bundle\") pod \"router-default-5444994796-hvw7f\" (UID: \"76ffdcba-57d6-4636-8373-f088926a716d\") " pod="openshift-ingress/router-default-5444994796-hvw7f"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634524 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5cc06c55-7085-4cc0-8399-833b4243b51e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-65ltw\" (UID: \"5cc06c55-7085-4cc0-8399-833b4243b51e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-65ltw"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634554 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/026a2812-dc11-4b60-b911-bb41a0d39d7d-config\") pod \"kube-apiserver-operator-766d6c64bb-9lqqd\" (UID: \"026a2812-dc11-4b60-b911-bb41a0d39d7d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lqqd"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634573 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634592 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ef2828b3-f501-4105-abc8-6b1ce9658301-srv-cert\") pod \"catalog-operator-68c6474976-2phqk\" (UID: \"ef2828b3-f501-4105-abc8-6b1ce9658301\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2phqk"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634610 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-audit-policies\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634636 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/63674760-f499-49b8-a575-a8ae954eada4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kmhmh\" (UID: \"63674760-f499-49b8-a575-a8ae954eada4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kmhmh"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634654 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6a43ce91-6673-4641-a2d6-551afe72688d-tmpfs\") pod \"packageserver-d55dfcdfc-hmxtq\" (UID: \"6a43ce91-6673-4641-a2d6-551afe72688d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hmxtq"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634677 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a43ce91-6673-4641-a2d6-551afe72688d-apiservice-cert\") pod \"packageserver-d55dfcdfc-hmxtq\" (UID: \"6a43ce91-6673-4641-a2d6-551afe72688d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hmxtq"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634697 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1fae478-5e9a-4da1-b2e3-35c1ee7f8fa3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2nn8b\" (UID: \"b1fae478-5e9a-4da1-b2e3-35c1ee7f8fa3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2nn8b"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634730 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b1e6fd37-d069-441e-9c0f-caf60d8f8d4f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8j2gh\" (UID: \"b1e6fd37-d069-441e-9c0f-caf60d8f8d4f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8j2gh"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634751 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76ffdcba-57d6-4636-8373-f088926a716d-metrics-certs\") pod \"router-default-5444994796-hvw7f\" (UID: \"76ffdcba-57d6-4636-8373-f088926a716d\") " pod="openshift-ingress/router-default-5444994796-hvw7f"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634777 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqq2f\" (UniqueName: \"kubernetes.io/projected/13f4a10d-50aa-41ec-9931-cb835ba1f54c-kube-api-access-zqq2f\") pod \"multus-admission-controller-857f4d67dd-zrsn2\" (UID: \"13f4a10d-50aa-41ec-9931-cb835ba1f54c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zrsn2"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634794 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.635620 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1e6fd37-d069-441e-9c0f-caf60d8f8d4f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8j2gh\" (UID: \"b1e6fd37-d069-441e-9c0f-caf60d8f8d4f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8j2gh"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.635796 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bd317655-38ea-4fdb-95d0-82adc08456a8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-25cmt\" (UID: \"bd317655-38ea-4fdb-95d0-82adc08456a8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-25cmt"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.636587 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.636580 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76ffdcba-57d6-4636-8373-f088926a716d-service-ca-bundle\") pod \"router-default-5444994796-hvw7f\" (UID: \"76ffdcba-57d6-4636-8373-f088926a716d\") " pod="openshift-ingress/router-default-5444994796-hvw7f"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.634727 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-audit-dir\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.636930 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7a0b611-cb8e-431b-b527-b6164471c85f-config\") pod \"kube-controller-manager-operator-78b949d7b-l924n\" (UID: \"b7a0b611-cb8e-431b-b527-b6164471c85f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l924n"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.637089 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6a43ce91-6673-4641-a2d6-551afe72688d-tmpfs\") pod \"packageserver-d55dfcdfc-hmxtq\" (UID: \"6a43ce91-6673-4641-a2d6-551afe72688d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hmxtq"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.637402 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5cc06c55-7085-4cc0-8399-833b4243b51e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-65ltw\" (UID: \"5cc06c55-7085-4cc0-8399-833b4243b51e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-65ltw"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.638419 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/026a2812-dc11-4b60-b911-bb41a0d39d7d-config\") pod \"kube-apiserver-operator-766d6c64bb-9lqqd\" (UID: \"026a2812-dc11-4b60-b911-bb41a0d39d7d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lqqd"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.638855 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/63674760-f499-49b8-a575-a8ae954eada4-metrics-tls\") pod \"ingress-operator-5b745b69d9-kmhmh\" (UID: \"63674760-f499-49b8-a575-a8ae954eada4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kmhmh"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.639121 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/026a2812-dc11-4b60-b911-bb41a0d39d7d-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9lqqd\" (UID: \"026a2812-dc11-4b60-b911-bb41a0d39d7d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lqqd"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.639482 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1e6fd37-d069-441e-9c0f-caf60d8f8d4f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8j2gh\" (UID: \"b1e6fd37-d069-441e-9c0f-caf60d8f8d4f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8j2gh"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.640061 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/76ffdcba-57d6-4636-8373-f088926a716d-default-certificate\") pod \"router-default-5444994796-hvw7f\" (UID: \"76ffdcba-57d6-4636-8373-f088926a716d\") " pod="openshift-ingress/router-default-5444994796-hvw7f"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.642065 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1fae478-5e9a-4da1-b2e3-35c1ee7f8fa3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2nn8b\" (UID: \"b1fae478-5e9a-4da1-b2e3-35c1ee7f8fa3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2nn8b"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.642231 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76ffdcba-57d6-4636-8373-f088926a716d-metrics-certs\") pod \"router-default-5444994796-hvw7f\" (UID: \"76ffdcba-57d6-4636-8373-f088926a716d\") " pod="openshift-ingress/router-default-5444994796-hvw7f"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.642378 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7a0b611-cb8e-431b-b527-b6164471c85f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-l924n\" (UID: \"b7a0b611-cb8e-431b-b527-b6164471c85f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l924n"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.644038 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/13f4a10d-50aa-41ec-9931-cb835ba1f54c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-zrsn2\" (UID: \"13f4a10d-50aa-41ec-9931-cb835ba1f54c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zrsn2"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.644815 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/76ffdcba-57d6-4636-8373-f088926a716d-stats-auth\") pod \"router-default-5444994796-hvw7f\" (UID: \"76ffdcba-57d6-4636-8373-f088926a716d\") " pod="openshift-ingress/router-default-5444994796-hvw7f"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.666521 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.668038 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63674760-f499-49b8-a575-a8ae954eada4-trusted-ca\") pod \"ingress-operator-5b745b69d9-kmhmh\" (UID: \"63674760-f499-49b8-a575-a8ae954eada4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kmhmh"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.676433 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.696992 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.717292 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.757230 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.757505 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.768414 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.770108 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.776783 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.787174 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-audit-policies\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.797186 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.805679 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.816281 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.818672 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.843500 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.855801 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.868248 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.876015 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.881656 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.895410 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.904057 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0f46b00f-f770-4539-92f9-60e1146308ab-proxy-tls\") pod \"machine-config-controller-84d6567774-f4qrf\" (UID: \"0f46b00f-f770-4539-92f9-60e1146308ab\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f4qrf"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.923303 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.931844 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.937065 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.949359 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.956058 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.970904 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ef2828b3-f501-4105-abc8-6b1ce9658301-profile-collector-cert\") pod \"catalog-operator-68c6474976-2phqk\" (UID: \"ef2828b3-f501-4105-abc8-6b1ce9658301\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2phqk"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.976538 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 19 15:11:51 crc kubenswrapper[4810]: I0219 15:11:51.996122 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.001469 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ef2828b3-f501-4105-abc8-6b1ce9658301-srv-cert\") pod \"catalog-operator-68c6474976-2phqk\" (UID: \"ef2828b3-f501-4105-abc8-6b1ce9658301\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2phqk"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.015845 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.031204 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.036842 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.058173 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.075457 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.076766 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.091855 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.096210 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.116848 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.131290 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a43ce91-6673-4641-a2d6-551afe72688d-apiservice-cert\") pod \"packageserver-d55dfcdfc-hmxtq\" (UID: \"6a43ce91-6673-4641-a2d6-551afe72688d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hmxtq"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.132941 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a43ce91-6673-4641-a2d6-551afe72688d-webhook-cert\") pod \"packageserver-d55dfcdfc-hmxtq\" (UID: \"6a43ce91-6673-4641-a2d6-551afe72688d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hmxtq"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.137029 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.156628 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.160674 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd317655-38ea-4fdb-95d0-82adc08456a8-serving-cert\") pod \"openshift-config-operator-7777fb866f-25cmt\" (UID: \"bd317655-38ea-4fdb-95d0-82adc08456a8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-25cmt"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.176390 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.196935 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.207864 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5cc06c55-7085-4cc0-8399-833b4243b51e-images\") pod \"machine-config-operator-74547568cd-65ltw\" (UID: \"5cc06c55-7085-4cc0-8399-833b4243b51e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-65ltw"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.217580 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.237637 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.252275 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cc06c55-7085-4cc0-8399-833b4243b51e-proxy-tls\") pod \"machine-config-operator-74547568cd-65ltw\" (UID: \"5cc06c55-7085-4cc0-8399-833b4243b51e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-65ltw"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.256233 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.276168 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.296285 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.316856 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.336843 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.356252 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.376655 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.396309 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.413883 4810 request.go:700] Waited for 1.007369853s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.416648 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.437432 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.457149 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.496144 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.498932 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2zrj\" (UniqueName: \"kubernetes.io/projected/a4b3e85d-d02e-4e13-8bab-aa86d2629d85-kube-api-access-j2zrj\") pod \"apiserver-76f77b778f-5d4rp\" (UID: \"a4b3e85d-d02e-4e13-8bab-aa86d2629d85\") " pod="openshift-apiserver/apiserver-76f77b778f-5d4rp"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.517588 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.536206 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.556181 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.576139 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.597566 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.616645 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.635969 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.656881 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.683292 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.695244 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.697876 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-5d4rp"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.717306 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.736819 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.756540 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.776271 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.796921 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.817096 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.836025 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.857705 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.876318 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.895303 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.915817 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.931385 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5d4rp"]
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.936904 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 19 15:11:52 crc kubenswrapper[4810]: W0219 15:11:52.944178 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4b3e85d_d02e_4e13_8bab_aa86d2629d85.slice/crio-53df8586a8388b6fc9e9be585e86c564e80a0916443047f532e578facc1131c5 WatchSource:0}: Error finding container 53df8586a8388b6fc9e9be585e86c564e80a0916443047f532e578facc1131c5: Status 404 returned error can't find the container with id 53df8586a8388b6fc9e9be585e86c564e80a0916443047f532e578facc1131c5
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.965824 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.975485 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 19 15:11:52 crc kubenswrapper[4810]: I0219 15:11:52.995831 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.016135 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.035628 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.055431 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.121611 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2ff8\" (UniqueName: \"kubernetes.io/projected/e8c9b478-4884-4c32-acf1-5fdec0cfac06-kube-api-access-v2ff8\") pod \"dns-operator-744455d44c-wpzzq\" (UID: \"e8c9b478-4884-4c32-acf1-5fdec0cfac06\") " pod="openshift-dns-operator/dns-operator-744455d44c-wpzzq"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.143773 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c9k2\" (UniqueName: \"kubernetes.io/projected/0f46b00f-f770-4539-92f9-60e1146308ab-kube-api-access-7c9k2\") pod \"machine-config-controller-84d6567774-f4qrf\" (UID: \"0f46b00f-f770-4539-92f9-60e1146308ab\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f4qrf"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.157694 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87t8g\" (UniqueName: \"kubernetes.io/projected/0e4ea5aa-2074-4100-a916-6bdfb3331d43-kube-api-access-87t8g\") pod \"cluster-image-registry-operator-dc59b4c8b-xhg7l\" (UID: \"0e4ea5aa-2074-4100-a916-6bdfb3331d43\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xhg7l"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.169887 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr7hk\" (UniqueName: \"kubernetes.io/projected/f7aa90a6-963b-4ff0-b6de-ad46fa896e18-kube-api-access-kr7hk\") pod \"openshift-apiserver-operator-796bbdcf4f-55x9c\" (UID: \"f7aa90a6-963b-4ff0-b6de-ad46fa896e18\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55x9c"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.188616 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gzjq\" (UniqueName: \"kubernetes.io/projected/239280e2-b335-4f87-89a8-00cb6f8e3c69-kube-api-access-8gzjq\") pod \"authentication-operator-69f744f599-jm2nk\" (UID: \"239280e2-b335-4f87-89a8-00cb6f8e3c69\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm2nk"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.202193 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-wpzzq"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.216079 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4sn5\" (UniqueName: \"kubernetes.io/projected/fee373cf-50b8-42f4-b30d-4a3d230ca27e-kube-api-access-v4sn5\") pod \"etcd-operator-b45778765-d8pqg\" (UID: \"fee373cf-50b8-42f4-b30d-4a3d230ca27e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d8pqg"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.227824 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0e4ea5aa-2074-4100-a916-6bdfb3331d43-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xhg7l\" (UID: \"0e4ea5aa-2074-4100-a916-6bdfb3331d43\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xhg7l"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.251399 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l478x\" (UniqueName: \"kubernetes.io/projected/ad43df2c-4944-45e2-919f-0c297f4092d4-kube-api-access-l478x\") pod \"route-controller-manager-6576b87f9c-n8zth\" (UID: \"ad43df2c-4944-45e2-919f-0c297f4092d4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.265885 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xhg7l"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.270034 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-d8pqg"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.270612 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hxq2\" (UniqueName: \"kubernetes.io/projected/c696fe96-0485-44d0-b4fb-161503c334e8-kube-api-access-9hxq2\") pod \"apiserver-7bbb656c7d-q6zwc\" (UID: \"c696fe96-0485-44d0-b4fb-161503c334e8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.292217 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.303285 4810 generic.go:334] "Generic (PLEG): container finished" podID="a4b3e85d-d02e-4e13-8bab-aa86d2629d85" containerID="e1298251b6d3f264c15b9e63f0ef24620188b4f18194f013ad69b9aa3b66a9fa" exitCode=0
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.303365 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5d4rp" event={"ID":"a4b3e85d-d02e-4e13-8bab-aa86d2629d85","Type":"ContainerDied","Data":"e1298251b6d3f264c15b9e63f0ef24620188b4f18194f013ad69b9aa3b66a9fa"}
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.303401 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5d4rp" event={"ID":"a4b3e85d-d02e-4e13-8bab-aa86d2629d85","Type":"ContainerStarted","Data":"53df8586a8388b6fc9e9be585e86c564e80a0916443047f532e578facc1131c5"}
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.305741 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x4fb\" (UniqueName: \"kubernetes.io/projected/9a7776ca-1a56-4eca-9e44-ba1b7b15510f-kube-api-access-7x4fb\") pod \"machine-api-operator-5694c8668f-l66cb\" (UID: \"9a7776ca-1a56-4eca-9e44-ba1b7b15510f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l66cb"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.311578 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnqb7\" (UniqueName: \"kubernetes.io/projected/362cd55c-b576-44bd-843c-078bf26b3b1e-kube-api-access-pnqb7\") pod \"console-f9d7485db-4hddt\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " pod="openshift-console/console-f9d7485db-4hddt"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.319723 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-l66cb"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.339950 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgcbs\" (UniqueName: \"kubernetes.io/projected/8f4b9328-0efe-42a8-9a73-a80eb6a26151-kube-api-access-hgcbs\") pod \"openshift-controller-manager-operator-756b6f6bc6-kclsk\" (UID: \"8f4b9328-0efe-42a8-9a73-a80eb6a26151\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kclsk"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.357028 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mbjx\" (UniqueName: \"kubernetes.io/projected/7a29951a-027e-49b4-a7ea-a8e363942414-kube-api-access-2mbjx\") pod \"downloads-7954f5f757-dkppn\" (UID: \"7a29951a-027e-49b4-a7ea-a8e363942414\") " pod="openshift-console/downloads-7954f5f757-dkppn"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.370867 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kclsk"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.371468 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh2sf\" (UniqueName: \"kubernetes.io/projected/77544608-a940-4c0c-9a1a-a5a98f480134-kube-api-access-jh2sf\") pod \"machine-approver-56656f9798-8mxdc\" (UID: \"77544608-a940-4c0c-9a1a-a5a98f480134\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8mxdc"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.380595 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8mxdc"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.380688 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wpzzq"]
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.389783 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f4qrf"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.392981 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k7k5\" (UniqueName: \"kubernetes.io/projected/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-kube-api-access-8k7k5\") pod \"controller-manager-879f6c89f-sl5p9\" (UID: \"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.411554 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh895\" (UniqueName: \"kubernetes.io/projected/25e05c8a-335b-405c-9033-f689c21c5ecc-kube-api-access-jh895\") pod \"cluster-samples-operator-665b6dd947-xhzlb\" (UID: \"25e05c8a-335b-405c-9033-f689c21c5ecc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhzlb"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.414319 4810 request.go:700] Waited for 1.87568074s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.415436 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.437768 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55x9c"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.439814 4810 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.448622 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jm2nk"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.456238 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.476729 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.499998 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.516043 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xhg7l"]
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.526107 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.537745 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.542201 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc"]
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.544220 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhzlb"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.545088 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.557276 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.575561 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.576893 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-d8pqg"]
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.578099 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4hddt"
Feb 19 15:11:53 crc kubenswrapper[4810]: W0219 15:11:53.579302 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc696fe96_0485_44d0_b4fb_161503c334e8.slice/crio-4cfc3c3a4162660b44696564144790aac4145b83976abe219d6df4b4502d2dc6 WatchSource:0}: Error finding container 4cfc3c3a4162660b44696564144790aac4145b83976abe219d6df4b4502d2dc6: Status 404 returned error can't find the container with id 4cfc3c3a4162660b44696564144790aac4145b83976abe219d6df4b4502d2dc6
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.585012 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-dkppn"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.595831 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.604921 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-l66cb"]
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.645142 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.659054 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbnnp\" (UniqueName: \"kubernetes.io/projected/bd317655-38ea-4fdb-95d0-82adc08456a8-kube-api-access-dbnnp\") pod \"openshift-config-operator-7777fb866f-25cmt\" (UID: \"bd317655-38ea-4fdb-95d0-82adc08456a8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-25cmt"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.675771 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/026a2812-dc11-4b60-b911-bb41a0d39d7d-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9lqqd\" (UID: \"026a2812-dc11-4b60-b911-bb41a0d39d7d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lqqd"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.686965 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kclsk"]
Feb 19 15:11:53 crc kubenswrapper[4810]: W0219 15:11:53.690713 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfee373cf_50b8_42f4_b30d_4a3d230ca27e.slice/crio-3989d96419fba09da07c9e99ef43b0eb4df43c01359b3c17be42be6e9e26f30f WatchSource:0}: Error finding container 3989d96419fba09da07c9e99ef43b0eb4df43c01359b3c17be42be6e9e26f30f: Status 404 returned error can't find the container with id 3989d96419fba09da07c9e99ef43b0eb4df43c01359b3c17be42be6e9e26f30f
Feb 19 15:11:53 crc kubenswrapper[4810]: W0219 15:11:53.696037 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a7776ca_1a56_4eca_9e44_ba1b7b15510f.slice/crio-b87f6328992f13d54a1ed3e925e1d3835339d68ebe81ed0173553e621b7c78da WatchSource:0}: Error finding container b87f6328992f13d54a1ed3e925e1d3835339d68ebe81ed0173553e621b7c78da: Status 404 returned error can't find the container with id b87f6328992f13d54a1ed3e925e1d3835339d68ebe81ed0173553e621b7c78da
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.700893 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2f48\" (UniqueName: \"kubernetes.io/projected/63674760-f499-49b8-a575-a8ae954eada4-kube-api-access-d2f48\") pod \"ingress-operator-5b745b69d9-kmhmh\" (UID: \"63674760-f499-49b8-a575-a8ae954eada4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kmhmh"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.704083 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-25cmt"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.707404 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-f4qrf"]
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.720063 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ff4j\" (UniqueName: \"kubernetes.io/projected/b1fae478-5e9a-4da1-b2e3-35c1ee7f8fa3-kube-api-access-8ff4j\") pod \"package-server-manager-789f6589d5-2nn8b\" (UID: \"b1fae478-5e9a-4da1-b2e3-35c1ee7f8fa3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2nn8b"
Feb 19 15:11:53 crc kubenswrapper[4810]: W0219 15:11:53.724495 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f4b9328_0efe_42a8_9a73_a80eb6a26151.slice/crio-724adfe31aa7fad7a5da317cec629a9441c15ea118c52e1a602faab0955fb919 WatchSource:0}: Error finding container 724adfe31aa7fad7a5da317cec629a9441c15ea118c52e1a602faab0955fb919: Status 404 returned error can't find the container with id 724adfe31aa7fad7a5da317cec629a9441c15ea118c52e1a602faab0955fb919
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.733967 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxw8d\" (UniqueName: \"kubernetes.io/projected/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-kube-api-access-zxw8d\") pod \"oauth-openshift-558db77b4-r74mv\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " pod="openshift-authentication/oauth-openshift-558db77b4-r74mv"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.763510 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9hrx\" (UniqueName: \"kubernetes.io/projected/ef2828b3-f501-4105-abc8-6b1ce9658301-kube-api-access-x9hrx\") pod \"catalog-operator-68c6474976-2phqk\" (UID: \"ef2828b3-f501-4105-abc8-6b1ce9658301\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2phqk"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.765409 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jm2nk"]
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.780583 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9xt4\" (UniqueName: \"kubernetes.io/projected/5cc06c55-7085-4cc0-8399-833b4243b51e-kube-api-access-x9xt4\") pod \"machine-config-operator-74547568cd-65ltw\" (UID: \"5cc06c55-7085-4cc0-8399-833b4243b51e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-65ltw"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.792241 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/63674760-f499-49b8-a575-a8ae954eada4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kmhmh\" (UID: \"63674760-f499-49b8-a575-a8ae954eada4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kmhmh"
Feb 19 15:11:53 crc kubenswrapper[4810]: W0219 15:11:53.799196 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod239280e2_b335_4f87_89a8_00cb6f8e3c69.slice/crio-f40aa4e7f50582e05ed4f854cfdd01edd619d078425d9ec703efac5568e62189 WatchSource:0}: Error finding container f40aa4e7f50582e05ed4f854cfdd01edd619d078425d9ec703efac5568e62189: Status 404 returned error can't find the container with id f40aa4e7f50582e05ed4f854cfdd01edd619d078425d9ec703efac5568e62189
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.828413 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmvtp\" (UniqueName: \"kubernetes.io/projected/6a43ce91-6673-4641-a2d6-551afe72688d-kube-api-access-xmvtp\") pod \"packageserver-d55dfcdfc-hmxtq\" (UID: \"6a43ce91-6673-4641-a2d6-551afe72688d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hmxtq"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.842241 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7a0b611-cb8e-431b-b527-b6164471c85f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-l924n\" (UID: \"b7a0b611-cb8e-431b-b527-b6164471c85f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l924n"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.844851 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55x9c"]
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.865510 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lhl5\" (UniqueName: \"kubernetes.io/projected/76ffdcba-57d6-4636-8373-f088926a716d-kube-api-access-7lhl5\") pod \"router-default-5444994796-hvw7f\" (UID: \"76ffdcba-57d6-4636-8373-f088926a716d\") " pod="openshift-ingress/router-default-5444994796-hvw7f"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.891970 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b1e6fd37-d069-441e-9c0f-caf60d8f8d4f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8j2gh\" (UID: \"b1e6fd37-d069-441e-9c0f-caf60d8f8d4f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8j2gh"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.905978 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqq2f\" (UniqueName: \"kubernetes.io/projected/13f4a10d-50aa-41ec-9931-cb835ba1f54c-kube-api-access-zqq2f\") pod \"multus-admission-controller-857f4d67dd-zrsn2\" (UID: \"13f4a10d-50aa-41ec-9931-cb835ba1f54c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zrsn2"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.914901 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2nn8b"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.915738 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-hvw7f"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.947799 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lqqd"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.949617 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l924n"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.953537 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-4hddt"]
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.956906 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8j2gh"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.965261 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-zrsn2"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.971117 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g96x2\" (UniqueName: \"kubernetes.io/projected/a6cb5092-5f01-4dd9-a940-804d88907744-kube-api-access-g96x2\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.971168 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/969b9cb6-f89f-47e9-b8f7-754804a41dea-serving-cert\") pod \"console-operator-58897d9998-h68pj\" (UID: \"969b9cb6-f89f-47e9-b8f7-754804a41dea\") " pod="openshift-console-operator/console-operator-58897d9998-h68pj"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.971196 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a6cb5092-5f01-4dd9-a940-804d88907744-registry-certificates\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2"
Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.971667 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kmhmh" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.971730 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b8b1faba-e1b8-436c-aa84-ae4353c5f0a9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-r2tqm\" (UID: \"b8b1faba-e1b8-436c-aa84-ae4353c5f0a9\") " pod="openshift-marketplace/marketplace-operator-79b997595-r2tqm" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.971860 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d3068e5d-4ff8-438f-958b-f0f90e773ca1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mkt5x\" (UID: \"d3068e5d-4ff8-438f-958b-f0f90e773ca1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mkt5x" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.971902 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a6cb5092-5f01-4dd9-a940-804d88907744-installation-pull-secrets\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.972623 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/969b9cb6-f89f-47e9-b8f7-754804a41dea-trusted-ca\") pod \"console-operator-58897d9998-h68pj\" (UID: \"969b9cb6-f89f-47e9-b8f7-754804a41dea\") " pod="openshift-console-operator/console-operator-58897d9998-h68pj" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.972722 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/74295390-5384-402d-8c5b-dc2559bb6d9c-signing-key\") pod \"service-ca-9c57cc56f-frmnw\" (UID: \"74295390-5384-402d-8c5b-dc2559bb6d9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-frmnw" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.972759 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a6cb5092-5f01-4dd9-a940-804d88907744-registry-tls\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.972805 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d9ff005-2d60-4eb5-b8fa-59b84661617f-config-volume\") pod \"dns-default-zgcrd\" (UID: \"9d9ff005-2d60-4eb5-b8fa-59b84661617f\") " pod="openshift-dns/dns-default-zgcrd" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.972879 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c7e5d88-e6b7-416c-abbd-eed95cc772de-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rwvxk\" (UID: \"0c7e5d88-e6b7-416c-abbd-eed95cc772de\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rwvxk" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.972907 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a6cb5092-5f01-4dd9-a940-804d88907744-trusted-ca\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.972929 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a6cb5092-5f01-4dd9-a940-804d88907744-ca-trust-extracted\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.973001 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b8f83b3-fabb-4404-88ce-e64c4db8a568-config\") pod \"service-ca-operator-777779d784-n6fl9\" (UID: \"9b8f83b3-fabb-4404-88ce-e64c4db8a568\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n6fl9" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.973061 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxtvx\" (UniqueName: \"kubernetes.io/projected/d3068e5d-4ff8-438f-958b-f0f90e773ca1-kube-api-access-sxtvx\") pod \"olm-operator-6b444d44fb-mkt5x\" (UID: \"d3068e5d-4ff8-438f-958b-f0f90e773ca1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mkt5x" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.973165 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a6cb5092-5f01-4dd9-a940-804d88907744-bound-sa-token\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.973227 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b8f83b3-fabb-4404-88ce-e64c4db8a568-serving-cert\") pod \"service-ca-operator-777779d784-n6fl9\" (UID: \"9b8f83b3-fabb-4404-88ce-e64c4db8a568\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n6fl9" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.973260 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/969b9cb6-f89f-47e9-b8f7-754804a41dea-config\") pod \"console-operator-58897d9998-h68pj\" (UID: \"969b9cb6-f89f-47e9-b8f7-754804a41dea\") " pod="openshift-console-operator/console-operator-58897d9998-h68pj" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.973291 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c7e5d88-e6b7-416c-abbd-eed95cc772de-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rwvxk\" (UID: \"0c7e5d88-e6b7-416c-abbd-eed95cc772de\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rwvxk" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.973361 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfppl\" (UniqueName: \"kubernetes.io/projected/9d9ff005-2d60-4eb5-b8fa-59b84661617f-kube-api-access-dfppl\") pod \"dns-default-zgcrd\" (UID: \"9d9ff005-2d60-4eb5-b8fa-59b84661617f\") " pod="openshift-dns/dns-default-zgcrd" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.973396 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45klr\" (UniqueName: \"kubernetes.io/projected/e3f898c2-3784-4157-b1c5-fdadcaf69bef-kube-api-access-45klr\") pod \"migrator-59844c95c7-w6k69\" (UID: \"e3f898c2-3784-4157-b1c5-fdadcaf69bef\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-w6k69" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.973681 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/64dc0d58-11d4-456b-97ab-a4d3ec28225b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-895xv\" (UID: \"64dc0d58-11d4-456b-97ab-a4d3ec28225b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-895xv" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.973705 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdm7r\" (UniqueName: \"kubernetes.io/projected/8383a9e3-149b-4512-a9fd-12cd0b65e370-kube-api-access-jdm7r\") pod \"collect-profiles-29525220-wqnml\" (UID: \"8383a9e3-149b-4512-a9fd-12cd0b65e370\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525220-wqnml" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.973747 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hdzd\" (UniqueName: \"kubernetes.io/projected/969b9cb6-f89f-47e9-b8f7-754804a41dea-kube-api-access-4hdzd\") pod \"console-operator-58897d9998-h68pj\" (UID: \"969b9cb6-f89f-47e9-b8f7-754804a41dea\") " pod="openshift-console-operator/console-operator-58897d9998-h68pj" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.973865 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rrc6\" (UniqueName: \"kubernetes.io/projected/74295390-5384-402d-8c5b-dc2559bb6d9c-kube-api-access-8rrc6\") pod \"service-ca-9c57cc56f-frmnw\" (UID: \"74295390-5384-402d-8c5b-dc2559bb6d9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-frmnw" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.973936 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.973958 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8mnc\" (UniqueName: \"kubernetes.io/projected/0c7e5d88-e6b7-416c-abbd-eed95cc772de-kube-api-access-z8mnc\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-rwvxk\" (UID: \"0c7e5d88-e6b7-416c-abbd-eed95cc772de\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rwvxk" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.973978 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x6lv\" (UniqueName: \"kubernetes.io/projected/64dc0d58-11d4-456b-97ab-a4d3ec28225b-kube-api-access-6x6lv\") pod \"control-plane-machine-set-operator-78cbb6b69f-895xv\" (UID: \"64dc0d58-11d4-456b-97ab-a4d3ec28225b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-895xv" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.974028 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9vz7\" (UniqueName: \"kubernetes.io/projected/b8b1faba-e1b8-436c-aa84-ae4353c5f0a9-kube-api-access-q9vz7\") pod \"marketplace-operator-79b997595-r2tqm\" (UID: \"b8b1faba-e1b8-436c-aa84-ae4353c5f0a9\") " pod="openshift-marketplace/marketplace-operator-79b997595-r2tqm" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.974049 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d9ff005-2d60-4eb5-b8fa-59b84661617f-metrics-tls\") pod \"dns-default-zgcrd\" (UID: \"9d9ff005-2d60-4eb5-b8fa-59b84661617f\") " pod="openshift-dns/dns-default-zgcrd" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.974199 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/74295390-5384-402d-8c5b-dc2559bb6d9c-signing-cabundle\") pod \"service-ca-9c57cc56f-frmnw\" (UID: \"74295390-5384-402d-8c5b-dc2559bb6d9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-frmnw" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.974237 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8383a9e3-149b-4512-a9fd-12cd0b65e370-config-volume\") pod \"collect-profiles-29525220-wqnml\" (UID: \"8383a9e3-149b-4512-a9fd-12cd0b65e370\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525220-wqnml" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.974263 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8383a9e3-149b-4512-a9fd-12cd0b65e370-secret-volume\") pod \"collect-profiles-29525220-wqnml\" (UID: \"8383a9e3-149b-4512-a9fd-12cd0b65e370\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525220-wqnml" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.974353 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8b1faba-e1b8-436c-aa84-ae4353c5f0a9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-r2tqm\" (UID: \"b8b1faba-e1b8-436c-aa84-ae4353c5f0a9\") " pod="openshift-marketplace/marketplace-operator-79b997595-r2tqm" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.974388 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncbgj\" (UniqueName: 
\"kubernetes.io/projected/9b8f83b3-fabb-4404-88ce-e64c4db8a568-kube-api-access-ncbgj\") pod \"service-ca-operator-777779d784-n6fl9\" (UID: \"9b8f83b3-fabb-4404-88ce-e64c4db8a568\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n6fl9" Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.974503 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d3068e5d-4ff8-438f-958b-f0f90e773ca1-srv-cert\") pod \"olm-operator-6b444d44fb-mkt5x\" (UID: \"d3068e5d-4ff8-438f-958b-f0f90e773ca1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mkt5x" Feb 19 15:11:53 crc kubenswrapper[4810]: E0219 15:11:53.977905 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:54.477888213 +0000 UTC m=+143.959918337 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:53 crc kubenswrapper[4810]: I0219 15:11:53.978968 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.006185 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2phqk" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.010222 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hmxtq" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.024191 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-65ltw" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.075804 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:11:54 crc kubenswrapper[4810]: E0219 15:11:54.076206 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:54.576169417 +0000 UTC m=+144.058199541 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.076932 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d3068e5d-4ff8-438f-958b-f0f90e773ca1-srv-cert\") pod \"olm-operator-6b444d44fb-mkt5x\" (UID: \"d3068e5d-4ff8-438f-958b-f0f90e773ca1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mkt5x" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.076971 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g96x2\" (UniqueName: \"kubernetes.io/projected/a6cb5092-5f01-4dd9-a940-804d88907744-kube-api-access-g96x2\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.076999 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a6cb5092-5f01-4dd9-a940-804d88907744-registry-certificates\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077027 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/969b9cb6-f89f-47e9-b8f7-754804a41dea-serving-cert\") pod \"console-operator-58897d9998-h68pj\" (UID: \"969b9cb6-f89f-47e9-b8f7-754804a41dea\") " pod="openshift-console-operator/console-operator-58897d9998-h68pj" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077057 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/59eb54a2-ac34-429b-a275-7d365be9ad2f-node-bootstrap-token\") pod \"machine-config-server-hlw9s\" (UID: \"59eb54a2-ac34-429b-a275-7d365be9ad2f\") " pod="openshift-machine-config-operator/machine-config-server-hlw9s" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077082 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b8b1faba-e1b8-436c-aa84-ae4353c5f0a9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-r2tqm\" (UID: \"b8b1faba-e1b8-436c-aa84-ae4353c5f0a9\") " pod="openshift-marketplace/marketplace-operator-79b997595-r2tqm" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077100 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cbdn\" (UniqueName: \"kubernetes.io/projected/59eb54a2-ac34-429b-a275-7d365be9ad2f-kube-api-access-2cbdn\") pod \"machine-config-server-hlw9s\" (UID: \"59eb54a2-ac34-429b-a275-7d365be9ad2f\") " pod="openshift-machine-config-operator/machine-config-server-hlw9s" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077128 4810 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d3068e5d-4ff8-438f-958b-f0f90e773ca1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mkt5x\" (UID: \"d3068e5d-4ff8-438f-958b-f0f90e773ca1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mkt5x" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077145 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/51a1f271-446d-42d2-b946-ad816257e990-mountpoint-dir\") pod \"csi-hostpathplugin-4nds4\" (UID: \"51a1f271-446d-42d2-b946-ad816257e990\") " pod="hostpath-provisioner/csi-hostpathplugin-4nds4" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077162 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a6cb5092-5f01-4dd9-a940-804d88907744-installation-pull-secrets\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077179 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/969b9cb6-f89f-47e9-b8f7-754804a41dea-trusted-ca\") pod \"console-operator-58897d9998-h68pj\" (UID: \"969b9cb6-f89f-47e9-b8f7-754804a41dea\") " pod="openshift-console-operator/console-operator-58897d9998-h68pj" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077198 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/74295390-5384-402d-8c5b-dc2559bb6d9c-signing-key\") pod \"service-ca-9c57cc56f-frmnw\" (UID: \"74295390-5384-402d-8c5b-dc2559bb6d9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-frmnw" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077216 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a6cb5092-5f01-4dd9-a940-804d88907744-registry-tls\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077235 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d9ff005-2d60-4eb5-b8fa-59b84661617f-config-volume\") pod \"dns-default-zgcrd\" (UID: \"9d9ff005-2d60-4eb5-b8fa-59b84661617f\") " pod="openshift-dns/dns-default-zgcrd" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077259 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a6cb5092-5f01-4dd9-a940-804d88907744-trusted-ca\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077278 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c7e5d88-e6b7-416c-abbd-eed95cc772de-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rwvxk\" (UID: \"0c7e5d88-e6b7-416c-abbd-eed95cc772de\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rwvxk" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077296 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a6cb5092-5f01-4dd9-a940-804d88907744-ca-trust-extracted\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077335 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b8f83b3-fabb-4404-88ce-e64c4db8a568-config\") pod \"service-ca-operator-777779d784-n6fl9\" (UID: \"9b8f83b3-fabb-4404-88ce-e64c4db8a568\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n6fl9" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077355 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxtvx\" (UniqueName: \"kubernetes.io/projected/d3068e5d-4ff8-438f-958b-f0f90e773ca1-kube-api-access-sxtvx\") pod \"olm-operator-6b444d44fb-mkt5x\" (UID: \"d3068e5d-4ff8-438f-958b-f0f90e773ca1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mkt5x" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077378 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9cg2\" (UniqueName: \"kubernetes.io/projected/500c6ae7-fd1f-4af6-ad6c-f1db0c2af222-kube-api-access-m9cg2\") pod \"ingress-canary-cdfxh\" (UID: \"500c6ae7-fd1f-4af6-ad6c-f1db0c2af222\") " pod="openshift-ingress-canary/ingress-canary-cdfxh" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077397 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jld4k\" (UniqueName: \"kubernetes.io/projected/51a1f271-446d-42d2-b946-ad816257e990-kube-api-access-jld4k\") pod \"csi-hostpathplugin-4nds4\" (UID: \"51a1f271-446d-42d2-b946-ad816257e990\") " pod="hostpath-provisioner/csi-hostpathplugin-4nds4" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077418 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a6cb5092-5f01-4dd9-a940-804d88907744-bound-sa-token\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077438 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b8f83b3-fabb-4404-88ce-e64c4db8a568-serving-cert\") pod \"service-ca-operator-777779d784-n6fl9\" (UID: \"9b8f83b3-fabb-4404-88ce-e64c4db8a568\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n6fl9" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077467 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/969b9cb6-f89f-47e9-b8f7-754804a41dea-config\") pod \"console-operator-58897d9998-h68pj\" (UID: \"969b9cb6-f89f-47e9-b8f7-754804a41dea\") " pod="openshift-console-operator/console-operator-58897d9998-h68pj" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077490 4810 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c7e5d88-e6b7-416c-abbd-eed95cc772de-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rwvxk\" (UID: \"0c7e5d88-e6b7-416c-abbd-eed95cc772de\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rwvxk" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077512 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45klr\" (UniqueName: \"kubernetes.io/projected/e3f898c2-3784-4157-b1c5-fdadcaf69bef-kube-api-access-45klr\") pod \"migrator-59844c95c7-w6k69\" (UID: \"e3f898c2-3784-4157-b1c5-fdadcaf69bef\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-w6k69" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077533 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfppl\" (UniqueName: \"kubernetes.io/projected/9d9ff005-2d60-4eb5-b8fa-59b84661617f-kube-api-access-dfppl\") pod \"dns-default-zgcrd\" (UID: \"9d9ff005-2d60-4eb5-b8fa-59b84661617f\") " pod="openshift-dns/dns-default-zgcrd" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077555 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/64dc0d58-11d4-456b-97ab-a4d3ec28225b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-895xv\" (UID: \"64dc0d58-11d4-456b-97ab-a4d3ec28225b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-895xv" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077576 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/51a1f271-446d-42d2-b946-ad816257e990-socket-dir\") pod \"csi-hostpathplugin-4nds4\" (UID: \"51a1f271-446d-42d2-b946-ad816257e990\") " pod="hostpath-provisioner/csi-hostpathplugin-4nds4" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077597 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/51a1f271-446d-42d2-b946-ad816257e990-plugins-dir\") pod \"csi-hostpathplugin-4nds4\" (UID: \"51a1f271-446d-42d2-b946-ad816257e990\") " pod="hostpath-provisioner/csi-hostpathplugin-4nds4" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077617 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdm7r\" (UniqueName: \"kubernetes.io/projected/8383a9e3-149b-4512-a9fd-12cd0b65e370-kube-api-access-jdm7r\") pod \"collect-profiles-29525220-wqnml\" (UID: \"8383a9e3-149b-4512-a9fd-12cd0b65e370\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525220-wqnml" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077635 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/51a1f271-446d-42d2-b946-ad816257e990-registration-dir\") pod \"csi-hostpathplugin-4nds4\" (UID: \"51a1f271-446d-42d2-b946-ad816257e990\") " pod="hostpath-provisioner/csi-hostpathplugin-4nds4" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077658 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hdzd\" (UniqueName: 
\"kubernetes.io/projected/969b9cb6-f89f-47e9-b8f7-754804a41dea-kube-api-access-4hdzd\") pod \"console-operator-58897d9998-h68pj\" (UID: \"969b9cb6-f89f-47e9-b8f7-754804a41dea\") " pod="openshift-console-operator/console-operator-58897d9998-h68pj" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077677 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rrc6\" (UniqueName: \"kubernetes.io/projected/74295390-5384-402d-8c5b-dc2559bb6d9c-kube-api-access-8rrc6\") pod \"service-ca-9c57cc56f-frmnw\" (UID: \"74295390-5384-402d-8c5b-dc2559bb6d9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-frmnw" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077695 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/500c6ae7-fd1f-4af6-ad6c-f1db0c2af222-cert\") pod \"ingress-canary-cdfxh\" (UID: \"500c6ae7-fd1f-4af6-ad6c-f1db0c2af222\") " pod="openshift-ingress-canary/ingress-canary-cdfxh" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.077728 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8mnc\" (UniqueName: \"kubernetes.io/projected/0c7e5d88-e6b7-416c-abbd-eed95cc772de-kube-api-access-z8mnc\") pod \"kube-storage-version-migrator-operator-b67b599dd-rwvxk\" (UID: \"0c7e5d88-e6b7-416c-abbd-eed95cc772de\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rwvxk" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.078359 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x6lv\" (UniqueName: \"kubernetes.io/projected/64dc0d58-11d4-456b-97ab-a4d3ec28225b-kube-api-access-6x6lv\") pod \"control-plane-machine-set-operator-78cbb6b69f-895xv\" (UID: \"64dc0d58-11d4-456b-97ab-a4d3ec28225b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-895xv" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.078381 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.078404 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9vz7\" (UniqueName: \"kubernetes.io/projected/b8b1faba-e1b8-436c-aa84-ae4353c5f0a9-kube-api-access-q9vz7\") pod \"marketplace-operator-79b997595-r2tqm\" (UID: \"b8b1faba-e1b8-436c-aa84-ae4353c5f0a9\") " pod="openshift-marketplace/marketplace-operator-79b997595-r2tqm" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.078421 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d9ff005-2d60-4eb5-b8fa-59b84661617f-metrics-tls\") pod \"dns-default-zgcrd\" (UID: \"9d9ff005-2d60-4eb5-b8fa-59b84661617f\") " pod="openshift-dns/dns-default-zgcrd" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.078448 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/59eb54a2-ac34-429b-a275-7d365be9ad2f-certs\") pod \"machine-config-server-hlw9s\" (UID: 
\"59eb54a2-ac34-429b-a275-7d365be9ad2f\") " pod="openshift-machine-config-operator/machine-config-server-hlw9s" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.078476 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/74295390-5384-402d-8c5b-dc2559bb6d9c-signing-cabundle\") pod \"service-ca-9c57cc56f-frmnw\" (UID: \"74295390-5384-402d-8c5b-dc2559bb6d9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-frmnw" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.078505 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8383a9e3-149b-4512-a9fd-12cd0b65e370-config-volume\") pod \"collect-profiles-29525220-wqnml\" (UID: \"8383a9e3-149b-4512-a9fd-12cd0b65e370\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525220-wqnml" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.078523 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8383a9e3-149b-4512-a9fd-12cd0b65e370-secret-volume\") pod \"collect-profiles-29525220-wqnml\" (UID: \"8383a9e3-149b-4512-a9fd-12cd0b65e370\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525220-wqnml" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.078542 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8b1faba-e1b8-436c-aa84-ae4353c5f0a9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-r2tqm\" (UID: \"b8b1faba-e1b8-436c-aa84-ae4353c5f0a9\") " pod="openshift-marketplace/marketplace-operator-79b997595-r2tqm" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.078561 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncbgj\" (UniqueName: \"kubernetes.io/projected/9b8f83b3-fabb-4404-88ce-e64c4db8a568-kube-api-access-ncbgj\") pod \"service-ca-operator-777779d784-n6fl9\" (UID: \"9b8f83b3-fabb-4404-88ce-e64c4db8a568\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n6fl9" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.078581 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/51a1f271-446d-42d2-b946-ad816257e990-csi-data-dir\") pod \"csi-hostpathplugin-4nds4\" (UID: \"51a1f271-446d-42d2-b946-ad816257e990\") " pod="hostpath-provisioner/csi-hostpathplugin-4nds4" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.078394 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d9ff005-2d60-4eb5-b8fa-59b84661617f-config-volume\") pod \"dns-default-zgcrd\" (UID: \"9d9ff005-2d60-4eb5-b8fa-59b84661617f\") " pod="openshift-dns/dns-default-zgcrd" Feb 19 15:11:54 crc kubenswrapper[4810]: E0219 15:11:54.079786 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:54.579774076 +0000 UTC m=+144.061804200 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.081474 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a6cb5092-5f01-4dd9-a940-804d88907744-registry-certificates\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.081626 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a6cb5092-5f01-4dd9-a940-804d88907744-trusted-ca\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.082592 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8b1faba-e1b8-436c-aa84-ae4353c5f0a9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-r2tqm\" (UID: \"b8b1faba-e1b8-436c-aa84-ae4353c5f0a9\") " pod="openshift-marketplace/marketplace-operator-79b997595-r2tqm" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.083347 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/74295390-5384-402d-8c5b-dc2559bb6d9c-signing-cabundle\") pod \"service-ca-9c57cc56f-frmnw\" (UID: \"74295390-5384-402d-8c5b-dc2559bb6d9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-frmnw" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.084922 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a6cb5092-5f01-4dd9-a940-804d88907744-ca-trust-extracted\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.084988 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b8f83b3-fabb-4404-88ce-e64c4db8a568-serving-cert\") pod \"service-ca-operator-777779d784-n6fl9\" (UID: \"9b8f83b3-fabb-4404-88ce-e64c4db8a568\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n6fl9" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.085635 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b8f83b3-fabb-4404-88ce-e64c4db8a568-config\") pod \"service-ca-operator-777779d784-n6fl9\" (UID: \"9b8f83b3-fabb-4404-88ce-e64c4db8a568\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n6fl9" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.085873 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8383a9e3-149b-4512-a9fd-12cd0b65e370-config-volume\") pod 
\"collect-profiles-29525220-wqnml\" (UID: \"8383a9e3-149b-4512-a9fd-12cd0b65e370\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525220-wqnml" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.086132 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a6cb5092-5f01-4dd9-a940-804d88907744-installation-pull-secrets\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.087125 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/969b9cb6-f89f-47e9-b8f7-754804a41dea-trusted-ca\") pod \"console-operator-58897d9998-h68pj\" (UID: \"969b9cb6-f89f-47e9-b8f7-754804a41dea\") " pod="openshift-console-operator/console-operator-58897d9998-h68pj" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.087381 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8383a9e3-149b-4512-a9fd-12cd0b65e370-secret-volume\") pod \"collect-profiles-29525220-wqnml\" (UID: \"8383a9e3-149b-4512-a9fd-12cd0b65e370\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525220-wqnml" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.087830 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/969b9cb6-f89f-47e9-b8f7-754804a41dea-config\") pod \"console-operator-58897d9998-h68pj\" (UID: \"969b9cb6-f89f-47e9-b8f7-754804a41dea\") " pod="openshift-console-operator/console-operator-58897d9998-h68pj" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.087937 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c7e5d88-e6b7-416c-abbd-eed95cc772de-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rwvxk\" (UID: \"0c7e5d88-e6b7-416c-abbd-eed95cc772de\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rwvxk" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.088516 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d9ff005-2d60-4eb5-b8fa-59b84661617f-metrics-tls\") pod \"dns-default-zgcrd\" (UID: \"9d9ff005-2d60-4eb5-b8fa-59b84661617f\") " pod="openshift-dns/dns-default-zgcrd" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.089199 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/969b9cb6-f89f-47e9-b8f7-754804a41dea-serving-cert\") pod \"console-operator-58897d9998-h68pj\" (UID: \"969b9cb6-f89f-47e9-b8f7-754804a41dea\") " pod="openshift-console-operator/console-operator-58897d9998-h68pj" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.091465 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/64dc0d58-11d4-456b-97ab-a4d3ec28225b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-895xv\" (UID: \"64dc0d58-11d4-456b-97ab-a4d3ec28225b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-895xv" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.092053 4810 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c7e5d88-e6b7-416c-abbd-eed95cc772de-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rwvxk\" (UID: \"0c7e5d88-e6b7-416c-abbd-eed95cc772de\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rwvxk" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.092083 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d3068e5d-4ff8-438f-958b-f0f90e773ca1-srv-cert\") pod \"olm-operator-6b444d44fb-mkt5x\" (UID: \"d3068e5d-4ff8-438f-958b-f0f90e773ca1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mkt5x" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.094112 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d3068e5d-4ff8-438f-958b-f0f90e773ca1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mkt5x\" (UID: \"d3068e5d-4ff8-438f-958b-f0f90e773ca1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mkt5x" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.095020 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a6cb5092-5f01-4dd9-a940-804d88907744-registry-tls\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.103463 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b8b1faba-e1b8-436c-aa84-ae4353c5f0a9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-r2tqm\" (UID: \"b8b1faba-e1b8-436c-aa84-ae4353c5f0a9\") " pod="openshift-marketplace/marketplace-operator-79b997595-r2tqm" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.103712 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/74295390-5384-402d-8c5b-dc2559bb6d9c-signing-key\") pod \"service-ca-9c57cc56f-frmnw\" (UID: \"74295390-5384-402d-8c5b-dc2559bb6d9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-frmnw" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.116215 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdm7r\" (UniqueName: \"kubernetes.io/projected/8383a9e3-149b-4512-a9fd-12cd0b65e370-kube-api-access-jdm7r\") pod \"collect-profiles-29525220-wqnml\" (UID: \"8383a9e3-149b-4512-a9fd-12cd0b65e370\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525220-wqnml" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.164403 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9vz7\" (UniqueName: \"kubernetes.io/projected/b8b1faba-e1b8-436c-aa84-ae4353c5f0a9-kube-api-access-q9vz7\") pod \"marketplace-operator-79b997595-r2tqm\" (UID: \"b8b1faba-e1b8-436c-aa84-ae4353c5f0a9\") " pod="openshift-marketplace/marketplace-operator-79b997595-r2tqm" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.180649 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:11:54 crc kubenswrapper[4810]: E0219 15:11:54.180911 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:54.680876427 +0000 UTC m=+144.162906541 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.181078 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/51a1f271-446d-42d2-b946-ad816257e990-socket-dir\") pod \"csi-hostpathplugin-4nds4\" (UID: \"51a1f271-446d-42d2-b946-ad816257e990\") " pod="hostpath-provisioner/csi-hostpathplugin-4nds4" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.181103 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/51a1f271-446d-42d2-b946-ad816257e990-plugins-dir\") pod \"csi-hostpathplugin-4nds4\" (UID: \"51a1f271-446d-42d2-b946-ad816257e990\") " pod="hostpath-provisioner/csi-hostpathplugin-4nds4" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.181123 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/51a1f271-446d-42d2-b946-ad816257e990-registration-dir\") pod \"csi-hostpathplugin-4nds4\" (UID: \"51a1f271-446d-42d2-b946-ad816257e990\") " pod="hostpath-provisioner/csi-hostpathplugin-4nds4" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.181165 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/500c6ae7-fd1f-4af6-ad6c-f1db0c2af222-cert\") pod \"ingress-canary-cdfxh\" (UID: \"500c6ae7-fd1f-4af6-ad6c-f1db0c2af222\") " pod="openshift-ingress-canary/ingress-canary-cdfxh" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.181210 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.181266 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/59eb54a2-ac34-429b-a275-7d365be9ad2f-certs\") pod \"machine-config-server-hlw9s\" (UID: \"59eb54a2-ac34-429b-a275-7d365be9ad2f\") " pod="openshift-machine-config-operator/machine-config-server-hlw9s" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.181644 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/51a1f271-446d-42d2-b946-ad816257e990-csi-data-dir\") pod \"csi-hostpathplugin-4nds4\" (UID: \"51a1f271-446d-42d2-b946-ad816257e990\") " pod="hostpath-provisioner/csi-hostpathplugin-4nds4" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.181719 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/59eb54a2-ac34-429b-a275-7d365be9ad2f-node-bootstrap-token\") pod \"machine-config-server-hlw9s\" (UID: \"59eb54a2-ac34-429b-a275-7d365be9ad2f\") " pod="openshift-machine-config-operator/machine-config-server-hlw9s" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.181768 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cbdn\" (UniqueName: \"kubernetes.io/projected/59eb54a2-ac34-429b-a275-7d365be9ad2f-kube-api-access-2cbdn\") pod \"machine-config-server-hlw9s\" (UID: \"59eb54a2-ac34-429b-a275-7d365be9ad2f\") " pod="openshift-machine-config-operator/machine-config-server-hlw9s" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.181801 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/51a1f271-446d-42d2-b946-ad816257e990-mountpoint-dir\") pod \"csi-hostpathplugin-4nds4\" (UID: \"51a1f271-446d-42d2-b946-ad816257e990\") " pod="hostpath-provisioner/csi-hostpathplugin-4nds4" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.181814 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/51a1f271-446d-42d2-b946-ad816257e990-registration-dir\") pod \"csi-hostpathplugin-4nds4\" (UID: \"51a1f271-446d-42d2-b946-ad816257e990\") " pod="hostpath-provisioner/csi-hostpathplugin-4nds4" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.181899 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/51a1f271-446d-42d2-b946-ad816257e990-mountpoint-dir\") pod \"csi-hostpathplugin-4nds4\" (UID: \"51a1f271-446d-42d2-b946-ad816257e990\") " pod="hostpath-provisioner/csi-hostpathplugin-4nds4" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.181982 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9cg2\" (UniqueName: \"kubernetes.io/projected/500c6ae7-fd1f-4af6-ad6c-f1db0c2af222-kube-api-access-m9cg2\") pod \"ingress-canary-cdfxh\" (UID: \"500c6ae7-fd1f-4af6-ad6c-f1db0c2af222\") " pod="openshift-ingress-canary/ingress-canary-cdfxh" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.182043 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jld4k\" (UniqueName: \"kubernetes.io/projected/51a1f271-446d-42d2-b946-ad816257e990-kube-api-access-jld4k\") pod \"csi-hostpathplugin-4nds4\" (UID: \"51a1f271-446d-42d2-b946-ad816257e990\") " pod="hostpath-provisioner/csi-hostpathplugin-4nds4" Feb 19 15:11:54 crc kubenswrapper[4810]: E0219 15:11:54.182103 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:54.68208335 +0000 UTC m=+144.164113634 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.182708 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/51a1f271-446d-42d2-b946-ad816257e990-socket-dir\") pod \"csi-hostpathplugin-4nds4\" (UID: \"51a1f271-446d-42d2-b946-ad816257e990\") " pod="hostpath-provisioner/csi-hostpathplugin-4nds4" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.182748 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/51a1f271-446d-42d2-b946-ad816257e990-plugins-dir\") pod \"csi-hostpathplugin-4nds4\" (UID: \"51a1f271-446d-42d2-b946-ad816257e990\") " pod="hostpath-provisioner/csi-hostpathplugin-4nds4" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.183503 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/51a1f271-446d-42d2-b946-ad816257e990-csi-data-dir\") pod \"csi-hostpathplugin-4nds4\" (UID: \"51a1f271-446d-42d2-b946-ad816257e990\") " pod="hostpath-provisioner/csi-hostpathplugin-4nds4" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.187631 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g96x2\" (UniqueName: \"kubernetes.io/projected/a6cb5092-5f01-4dd9-a940-804d88907744-kube-api-access-g96x2\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.201954 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/500c6ae7-fd1f-4af6-ad6c-f1db0c2af222-cert\") pod \"ingress-canary-cdfxh\" (UID: \"500c6ae7-fd1f-4af6-ad6c-f1db0c2af222\") " pod="openshift-ingress-canary/ingress-canary-cdfxh" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.211163 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/59eb54a2-ac34-429b-a275-7d365be9ad2f-certs\") pod \"machine-config-server-hlw9s\" (UID: \"59eb54a2-ac34-429b-a275-7d365be9ad2f\") " pod="openshift-machine-config-operator/machine-config-server-hlw9s" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.211565 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/59eb54a2-ac34-429b-a275-7d365be9ad2f-node-bootstrap-token\") pod \"machine-config-server-hlw9s\" (UID: \"59eb54a2-ac34-429b-a275-7d365be9ad2f\") " pod="openshift-machine-config-operator/machine-config-server-hlw9s" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.225040 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hdzd\" (UniqueName: \"kubernetes.io/projected/969b9cb6-f89f-47e9-b8f7-754804a41dea-kube-api-access-4hdzd\") pod \"console-operator-58897d9998-h68pj\" (UID: \"969b9cb6-f89f-47e9-b8f7-754804a41dea\") " 
pod="openshift-console-operator/console-operator-58897d9998-h68pj" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.227183 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rrc6\" (UniqueName: \"kubernetes.io/projected/74295390-5384-402d-8c5b-dc2559bb6d9c-kube-api-access-8rrc6\") pod \"service-ca-9c57cc56f-frmnw\" (UID: \"74295390-5384-402d-8c5b-dc2559bb6d9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-frmnw" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.230124 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8mnc\" (UniqueName: \"kubernetes.io/projected/0c7e5d88-e6b7-416c-abbd-eed95cc772de-kube-api-access-z8mnc\") pod \"kube-storage-version-migrator-operator-b67b599dd-rwvxk\" (UID: \"0c7e5d88-e6b7-416c-abbd-eed95cc772de\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rwvxk" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.254825 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x6lv\" (UniqueName: \"kubernetes.io/projected/64dc0d58-11d4-456b-97ab-a4d3ec28225b-kube-api-access-6x6lv\") pod \"control-plane-machine-set-operator-78cbb6b69f-895xv\" (UID: \"64dc0d58-11d4-456b-97ab-a4d3ec28225b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-895xv" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.284306 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxtvx\" (UniqueName: \"kubernetes.io/projected/d3068e5d-4ff8-438f-958b-f0f90e773ca1-kube-api-access-sxtvx\") pod \"olm-operator-6b444d44fb-mkt5x\" (UID: \"d3068e5d-4ff8-438f-958b-f0f90e773ca1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mkt5x" Feb 19 15:11:54 crc kubenswrapper[4810]: E0219 15:11:54.286600 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:54.78639606 +0000 UTC m=+144.268426184 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.286631 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:11:54 crc kubenswrapper[4810]: E0219 15:11:54.288494 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:54.788483897 +0000 UTC m=+144.270514021 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.287887 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.302237 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncbgj\" (UniqueName: \"kubernetes.io/projected/9b8f83b3-fabb-4404-88ce-e64c4db8a568-kube-api-access-ncbgj\") pod \"service-ca-operator-777779d784-n6fl9\" (UID: \"9b8f83b3-fabb-4404-88ce-e64c4db8a568\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n6fl9" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.318026 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a6cb5092-5f01-4dd9-a940-804d88907744-bound-sa-token\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.330187 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-895xv" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.347619 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rwvxk" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.349513 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mkt5x" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.358802 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525220-wqnml" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.365006 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfppl\" (UniqueName: \"kubernetes.io/projected/9d9ff005-2d60-4eb5-b8fa-59b84661617f-kube-api-access-dfppl\") pod \"dns-default-zgcrd\" (UID: \"9d9ff005-2d60-4eb5-b8fa-59b84661617f\") " pod="openshift-dns/dns-default-zgcrd" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.368716 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8mxdc" event={"ID":"77544608-a940-4c0c-9a1a-a5a98f480134","Type":"ContainerStarted","Data":"007011a340e0d09a98b8c44a23894fcef1b5f303cc95cbdb4af47959fb716490"} Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.368771 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8mxdc" event={"ID":"77544608-a940-4c0c-9a1a-a5a98f480134","Type":"ContainerStarted","Data":"3f00fd9a2efd649d59c4ed484657232da995bc534999cc19374c5d465c923bb7"} Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.369720 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth"] Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.369983 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45klr\" (UniqueName: \"kubernetes.io/projected/e3f898c2-3784-4157-b1c5-fdadcaf69bef-kube-api-access-45klr\") pod \"migrator-59844c95c7-w6k69\" (UID: \"e3f898c2-3784-4157-b1c5-fdadcaf69bef\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-w6k69" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.372711 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dkppn"] Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.374262 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r2tqm" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.376164 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhzlb"] Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.384220 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc" event={"ID":"c696fe96-0485-44d0-b4fb-161503c334e8","Type":"ContainerStarted","Data":"4cfc3c3a4162660b44696564144790aac4145b83976abe219d6df4b4502d2dc6"} Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.385686 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-frmnw" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.390831 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.391800 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n6fl9" Feb 19 15:11:54 crc kubenswrapper[4810]: E0219 15:11:54.392027 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:54.891999515 +0000 UTC m=+144.374029639 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.402369 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-h68pj" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.404872 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zgcrd" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.405736 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sl5p9"] Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.410539 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jld4k\" (UniqueName: \"kubernetes.io/projected/51a1f271-446d-42d2-b946-ad816257e990-kube-api-access-jld4k\") pod \"csi-hostpathplugin-4nds4\" (UID: \"51a1f271-446d-42d2-b946-ad816257e990\") " pod="hostpath-provisioner/csi-hostpathplugin-4nds4" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.412029 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5d4rp" event={"ID":"a4b3e85d-d02e-4e13-8bab-aa86d2629d85","Type":"ContainerStarted","Data":"fb5c58eed043ec9926958facd1fa402602fa16dc839b95897784e8bb09f69736"} Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.424351 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-wpzzq" event={"ID":"e8c9b478-4884-4c32-acf1-5fdec0cfac06","Type":"ContainerStarted","Data":"521ba8d34498bf4aa0dff4e5633490e78b4ea0c74bad270709ecee20deb1c27e"} Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.424416 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-wpzzq" event={"ID":"e8c9b478-4884-4c32-acf1-5fdec0cfac06","Type":"ContainerStarted","Data":"eadc7dbbe4e29ff05dc2f2ea7386fe9ea9b5e63c2fe953be59a0fed2b43d32d8"} Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.426014 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-d8pqg" event={"ID":"fee373cf-50b8-42f4-b30d-4a3d230ca27e","Type":"ContainerStarted","Data":"3989d96419fba09da07c9e99ef43b0eb4df43c01359b3c17be42be6e9e26f30f"} Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.427784 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4hddt" event={"ID":"362cd55c-b576-44bd-843c-078bf26b3b1e","Type":"ContainerStarted","Data":"a1d8a2975e22eb56e23640790355f60287c10a0504259d614d431ce0dc78edbb"} Feb 19 
15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.428836 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4nds4" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.429241 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jm2nk" event={"ID":"239280e2-b335-4f87-89a8-00cb6f8e3c69","Type":"ContainerStarted","Data":"f40aa4e7f50582e05ed4f854cfdd01edd619d078425d9ec703efac5568e62189"} Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.437635 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9cg2\" (UniqueName: \"kubernetes.io/projected/500c6ae7-fd1f-4af6-ad6c-f1db0c2af222-kube-api-access-m9cg2\") pod \"ingress-canary-cdfxh\" (UID: \"500c6ae7-fd1f-4af6-ad6c-f1db0c2af222\") " pod="openshift-ingress-canary/ingress-canary-cdfxh" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.441431 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-25cmt"] Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.442947 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cdfxh" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.477660 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cbdn\" (UniqueName: \"kubernetes.io/projected/59eb54a2-ac34-429b-a275-7d365be9ad2f-kube-api-access-2cbdn\") pod \"machine-config-server-hlw9s\" (UID: \"59eb54a2-ac34-429b-a275-7d365be9ad2f\") " pod="openshift-machine-config-operator/machine-config-server-hlw9s" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.494492 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:54 crc kubenswrapper[4810]: E0219 15:11:54.494909 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:54.994893586 +0000 UTC m=+144.476923710 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.503268 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55x9c" event={"ID":"f7aa90a6-963b-4ff0-b6de-ad46fa896e18","Type":"ContainerStarted","Data":"30f63e40e749a75dfb650e94c528a04f4c79cec4ef168e0a445ddba70fb1b578"} Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.509013 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-hvw7f" event={"ID":"76ffdcba-57d6-4636-8373-f088926a716d","Type":"ContainerStarted","Data":"88d5c2e54e548489dab83140a0f451859ce48da18a62480421da02e8dbaacdb0"} Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.515033 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-l66cb" event={"ID":"9a7776ca-1a56-4eca-9e44-ba1b7b15510f","Type":"ContainerStarted","Data":"257557aef8e6cd3ebd3462b014b8d22da58a9b420e2ca31f7c43087d3fa398e2"} Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.515096 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-l66cb" event={"ID":"9a7776ca-1a56-4eca-9e44-ba1b7b15510f","Type":"ContainerStarted","Data":"b87f6328992f13d54a1ed3e925e1d3835339d68ebe81ed0173553e621b7c78da"} Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.557700 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f4qrf" event={"ID":"0f46b00f-f770-4539-92f9-60e1146308ab","Type":"ContainerStarted","Data":"9d1149ddc52048dfa1aa76cbd88f80346d5338d7f3d8064a038a742f836264e5"} Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.595253 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:11:54 crc kubenswrapper[4810]: E0219 15:11:54.598867 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:55.098835535 +0000 UTC m=+144.580865659 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.667955 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-w6k69" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.671046 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kclsk" event={"ID":"8f4b9328-0efe-42a8-9a73-a80eb6a26151","Type":"ContainerStarted","Data":"cdd204fb4f3750e895bfe6ec3412aecfe2ba2f2ffb7194f6c70cdb6a4ef6b993"} Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.671106 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kclsk" event={"ID":"8f4b9328-0efe-42a8-9a73-a80eb6a26151","Type":"ContainerStarted","Data":"724adfe31aa7fad7a5da317cec629a9441c15ea118c52e1a602faab0955fb919"} Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.678341 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l924n"] Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.682977 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8j2gh"] Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.700651 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:54 crc kubenswrapper[4810]: E0219 15:11:54.701073 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:55.201056617 +0000 UTC m=+144.683086741 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:54 crc kubenswrapper[4810]: W0219 15:11:54.728577 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd317655_38ea_4fdb_95d0_82adc08456a8.slice/crio-6d4f444c4ef909a4018798854927d2c08e92bcbe7bcb6657cd36b18b543b34ef WatchSource:0}: Error finding container 6d4f444c4ef909a4018798854927d2c08e92bcbe7bcb6657cd36b18b543b34ef: Status 404 returned error can't find the container with id 6d4f444c4ef909a4018798854927d2c08e92bcbe7bcb6657cd36b18b543b34ef Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.730700 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xhg7l" event={"ID":"0e4ea5aa-2074-4100-a916-6bdfb3331d43","Type":"ContainerStarted","Data":"56f2f0ec8373e4a7100fbe6a0f0a7408ac55c7274506628fbc45edb2a88d3e00"} Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.731017 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xhg7l" event={"ID":"0e4ea5aa-2074-4100-a916-6bdfb3331d43","Type":"ContainerStarted","Data":"54ad6164609fa4bf8b377833d348d622ed2bede550b38e85012c018a8367e0d4"} Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.734020 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hlw9s" Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.768012 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2nn8b"] Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.807232 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.818118 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-zrsn2"] Feb 19 15:11:54 crc kubenswrapper[4810]: E0219 15:11:54.821857 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:55.321833379 +0000 UTC m=+144.803863503 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.857356 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lqqd"] Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.872805 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r74mv"] Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.918357 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kmhmh"] Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.918519 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:54 crc kubenswrapper[4810]: E0219 15:11:54.919003 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:55.418985182 +0000 UTC m=+144.901015306 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.934983 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2phqk"] Feb 19 15:11:54 crc kubenswrapper[4810]: I0219 15:11:54.966029 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hmxtq"] Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.029023 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:11:55 crc kubenswrapper[4810]: E0219 15:11:55.029524 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:55.529212704 +0000 UTC m=+145.011242828 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.029680 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:55 crc kubenswrapper[4810]: E0219 15:11:55.030137 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:55.530119399 +0000 UTC m=+145.012149513 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.122262 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-65ltw"] Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.131236 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:11:55 crc kubenswrapper[4810]: E0219 15:11:55.131833 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:55.631812976 +0000 UTC m=+145.113843100 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.233436 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:55 crc kubenswrapper[4810]: E0219 15:11:55.235182 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:55.734934293 +0000 UTC m=+145.216964417 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:55 crc kubenswrapper[4810]: W0219 15:11:55.265972 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59eb54a2_ac34_429b_a275_7d365be9ad2f.slice/crio-57a99ec104b993daec9138ab90fe7aa1004eeeb9fb7b47a9a690d2f75cb7076a WatchSource:0}: Error finding container 57a99ec104b993daec9138ab90fe7aa1004eeeb9fb7b47a9a690d2f75cb7076a: Status 404 returned error can't find the container with id 57a99ec104b993daec9138ab90fe7aa1004eeeb9fb7b47a9a690d2f75cb7076a Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.335970 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:11:55 crc kubenswrapper[4810]: E0219 15:11:55.339070 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:55.839046417 +0000 UTC m=+145.321076541 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.348636 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525220-wqnml"] Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.452721 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:55 crc kubenswrapper[4810]: E0219 15:11:55.453720 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:55.953703322 +0000 UTC m=+145.435733446 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.528307 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mkt5x"] Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.554443 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:11:55 crc kubenswrapper[4810]: E0219 15:11:55.554733 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:56.0546841 +0000 UTC m=+145.536714224 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.555239 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:55 crc kubenswrapper[4810]: E0219 15:11:55.555751 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:56.055743549 +0000 UTC m=+145.537773673 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.656648 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rwvxk"] Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.669588 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:11:55 crc kubenswrapper[4810]: E0219 15:11:55.670073 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:56.170053233 +0000 UTC m=+145.652083357 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.702533 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-jm2nk" podStartSLOduration=123.702507376 podStartE2EDuration="2m3.702507376s" podCreationTimestamp="2026-02-19 15:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:55.670530987 +0000 UTC m=+145.152561111" watchObservedRunningTime="2026-02-19 15:11:55.702507376 +0000 UTC m=+145.184537500" Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.773840 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:55 crc kubenswrapper[4810]: E0219 15:11:55.774546 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:56.274531728 +0000 UTC m=+145.756561852 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.781187 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-65ltw" event={"ID":"5cc06c55-7085-4cc0-8399-833b4243b51e","Type":"ContainerStarted","Data":"c8b2a88dff64984dccbf60956cd019d8c8c69f8324695b17d3af2bbfc0d23a72"} Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.790862 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-d8pqg" event={"ID":"fee373cf-50b8-42f4-b30d-4a3d230ca27e","Type":"ContainerStarted","Data":"5b07111955df88d3b4e36c415c8d6b917021b9b7758c003f572c5b199fd318d9"} Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.808975 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth" event={"ID":"ad43df2c-4944-45e2-919f-0c297f4092d4","Type":"ContainerStarted","Data":"b43634a16e9cdfb66105b0eff470452a3406e103e0c90e02477b5f9de0072e03"} Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.809043 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth" event={"ID":"ad43df2c-4944-45e2-919f-0c297f4092d4","Type":"ContainerStarted","Data":"8973ecaad02b7aeec4f31a47d961686dc669236fd8c026776f4e494af608cf1b"} Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.810482 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth" Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.820180 4810 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-n8zth container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.820229 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth" podUID="ad43df2c-4944-45e2-919f-0c297f4092d4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.853172 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-895xv"] Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.876875 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:11:55 crc kubenswrapper[4810]: E0219 15:11:55.894903 4810 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:56.394853068 +0000 UTC m=+145.876883412 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.899444 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8mxdc" event={"ID":"77544608-a940-4c0c-9a1a-a5a98f480134","Type":"ContainerStarted","Data":"93dcb98ad4742d66dab7aacdcb5a3a8c191921c6d151afcc27724a90e94526cf"} Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.910411 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lqqd" event={"ID":"026a2812-dc11-4b60-b911-bb41a0d39d7d","Type":"ContainerStarted","Data":"1ff7e02020c5142042143d63aae5530e3213e86b9c32b9d0bf9607e45acf8227"} Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.914258 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r2tqm"] Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.917103 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9" event={"ID":"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6","Type":"ContainerStarted","Data":"918504ecdc8f526f00360e8dd63cb3df985da70843c55db6ce3d06ae23d89251"} Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.928048 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55x9c" podStartSLOduration=123.927995539 podStartE2EDuration="2m3.927995539s" podCreationTimestamp="2026-02-19 15:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:55.894307953 +0000 UTC m=+145.376338077" watchObservedRunningTime="2026-02-19 15:11:55.927995539 +0000 UTC m=+145.410025653" Feb 19 15:11:55 crc kubenswrapper[4810]: W0219 15:11:55.949582 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c7e5d88_e6b7_416c_abbd_eed95cc772de.slice/crio-ad1fce03b3ad1750486622c5e19772212fbe373da005f0287038be329e0b340f WatchSource:0}: Error finding container ad1fce03b3ad1750486622c5e19772212fbe373da005f0287038be329e0b340f: Status 404 returned error can't find the container with id ad1fce03b3ad1750486622c5e19772212fbe373da005f0287038be329e0b340f Feb 19 15:11:55 crc kubenswrapper[4810]: I0219 15:11:55.981367 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:55 crc kubenswrapper[4810]: E0219 15:11:55.984698 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:56.484675119 +0000 UTC m=+145.966705243 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.021037 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5d4rp" event={"ID":"a4b3e85d-d02e-4e13-8bab-aa86d2629d85","Type":"ContainerStarted","Data":"1ee9376227c338a67f41bb1e8709c6a10ad973763bdae3460d5f5bfe180e269d"} Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.031035 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kmhmh" event={"ID":"63674760-f499-49b8-a575-a8ae954eada4","Type":"ContainerStarted","Data":"0bc300969ec8d531d557a6a53297ce22a290ca361d42de96194eab1ad4a502b6"} Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.052118 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8mxdc" podStartSLOduration=124.052092093 podStartE2EDuration="2m4.052092093s" podCreationTimestamp="2026-02-19 15:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:56.050110129 +0000 UTC m=+145.532140253" watchObservedRunningTime="2026-02-19 15:11:56.052092093 +0000 UTC m=+145.534122217" Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.061285 4810 generic.go:334] "Generic (PLEG): container finished" podID="c696fe96-0485-44d0-b4fb-161503c334e8" containerID="19f280cf9c36b378a47999cd163230981c655a13662892691e3cba02c8a1e6ca" exitCode=0 Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.062189 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc" event={"ID":"c696fe96-0485-44d0-b4fb-161503c334e8","Type":"ContainerDied","Data":"19f280cf9c36b378a47999cd163230981c655a13662892691e3cba02c8a1e6ca"} Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.088350 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-h68pj"] Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.089714 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:11:56 crc kubenswrapper[4810]: E0219 15:11:56.090467 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:56.590446398 +0000 UTC m=+146.072476512 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.093604 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zgcrd"] Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.106387 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-l66cb" event={"ID":"9a7776ca-1a56-4eca-9e44-ba1b7b15510f","Type":"ContainerStarted","Data":"caa24f0209a0103d15ad0edc37f840b56250f9d3ee7fee9166d950d7a854c17a"} Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.111931 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jm2nk" event={"ID":"239280e2-b335-4f87-89a8-00cb6f8e3c69","Type":"ContainerStarted","Data":"e22c0b66bd2edf2cc9e2215240a0a048e856f1ac853f7478d160bbf2ed8bfe87"} Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.116199 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-d8pqg" podStartSLOduration=123.116172566 podStartE2EDuration="2m3.116172566s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:56.115793176 +0000 UTC m=+145.597823290" watchObservedRunningTime="2026-02-19 15:11:56.116172566 +0000 UTC m=+145.598202690" Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.120284 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8j2gh" event={"ID":"b1e6fd37-d069-441e-9c0f-caf60d8f8d4f","Type":"ContainerStarted","Data":"6e3e16aaad4c49b043c295a1cde93cc27f9c6489b08a009bea8ceb47ace5b8a7"} Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.145253 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dkppn" event={"ID":"7a29951a-027e-49b4-a7ea-a8e363942414","Type":"ContainerStarted","Data":"def13ac9f3c80f2633deeaca2f3aa28e400ac5dc93d6737800aaec0153001453"} Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.147200 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-dkppn" Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.160009 4810 patch_prober.go:28] interesting pod/downloads-7954f5f757-dkppn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.160469 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dkppn" podUID="7a29951a-027e-49b4-a7ea-a8e363942414" containerName="download-server" probeResult="failure" output="Get 
\"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.160337 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2phqk" event={"ID":"ef2828b3-f501-4105-abc8-6b1ce9658301","Type":"ContainerStarted","Data":"8ce155f4159ba341fb65543f9f8c0c293a0fca56dd0fb197345bf6368f2d8430"} Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.167026 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hlw9s" event={"ID":"59eb54a2-ac34-429b-a275-7d365be9ad2f","Type":"ContainerStarted","Data":"57a99ec104b993daec9138ab90fe7aa1004eeeb9fb7b47a9a690d2f75cb7076a"} Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.192021 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:56 crc kubenswrapper[4810]: E0219 15:11:56.194651 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:56.694629364 +0000 UTC m=+146.176659488 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.200130 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-hvw7f" event={"ID":"76ffdcba-57d6-4636-8373-f088926a716d","Type":"ContainerStarted","Data":"079683c2352488e17623f733b719501b4fd3f0ddbd65095b33566865f235c2ff"} Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.212864 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth" podStartSLOduration=123.212833245 podStartE2EDuration="2m3.212833245s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:56.146294435 +0000 UTC m=+145.628324559" watchObservedRunningTime="2026-02-19 15:11:56.212833245 +0000 UTC m=+145.694863369" Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.216161 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kclsk" podStartSLOduration=123.216142996 podStartE2EDuration="2m3.216142996s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:56.202536742 +0000 UTC m=+145.684566866" watchObservedRunningTime="2026-02-19 
15:11:56.216142996 +0000 UTC m=+145.698173110" Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.224873 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-25cmt" event={"ID":"bd317655-38ea-4fdb-95d0-82adc08456a8","Type":"ContainerStarted","Data":"6d4f444c4ef909a4018798854927d2c08e92bcbe7bcb6657cd36b18b543b34ef"} Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.245441 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xhg7l" podStartSLOduration=123.245412011 podStartE2EDuration="2m3.245412011s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:56.244244029 +0000 UTC m=+145.726274153" watchObservedRunningTime="2026-02-19 15:11:56.245412011 +0000 UTC m=+145.727442135" Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.255489 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" event={"ID":"c18fb461-ce5b-43ad-85ca-305c3f8a7d46","Type":"ContainerStarted","Data":"a5b8d6a2012cb01f6278524103e645fa596a71b16bda554c88859e183269d288"} Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.270529 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-zrsn2" event={"ID":"13f4a10d-50aa-41ec-9931-cb835ba1f54c","Type":"ContainerStarted","Data":"697a9c21c216e48b42269ff99e0a27246ada05bb0cfec928ad6adc7a3b7dfc0e"} Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.271052 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-hlw9s" podStartSLOduration=5.271030506 podStartE2EDuration="5.271030506s" podCreationTimestamp="2026-02-19 15:11:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:56.270637885 +0000 UTC m=+145.752668009" watchObservedRunningTime="2026-02-19 15:11:56.271030506 +0000 UTC m=+145.753060620" Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.293812 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:11:56 crc kubenswrapper[4810]: E0219 15:11:56.295914 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:56.79588717 +0000 UTC m=+146.277917294 (durationBeforeRetry 500ms). 
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.296859 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525220-wqnml" event={"ID":"8383a9e3-149b-4512-a9fd-12cd0b65e370","Type":"ContainerStarted","Data":"28491df923b9f9d021efc8a5dd0472dd72e0e7e179d81a8ba24dae22fa983519"}
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.331206 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4hddt" event={"ID":"362cd55c-b576-44bd-843c-078bf26b3b1e","Type":"ContainerStarted","Data":"f08283b34555217bd3055e9709afdf48fe7bc7bac3a42fcf9d68868beda18106"}
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.333604 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-l66cb" podStartSLOduration=123.333592337 podStartE2EDuration="2m3.333592337s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:56.329635098 +0000 UTC m=+145.811665222" watchObservedRunningTime="2026-02-19 15:11:56.333592337 +0000 UTC m=+145.815622451"
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.360741 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55x9c" event={"ID":"f7aa90a6-963b-4ff0-b6de-ad46fa896e18","Type":"ContainerStarted","Data":"66a4dd2101aef46f877ee54c0bc2cd6bef63a169b280c404cfe51166671c33db"}
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.405316 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2nn8b" event={"ID":"b1fae478-5e9a-4da1-b2e3-35c1ee7f8fa3","Type":"ContainerStarted","Data":"c70d6dcda85722487ba517667cc3541d1e921cb75ec6411edace40c22883a80c"}
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.406972 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2"
Feb 19 15:11:56 crc kubenswrapper[4810]: E0219 15:11:56.407585 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:56.907552092 +0000 UTC m=+146.389582216 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.428881 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-n6fl9"]
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.443589 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhzlb" event={"ID":"25e05c8a-335b-405c-9033-f689c21c5ecc","Type":"ContainerStarted","Data":"6c1f93bc451f045e7ecf4f2a061a155dfe1a74fde876c581e5b01b87f5b9cbe9"}
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.456778 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-wpzzq" event={"ID":"e8c9b478-4884-4c32-acf1-5fdec0cfac06","Type":"ContainerStarted","Data":"daa16aa602b5e2229fd5e812438b46bfa9aeab028776fae426ca3979d07a8b12"}
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.464430 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cdfxh"]
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.466294 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-w6k69"]
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.472231 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f4qrf" event={"ID":"0f46b00f-f770-4539-92f9-60e1146308ab","Type":"ContainerStarted","Data":"c95535592b477e29bf45cc0f9dac15fc777217f356e754d6cd687ad25e827210"}
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.472277 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f4qrf" event={"ID":"0f46b00f-f770-4539-92f9-60e1146308ab","Type":"ContainerStarted","Data":"3003c524a1a06f61e29fbc5a85cb4540506b1c36ddf20b35eb80aa3726f139b5"}
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.475839 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hmxtq" event={"ID":"6a43ce91-6673-4641-a2d6-551afe72688d","Type":"ContainerStarted","Data":"418927cc81e88b2f74b96db863cf7acce36f70d0978d3506c79ab078ab3e5026"}
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.504757 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4nds4"]
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.508942 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 15:11:56 crc kubenswrapper[4810]: E0219 15:11:56.510099 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:57.010079532 +0000 UTC m=+146.492109646 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.547015 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l924n" event={"ID":"b7a0b611-cb8e-431b-b527-b6164471c85f","Type":"ContainerStarted","Data":"5a285a1bca8274979baec136d5aba84737e9df208c7943b87864477fe5900c2e"}
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.548083 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-5d4rp" podStartSLOduration=124.548061097 podStartE2EDuration="2m4.548061097s" podCreationTimestamp="2026-02-19 15:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:56.546055262 +0000 UTC m=+146.028085406" watchObservedRunningTime="2026-02-19 15:11:56.548061097 +0000 UTC m=+146.030091221"
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.610368 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2"
Feb 19 15:11:56 crc kubenswrapper[4810]: E0219 15:11:56.611070 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:57.1110519 +0000 UTC m=+146.593082024 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 15:11:56 crc kubenswrapper[4810]: W0219 15:11:56.623669 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod500c6ae7_fd1f_4af6_ad6c_f1db0c2af222.slice/crio-4b2c43301c34bb1c2be42ee14c12d2749dce24b2f61ffd4dcf0b7cbdb2ea4ec7 WatchSource:0}: Error finding container 4b2c43301c34bb1c2be42ee14c12d2749dce24b2f61ffd4dcf0b7cbdb2ea4ec7: Status 404 returned error can't find the container with id 4b2c43301c34bb1c2be42ee14c12d2749dce24b2f61ffd4dcf0b7cbdb2ea4ec7
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.626732 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-frmnw"]
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.641202 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-dkppn" podStartSLOduration=124.641172219 podStartE2EDuration="2m4.641172219s" podCreationTimestamp="2026-02-19 15:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:56.638646889 +0000 UTC m=+146.120677013" watchObservedRunningTime="2026-02-19 15:11:56.641172219 +0000 UTC m=+146.123202343"
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.694722 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-hvw7f" podStartSLOduration=123.694698141 podStartE2EDuration="2m3.694698141s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:56.678213528 +0000 UTC m=+146.160243652" watchObservedRunningTime="2026-02-19 15:11:56.694698141 +0000 UTC m=+146.176728265"
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.711851 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 15:11:56 crc kubenswrapper[4810]: E0219 15:11:56.713298 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:57.213265732 +0000 UTC m=+146.695296016 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.715255 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f4qrf" podStartSLOduration=123.715194005 podStartE2EDuration="2m3.715194005s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:56.713457707 +0000 UTC m=+146.195487831" watchObservedRunningTime="2026-02-19 15:11:56.715194005 +0000 UTC m=+146.197224129"
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.751874 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-4hddt" podStartSLOduration=124.751849833 podStartE2EDuration="2m4.751849833s" podCreationTimestamp="2026-02-19 15:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:56.74918255 +0000 UTC m=+146.231212674" watchObservedRunningTime="2026-02-19 15:11:56.751849833 +0000 UTC m=+146.233879957"
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.813452 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l924n" podStartSLOduration=123.813412777 podStartE2EDuration="2m3.813412777s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:56.803613328 +0000 UTC m=+146.285643452" watchObservedRunningTime="2026-02-19 15:11:56.813412777 +0000 UTC m=+146.295442901"
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.820207 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2"
Feb 19 15:11:56 crc kubenswrapper[4810]: E0219 15:11:56.820728 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:57.320711288 +0000 UTC m=+146.802741402 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.923852 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 15:11:56 crc kubenswrapper[4810]: E0219 15:11:56.924577 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:57.424551914 +0000 UTC m=+146.906582038 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.924614 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-hvw7f"
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.944066 4810 patch_prober.go:28] interesting pod/router-default-5444994796-hvw7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 19 15:11:56 crc kubenswrapper[4810]: [-]has-synced failed: reason withheld
Feb 19 15:11:56 crc kubenswrapper[4810]: [+]process-running ok
Feb 19 15:11:56 crc kubenswrapper[4810]: healthz check failed
Feb 19 15:11:56 crc kubenswrapper[4810]: I0219 15:11:56.944543 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hvw7f" podUID="76ffdcba-57d6-4636-8373-f088926a716d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.026014 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2"
Feb 19 15:11:57 crc kubenswrapper[4810]: E0219 15:11:57.026445 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:57.526431567 +0000 UTC m=+147.008461681 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.132141 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 15:11:57 crc kubenswrapper[4810]: E0219 15:11:57.132638 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:57.632616488 +0000 UTC m=+147.114646612 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.215370 4810 csr.go:261] certificate signing request csr-kl2l4 is approved, waiting to be issued
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.234810 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2"
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.234977 4810 csr.go:257] certificate signing request csr-kl2l4 is issued
Feb 19 15:11:57 crc kubenswrapper[4810]: E0219 15:11:57.236089 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:57.735893259 +0000 UTC m=+147.217923383 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.338533 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 15:11:57 crc kubenswrapper[4810]: E0219 15:11:57.339347 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:57.839308784 +0000 UTC m=+147.321338908 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.447566 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2"
Feb 19 15:11:57 crc kubenswrapper[4810]: E0219 15:11:57.447968 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:57.947947853 +0000 UTC m=+147.429977977 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.549424 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 15:11:57 crc kubenswrapper[4810]: E0219 15:11:57.549650 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:58.04960117 +0000 UTC m=+147.531631294 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.549726 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2"
Feb 19 15:11:57 crc kubenswrapper[4810]: E0219 15:11:57.550112 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:58.050104183 +0000 UTC m=+147.532134297 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.594625 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rwvxk" event={"ID":"0c7e5d88-e6b7-416c-abbd-eed95cc772de","Type":"ContainerStarted","Data":"03d1da5fa69b0120b4edb3d8d72d814e92881d0def41990e885bbb57dcbed6f1"}
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.594676 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rwvxk" event={"ID":"0c7e5d88-e6b7-416c-abbd-eed95cc772de","Type":"ContainerStarted","Data":"ad1fce03b3ad1750486622c5e19772212fbe373da005f0287038be329e0b340f"}
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.598859 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-w6k69" event={"ID":"e3f898c2-3784-4157-b1c5-fdadcaf69bef","Type":"ContainerStarted","Data":"478a0a3f318d9b05dedcd04ea22ce7fdc59f6a62f343db5dab5c45f7df34f58b"}
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.605523 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5"
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.615726 4810 generic.go:334] "Generic (PLEG): container finished" podID="bd317655-38ea-4fdb-95d0-82adc08456a8" containerID="28e82457805a6bf079bf2dad976697aad3c472aa72e00d44a0a62bb4d125ff0a" exitCode=0
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.615813 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-25cmt" event={"ID":"bd317655-38ea-4fdb-95d0-82adc08456a8","Type":"ContainerDied","Data":"28e82457805a6bf079bf2dad976697aad3c472aa72e00d44a0a62bb4d125ff0a"}
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.616186 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rwvxk" podStartSLOduration=124.616161401 podStartE2EDuration="2m4.616161401s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:57.614846034 +0000 UTC m=+147.096876158" watchObservedRunningTime="2026-02-19 15:11:57.616161401 +0000 UTC m=+147.098191525"
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.617793 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-wpzzq" podStartSLOduration=124.617786165 podStartE2EDuration="2m4.617786165s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:56.857000606 +0000 UTC m=+146.339030730" watchObservedRunningTime="2026-02-19 15:11:57.617786165 +0000 UTC m=+147.099816289"
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.619173 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" event={"ID":"c18fb461-ce5b-43ad-85ca-305c3f8a7d46","Type":"ContainerStarted","Data":"1540e32985c892982c268c96e96e5e47cb3be178ce36f945dc12ccf2635f82d2"}
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.622451 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-r74mv"
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.625253 4810 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-r74mv container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.33:6443/healthz\": dial tcp 10.217.0.33:6443: connect: connection refused" start-of-body=
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.625316 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" podUID="c18fb461-ce5b-43ad-85ca-305c3f8a7d46" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.33:6443/healthz\": dial tcp 10.217.0.33:6443: connect: connection refused"
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.635348 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l924n" event={"ID":"b7a0b611-cb8e-431b-b527-b6164471c85f","Type":"ContainerStarted","Data":"417b1c9ef68274eb78636581f01b1e27c85f526832c5afc8942d6f5d65fbce96"}
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.643064 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r2tqm" event={"ID":"b8b1faba-e1b8-436c-aa84-ae4353c5f0a9","Type":"ContainerStarted","Data":"20e10c5e198ddee65262d0528102f59529d30ad7bead31c5ab5c764fe94b9de0"}
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.643128 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r2tqm" event={"ID":"b8b1faba-e1b8-436c-aa84-ae4353c5f0a9","Type":"ContainerStarted","Data":"23f5d4bb5ee04c131b85331f5b6ae4b924ed6b6ac6634c148de3748ba4fdc4ad"}
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.643437 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-r2tqm"
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.645214 4810 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-r2tqm container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body=
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.645357 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-r2tqm" podUID="b8b1faba-e1b8-436c-aa84-ae4353c5f0a9" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused"
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.645774 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-65ltw" event={"ID":"5cc06c55-7085-4cc0-8399-833b4243b51e","Type":"ContainerStarted","Data":"3d9d1cfaea4bc4e51b6abb70330420f848060901beca560335cc25805281c669"}
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.647559 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4nds4" event={"ID":"51a1f271-446d-42d2-b946-ad816257e990","Type":"ContainerStarted","Data":"d3b0a9e492aad83b4cf9a1e4c4559e72e6ceb284e2bdb13c44e59af397812c51"}
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.649357 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zgcrd" event={"ID":"9d9ff005-2d60-4eb5-b8fa-59b84661617f","Type":"ContainerStarted","Data":"e50bf417d204b2f09e8c39f3284669676184c4b43195ceec33490c41a512a600"}
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.650363 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 15:11:57 crc kubenswrapper[4810]: E0219 15:11:57.652676 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:58.152644914 +0000 UTC m=+147.634675038 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.654035 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8j2gh" event={"ID":"b1e6fd37-d069-441e-9c0f-caf60d8f8d4f","Type":"ContainerStarted","Data":"b72ae89c857a22287ea3377fd79343c983ace6b61cae026d7092f60f166bba01"}
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.659986 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dkppn" event={"ID":"7a29951a-027e-49b4-a7ea-a8e363942414","Type":"ContainerStarted","Data":"248f3396a274a7c02cca57cafc68effaca4124e54f5008022c7525d6c545f088"}
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.661129 4810 patch_prober.go:28] interesting pod/downloads-7954f5f757-dkppn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body=
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.661186 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dkppn" podUID="7a29951a-027e-49b4-a7ea-a8e363942414" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused"
Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.662762 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mkt5x" event={"ID":"d3068e5d-4ff8-438f-958b-f0f90e773ca1","Type":"ContainerStarted","Data":"0da1adf715ffdda188ae81236960cc45f0f62f1e9b196e4f1787585c4749917b"}
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mkt5x" event={"ID":"d3068e5d-4ff8-438f-958b-f0f90e773ca1","Type":"ContainerStarted","Data":"0da1adf715ffdda188ae81236960cc45f0f62f1e9b196e4f1787585c4749917b"} Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.662862 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mkt5x" event={"ID":"d3068e5d-4ff8-438f-958b-f0f90e773ca1","Type":"ContainerStarted","Data":"6f2e9520232ccf40a4afb1888425bf5a4d15f80f23bf1c3779655c495f6cbea9"} Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.663373 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mkt5x" Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.667698 4810 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-mkt5x container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.667808 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mkt5x" podUID="d3068e5d-4ff8-438f-958b-f0f90e773ca1" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.688733 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2nn8b" event={"ID":"b1fae478-5e9a-4da1-b2e3-35c1ee7f8fa3","Type":"ContainerStarted","Data":"17c9e8b8353f936cab2574451ae92e4f6600cade690dc61485fe9ca9793dedec"} Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.699538 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-5d4rp" Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.700125 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-5d4rp" Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.702400 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" podStartSLOduration=125.702380012 podStartE2EDuration="2m5.702380012s" podCreationTimestamp="2026-02-19 15:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:57.699726459 +0000 UTC m=+147.181756573" watchObservedRunningTime="2026-02-19 15:11:57.702380012 +0000 UTC m=+147.184410136" Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.722633 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hmxtq" event={"ID":"6a43ce91-6673-4641-a2d6-551afe72688d","Type":"ContainerStarted","Data":"725a8cab464d281a961e6feaf009a3b4a1df8ec4670406a42d12eba66bc52eef"} Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.724607 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hmxtq" Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.739390 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8j2gh" podStartSLOduration=124.73936417 podStartE2EDuration="2m4.73936417s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:57.737684674 +0000 UTC m=+147.219714798" watchObservedRunningTime="2026-02-19 15:11:57.73936417 +0000 UTC m=+147.221394294" Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.739946 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525220-wqnml" event={"ID":"8383a9e3-149b-4512-a9fd-12cd0b65e370","Type":"ContainerStarted","Data":"c19d21352b758655d40944eafe4d1d6cfee80125c13e6f74424f93ccd9aee7cf"} Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.763969 4810 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-hmxtq container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused" start-of-body= Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.764028 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hmxtq" podUID="6a43ce91-6673-4641-a2d6-551afe72688d" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused" Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.765097 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:57 crc kubenswrapper[4810]: E0219 15:11:57.766720 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:58.266699452 +0000 UTC m=+147.748729576 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.782665 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n6fl9" event={"ID":"9b8f83b3-fabb-4404-88ce-e64c4db8a568","Type":"ContainerStarted","Data":"b67a2ca301f32b5bba0c87aa2c08dea248aa7238bcaa9a1aab349c455f01385f"} Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.792715 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-r2tqm" podStartSLOduration=124.792691907 podStartE2EDuration="2m4.792691907s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:57.790571649 +0000 UTC m=+147.272601773" watchObservedRunningTime="2026-02-19 15:11:57.792691907 +0000 UTC m=+147.274722031" Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.812996 4810 patch_prober.go:28] interesting pod/apiserver-76f77b778f-5d4rp container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 19 15:11:57 crc kubenswrapper[4810]: [+]log ok Feb 19 15:11:57 crc kubenswrapper[4810]: [+]etcd ok Feb 19 15:11:57 crc kubenswrapper[4810]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 19 15:11:57 crc kubenswrapper[4810]: [+]poststarthook/generic-apiserver-start-informers ok Feb 19 15:11:57 crc kubenswrapper[4810]: [+]poststarthook/max-in-flight-filter ok Feb 19 15:11:57 crc kubenswrapper[4810]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 19 15:11:57 crc kubenswrapper[4810]: [+]poststarthook/image.openshift.io-apiserver-caches ok Feb 19 15:11:57 crc kubenswrapper[4810]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Feb 19 15:11:57 crc kubenswrapper[4810]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Feb 19 15:11:57 crc kubenswrapper[4810]: [+]poststarthook/project.openshift.io-projectcache ok Feb 19 15:11:57 crc kubenswrapper[4810]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Feb 19 15:11:57 crc kubenswrapper[4810]: [+]poststarthook/openshift.io-startinformers ok Feb 19 15:11:57 crc kubenswrapper[4810]: [+]poststarthook/openshift.io-restmapperupdater ok Feb 19 15:11:57 crc kubenswrapper[4810]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 19 15:11:57 crc kubenswrapper[4810]: livez check failed Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.813083 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-5d4rp" podUID="a4b3e85d-d02e-4e13-8bab-aa86d2629d85" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.817382 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhzlb" event={"ID":"25e05c8a-335b-405c-9033-f689c21c5ecc","Type":"ContainerStarted","Data":"202cd18f5d7c80a1e69ade4303444999603aab1e57b5b36c43cfeab608e76af1"} Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.843683 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2phqk" event={"ID":"ef2828b3-f501-4105-abc8-6b1ce9658301","Type":"ContainerStarted","Data":"56611b5e8046125630e65a48127c89c1579026b4e92c66468343073cf75da26c"} Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.844857 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2phqk" Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.851505 4810 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-2phqk container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.851567 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2phqk" podUID="ef2828b3-f501-4105-abc8-6b1ce9658301" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.868505 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:11:57 crc kubenswrapper[4810]: E0219 15:11:57.869805 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:58.369785578 +0000 UTC m=+147.851815702 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.874055 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9" event={"ID":"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6","Type":"ContainerStarted","Data":"dc986c76de8632d646d80420e0f140b9a738f5647ff326454ca42eca27431ca5"} Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.875069 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9" Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.881429 4810 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-sl5p9 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.881471 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9" podUID="6f0a78b8-77b3-47dd-9dbe-e578b0374cf6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.885721 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-frmnw" event={"ID":"74295390-5384-402d-8c5b-dc2559bb6d9c","Type":"ContainerStarted","Data":"c1c70d9edb3fc636663264f1069b22c28c56ea4c83e4d1e90ce90161570a061c"} Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.890441 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-895xv" event={"ID":"64dc0d58-11d4-456b-97ab-a4d3ec28225b","Type":"ContainerStarted","Data":"edb528b7805eeb9b438b67a66e2248b6ee9c5aa489bd899fce4c6b547b3d343b"} Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.890470 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-895xv" event={"ID":"64dc0d58-11d4-456b-97ab-a4d3ec28225b","Type":"ContainerStarted","Data":"81fea48dc549d3dae06b3f0a36281a04deb8df53595d251f4e43f11ceb3755c6"} Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.894458 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mkt5x" podStartSLOduration=124.894440836 podStartE2EDuration="2m4.894440836s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:57.887882436 +0000 UTC m=+147.369912560" watchObservedRunningTime="2026-02-19 15:11:57.894440836 +0000 UTC m=+147.376470960" Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.912722 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhzlb" podStartSLOduration=125.912701998 podStartE2EDuration="2m5.912701998s" podCreationTimestamp="2026-02-19 15:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:57.909697706 +0000 UTC m=+147.391727820" watchObservedRunningTime="2026-02-19 15:11:57.912701998 +0000 UTC m=+147.394732122" Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.914902 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-zrsn2" event={"ID":"13f4a10d-50aa-41ec-9931-cb835ba1f54c","Type":"ContainerStarted","Data":"2a6589f5ecdc63853851db8ee9c7337f46484398d1ef968d041031fbf184b894"} Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.935610 4810 patch_prober.go:28] interesting pod/router-default-5444994796-hvw7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 15:11:57 crc kubenswrapper[4810]: [-]has-synced failed: reason withheld Feb 19 15:11:57 crc kubenswrapper[4810]: [+]process-running ok Feb 19 15:11:57 crc kubenswrapper[4810]: healthz check failed Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.935702 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hvw7f" podUID="76ffdcba-57d6-4636-8373-f088926a716d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.962001 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-h68pj" event={"ID":"969b9cb6-f89f-47e9-b8f7-754804a41dea","Type":"ContainerStarted","Data":"26f3a7f9b20109269e9a2e47b0214f0759582a1f5ef12e8bd49e8ec1dd79876c"} Feb 19 15:11:57 crc kubenswrapper[4810]: I0219 15:11:57.963340 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-h68pj" Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.004002 4810 patch_prober.go:28] interesting pod/console-operator-58897d9998-h68pj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/readyz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.004065 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-h68pj" podUID="969b9cb6-f89f-47e9-b8f7-754804a41dea" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/readyz\": dial tcp 10.217.0.41:8443: connect: connection refused" Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.005899 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:58 crc kubenswrapper[4810]: E0219 15:11:58.007512 4810 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:58.507489226 +0000 UTC m=+147.989519350 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.020057 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kmhmh" event={"ID":"63674760-f499-49b8-a575-a8ae954eada4","Type":"ContainerStarted","Data":"8f55397a990d53e8fdcf63205f71dbdf1875eff2c346ffa6c66631e0b4440b65"} Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.022010 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2phqk" podStartSLOduration=125.021981935 podStartE2EDuration="2m5.021981935s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:57.997340927 +0000 UTC m=+147.479371051" watchObservedRunningTime="2026-02-19 15:11:58.021981935 +0000 UTC m=+147.504012059" Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.042888 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hlw9s" event={"ID":"59eb54a2-ac34-429b-a275-7d365be9ad2f","Type":"ContainerStarted","Data":"254dcc44696270546043618fad29adfb2ea027a838c257af261d9174dee95d07"} Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.048813 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hmxtq" podStartSLOduration=125.048773932 podStartE2EDuration="2m5.048773932s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:57.947864576 +0000 UTC m=+147.429894700" watchObservedRunningTime="2026-02-19 15:11:58.048773932 +0000 UTC m=+147.530804056" Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.055993 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525220-wqnml" podStartSLOduration=126.05597007 podStartE2EDuration="2m6.05597007s" podCreationTimestamp="2026-02-19 15:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:58.045358968 +0000 UTC m=+147.527389092" watchObservedRunningTime="2026-02-19 15:11:58.05597007 +0000 UTC m=+147.538000194" Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.074469 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cdfxh" event={"ID":"500c6ae7-fd1f-4af6-ad6c-f1db0c2af222","Type":"ContainerStarted","Data":"4b2c43301c34bb1c2be42ee14c12d2749dce24b2f61ffd4dcf0b7cbdb2ea4ec7"} Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 
15:11:58.084834 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth" Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.113292 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.114353 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9" podStartSLOduration=125.114304724 podStartE2EDuration="2m5.114304724s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:58.112539126 +0000 UTC m=+147.594569250" watchObservedRunningTime="2026-02-19 15:11:58.114304724 +0000 UTC m=+147.596334848" Feb 19 15:11:58 crc kubenswrapper[4810]: E0219 15:11:58.115365 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:58.615344673 +0000 UTC m=+148.097374797 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.191567 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-h68pj" podStartSLOduration=126.191551689 podStartE2EDuration="2m6.191551689s" podCreationTimestamp="2026-02-19 15:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:58.148059533 +0000 UTC m=+147.630089667" watchObservedRunningTime="2026-02-19 15:11:58.191551689 +0000 UTC m=+147.673581813" Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.193249 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-895xv" podStartSLOduration=125.193243426 podStartE2EDuration="2m5.193243426s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:58.189773021 +0000 UTC m=+147.671803145" watchObservedRunningTime="2026-02-19 15:11:58.193243426 +0000 UTC m=+147.675273550" Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.222672 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" 
(UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:58 crc kubenswrapper[4810]: E0219 15:11:58.223210 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:58.72319306 +0000 UTC m=+148.205223194 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.240434 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-19 15:06:57 +0000 UTC, rotation deadline is 2026-12-31 18:02:41.286319873 +0000 UTC Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.240492 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7562h50m43.045831627s for next certificate rotation Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.324339 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:11:58 crc kubenswrapper[4810]: E0219 15:11:58.324785 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:58.824764923 +0000 UTC m=+148.306795047 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.425753 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:58 crc kubenswrapper[4810]: E0219 15:11:58.426439 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:58.92642653 +0000 UTC m=+148.408456654 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.527389 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:11:58 crc kubenswrapper[4810]: E0219 15:11:58.527728 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:59.027705186 +0000 UTC m=+148.509735300 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.629746 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:58 crc kubenswrapper[4810]: E0219 15:11:58.630291 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:59.130268068 +0000 UTC m=+148.612298192 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.730988 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:11:58 crc kubenswrapper[4810]: E0219 15:11:58.731407 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:59.231355749 +0000 UTC m=+148.713385873 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.731583 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:58 crc kubenswrapper[4810]: E0219 15:11:58.732291 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:59.232284244 +0000 UTC m=+148.714314368 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.833131 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:11:58 crc kubenswrapper[4810]: E0219 15:11:58.833370 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:59.333293353 +0000 UTC m=+148.815323477 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.833941 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:58 crc kubenswrapper[4810]: E0219 15:11:58.834465 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:59.334448625 +0000 UTC m=+148.816478749 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.922721 4810 patch_prober.go:28] interesting pod/router-default-5444994796-hvw7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 15:11:58 crc kubenswrapper[4810]: [-]has-synced failed: reason withheld Feb 19 15:11:58 crc kubenswrapper[4810]: [+]process-running ok Feb 19 15:11:58 crc kubenswrapper[4810]: healthz check failed Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.922805 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hvw7f" podUID="76ffdcba-57d6-4636-8373-f088926a716d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.935193 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:11:58 crc kubenswrapper[4810]: E0219 15:11:58.935413 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:59.435376841 +0000 UTC m=+148.917406965 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:58 crc kubenswrapper[4810]: I0219 15:11:58.935598 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:58 crc kubenswrapper[4810]: E0219 15:11:58.935986 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:59.435979648 +0000 UTC m=+148.918009772 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.036593 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:11:59 crc kubenswrapper[4810]: E0219 15:11:59.037044 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:59.537024157 +0000 UTC m=+149.019054281 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.080981 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zgcrd" event={"ID":"9d9ff005-2d60-4eb5-b8fa-59b84661617f","Type":"ContainerStarted","Data":"a0af2a85460b19646d885a9256a84dadad96f69cd267f25a21c80efe282ec882"} Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.081033 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zgcrd" event={"ID":"9d9ff005-2d60-4eb5-b8fa-59b84661617f","Type":"ContainerStarted","Data":"ae85b1ddc4b1ab2da9cea5f9ee9ebd0a2db8c3b79c669a3398728442a02a3d05"} Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.081109 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-zgcrd" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.082619 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n6fl9" event={"ID":"9b8f83b3-fabb-4404-88ce-e64c4db8a568","Type":"ContainerStarted","Data":"205358139c66fa88dffe0a0748c08c915ce0d1c24490505b18156024f28936f7"} Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.084626 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-w6k69" event={"ID":"e3f898c2-3784-4157-b1c5-fdadcaf69bef","Type":"ContainerStarted","Data":"0d83aec25fc1a573e7c3e19ad4c8affa483842fbb3a3e191159c8cbc243fed32"} Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.084680 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-w6k69" event={"ID":"e3f898c2-3784-4157-b1c5-fdadcaf69bef","Type":"ContainerStarted","Data":"b38c1b3ffd216993301de29c67a678209dd2347fd0adc66bc7cbde507213a16b"} Feb 19 15:11:59 crc kubenswrapper[4810]: 
I0219 15:11:59.086666 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kmhmh" event={"ID":"63674760-f499-49b8-a575-a8ae954eada4","Type":"ContainerStarted","Data":"8f834404cc5083f567a6fe53c7623a57002a3c6eff3c4ebfdcf86c4bd07b136b"} Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.092371 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cdfxh" event={"ID":"500c6ae7-fd1f-4af6-ad6c-f1db0c2af222","Type":"ContainerStarted","Data":"7adc9b250a6a76b10499f674c3e12527dfa13fd141d2706c9ebf94e73a56c217"} Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.096558 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhzlb" event={"ID":"25e05c8a-335b-405c-9033-f689c21c5ecc","Type":"ContainerStarted","Data":"eb00d6cba86be6f8cddf9c0c8c12c82f35e2b4846b28e3717d2bda8599a88c6f"} Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.099195 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-65ltw" event={"ID":"5cc06c55-7085-4cc0-8399-833b4243b51e","Type":"ContainerStarted","Data":"a1237df0c53b58e31bf7fa911823c601de88872c3b524920384c6eddfa11fa5a"} Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.101306 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-25cmt" event={"ID":"bd317655-38ea-4fdb-95d0-82adc08456a8","Type":"ContainerStarted","Data":"cbe32080fb6211542039c554f671bd604c6b9c57b66f75dcdfda41ae07006341"} Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.101488 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-25cmt" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.102978 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4nds4" event={"ID":"51a1f271-446d-42d2-b946-ad816257e990","Type":"ContainerStarted","Data":"a12468bff98271bf07b1ca74b1418dfc217b671839f3b05aa88a63a026968cf3"} Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.104671 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-frmnw" event={"ID":"74295390-5384-402d-8c5b-dc2559bb6d9c","Type":"ContainerStarted","Data":"db08f4f480788aa31dc85e6273694f95382402643122d80f7d2b42f9a5302119"} Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.105956 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zgcrd" podStartSLOduration=8.105942693 podStartE2EDuration="8.105942693s" podCreationTimestamp="2026-02-19 15:11:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:59.104121633 +0000 UTC m=+148.586151757" watchObservedRunningTime="2026-02-19 15:11:59.105942693 +0000 UTC m=+148.587972807" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.106113 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-h68pj" event={"ID":"969b9cb6-f89f-47e9-b8f7-754804a41dea","Type":"ContainerStarted","Data":"fe78a3f6ab8ab3705b559ae0a675b50325af5e9b16ae7e704e9077c458264031"} Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.106759 4810 patch_prober.go:28] interesting 
pod/console-operator-58897d9998-h68pj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/readyz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.106814 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-h68pj" podUID="969b9cb6-f89f-47e9-b8f7-754804a41dea" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/readyz\": dial tcp 10.217.0.41:8443: connect: connection refused" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.107545 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2nn8b" event={"ID":"b1fae478-5e9a-4da1-b2e3-35c1ee7f8fa3","Type":"ContainerStarted","Data":"25b5d07e3dfc7cdea6a77c1f385d6d71ab82685362e3b633794ae509f5b6007d"} Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.107713 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2nn8b" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.109378 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lqqd" event={"ID":"026a2812-dc11-4b60-b911-bb41a0d39d7d","Type":"ContainerStarted","Data":"50e6ad8f65a86bcce9aca00f227e01c29e67e169768666571b50bcdceeff81d4"} Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.126988 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc" event={"ID":"c696fe96-0485-44d0-b4fb-161503c334e8","Type":"ContainerStarted","Data":"d40017474d6321ea79edca666f5d7c4fe5ef4acaa79e4cf58bd8ca4e34ac6557"} Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.130736 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-zrsn2" event={"ID":"13f4a10d-50aa-41ec-9931-cb835ba1f54c","Type":"ContainerStarted","Data":"80bf2fc295e5bc2097742d7714644f5e5d8647fe7c6739e1ddaf2d35ffcd233b"} Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.131856 4810 patch_prober.go:28] interesting pod/downloads-7954f5f757-dkppn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.131904 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dkppn" podUID="7a29951a-027e-49b4-a7ea-a8e363942414" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.132119 4810 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-r2tqm container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body= Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.132196 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-r2tqm" podUID="b8b1faba-e1b8-436c-aa84-ae4353c5f0a9" 
containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.143458 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:59 crc kubenswrapper[4810]: E0219 15:11:59.143946 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:59.643931929 +0000 UTC m=+149.125962053 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.144535 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-w6k69" podStartSLOduration=126.144515465 podStartE2EDuration="2m6.144515465s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:59.142454388 +0000 UTC m=+148.624484512" watchObservedRunningTime="2026-02-19 15:11:59.144515465 +0000 UTC m=+148.626545589" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.151701 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.151799 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2phqk" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.155515 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hmxtq" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.173883 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kmhmh" podStartSLOduration=126.173852862 podStartE2EDuration="2m6.173852862s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:59.172266168 +0000 UTC m=+148.654296292" watchObservedRunningTime="2026-02-19 15:11:59.173852862 +0000 UTC m=+148.655882986" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.191599 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mkt5x" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.251271 4810 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:11:59 crc kubenswrapper[4810]: E0219 15:11:59.252522 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:59.752490635 +0000 UTC m=+149.234520759 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.253285 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.286910 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-cdfxh" podStartSLOduration=8.286871321 podStartE2EDuration="8.286871321s" podCreationTimestamp="2026-02-19 15:11:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:59.230748107 +0000 UTC m=+148.712778231" watchObservedRunningTime="2026-02-19 15:11:59.286871321 +0000 UTC m=+148.768901445" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.288758 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-65ltw" podStartSLOduration=126.288747082 podStartE2EDuration="2m6.288747082s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:59.264064173 +0000 UTC m=+148.746094297" watchObservedRunningTime="2026-02-19 15:11:59.288747082 +0000 UTC m=+148.770777196" Feb 19 15:11:59 crc kubenswrapper[4810]: E0219 15:11:59.289701 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:11:59.789679748 +0000 UTC m=+149.271709872 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.356038 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n6fl9" podStartSLOduration=126.356003573 podStartE2EDuration="2m6.356003573s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:59.355318494 +0000 UTC m=+148.837348618" watchObservedRunningTime="2026-02-19 15:11:59.356003573 +0000 UTC m=+148.838033697" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.397725 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-25cmt" podStartSLOduration=127.3977067 podStartE2EDuration="2m7.3977067s" podCreationTimestamp="2026-02-19 15:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:59.395064587 +0000 UTC m=+148.877094711" watchObservedRunningTime="2026-02-19 15:11:59.3977067 +0000 UTC m=+148.879736824" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.415396 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.415755 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.415795 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.415827 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.415856 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" 
(UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:11:59 crc kubenswrapper[4810]: E0219 15:11:59.417180 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:11:59.917130944 +0000 UTC m=+149.399161058 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.418810 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.425625 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.432974 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc" podStartSLOduration=126.43295202 podStartE2EDuration="2m6.43295202s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:59.431713835 +0000 UTC m=+148.913743959" watchObservedRunningTime="2026-02-19 15:11:59.43295202 +0000 UTC m=+148.914982144" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.436845 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.455417 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.470746 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.482446 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.527268 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:59 crc kubenswrapper[4810]: E0219 15:11:59.527810 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:12:00.027789479 +0000 UTC m=+149.509819603 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.556696 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.614279 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-zrsn2" podStartSLOduration=126.614261117 podStartE2EDuration="2m6.614261117s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:59.564708924 +0000 UTC m=+149.046739038" watchObservedRunningTime="2026-02-19 15:11:59.614261117 +0000 UTC m=+149.096291241" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.629073 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:11:59 crc kubenswrapper[4810]: E0219 15:11:59.629698 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:12:00.129679822 +0000 UTC m=+149.611709946 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.653135 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-frmnw" podStartSLOduration=126.653113356 podStartE2EDuration="2m6.653113356s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:59.649827286 +0000 UTC m=+149.131857410" watchObservedRunningTime="2026-02-19 15:11:59.653113356 +0000 UTC m=+149.135143480" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.702979 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2nn8b" podStartSLOduration=126.702958708 podStartE2EDuration="2m6.702958708s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:59.694657229 +0000 UTC m=+149.176687353" watchObservedRunningTime="2026-02-19 15:11:59.702958708 +0000 UTC m=+149.184988832" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.741169 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:59 crc kubenswrapper[4810]: E0219 15:11:59.741537 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:12:00.241524318 +0000 UTC m=+149.723554442 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.827296 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lqqd" podStartSLOduration=126.827275177 podStartE2EDuration="2m6.827275177s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:11:59.82590776 +0000 UTC m=+149.307937894" watchObservedRunningTime="2026-02-19 15:11:59.827275177 +0000 UTC m=+149.309305301" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.842174 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:11:59 crc kubenswrapper[4810]: E0219 15:11:59.842582 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:12:00.342560148 +0000 UTC m=+149.824590272 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.920369 4810 patch_prober.go:28] interesting pod/router-default-5444994796-hvw7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 15:11:59 crc kubenswrapper[4810]: [-]has-synced failed: reason withheld Feb 19 15:11:59 crc kubenswrapper[4810]: [+]process-running ok Feb 19 15:11:59 crc kubenswrapper[4810]: healthz check failed Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.920441 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hvw7f" podUID="76ffdcba-57d6-4636-8373-f088926a716d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 15:11:59 crc kubenswrapper[4810]: I0219 15:11:59.944206 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:11:59 crc kubenswrapper[4810]: E0219 15:11:59.944589 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:12:00.444574264 +0000 UTC m=+149.926604388 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.046788 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:12:00 crc kubenswrapper[4810]: E0219 15:12:00.047161 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:12:00.547141716 +0000 UTC m=+150.029171840 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.134420 4810 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-r74mv container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.33:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.134489 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" podUID="c18fb461-ce5b-43ad-85ca-305c3f8a7d46" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.33:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.148456 4810 patch_prober.go:28] interesting pod/console-operator-58897d9998-h68pj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/readyz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.148518 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-h68pj" podUID="969b9cb6-f89f-47e9-b8f7-754804a41dea" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/readyz\": dial tcp 10.217.0.41:8443: connect: connection refused" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.149127 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:12:00 crc kubenswrapper[4810]: E0219 15:12:00.149482 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:12:00.649467311 +0000 UTC m=+150.131497435 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.250036 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:12:00 crc kubenswrapper[4810]: E0219 15:12:00.251647 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:12:00.751616511 +0000 UTC m=+150.233646635 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.353493 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:12:00 crc kubenswrapper[4810]: E0219 15:12:00.354358 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:12:00.854342647 +0000 UTC m=+150.336372771 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.447054 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rk4vw"] Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.448262 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rk4vw" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.457102 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:12:00 crc kubenswrapper[4810]: E0219 15:12:00.457587 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:12:00.957565057 +0000 UTC m=+150.439595181 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.471271 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rk4vw"] Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.471568 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.506868 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.562691 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bf3315d-3d2f-4aeb-b925-c3832e102e85-catalog-content\") pod \"certified-operators-rk4vw\" (UID: \"3bf3315d-3d2f-4aeb-b925-c3832e102e85\") " pod="openshift-marketplace/certified-operators-rk4vw" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.562810 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.562837 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bf3315d-3d2f-4aeb-b925-c3832e102e85-utilities\") pod \"certified-operators-rk4vw\" (UID: \"3bf3315d-3d2f-4aeb-b925-c3832e102e85\") " pod="openshift-marketplace/certified-operators-rk4vw" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.562871 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqzns\" (UniqueName: \"kubernetes.io/projected/3bf3315d-3d2f-4aeb-b925-c3832e102e85-kube-api-access-fqzns\") pod \"certified-operators-rk4vw\" (UID: \"3bf3315d-3d2f-4aeb-b925-c3832e102e85\") " 
pod="openshift-marketplace/certified-operators-rk4vw" Feb 19 15:12:00 crc kubenswrapper[4810]: E0219 15:12:00.569548 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:12:01.069534397 +0000 UTC m=+150.551564511 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:12:00 crc kubenswrapper[4810]: W0219 15:12:00.579500 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-570a8297512c532cdc027f9e5ab8b517aab8acf061999ac36570f8d6d0967282 WatchSource:0}: Error finding container 570a8297512c532cdc027f9e5ab8b517aab8acf061999ac36570f8d6d0967282: Status 404 returned error can't find the container with id 570a8297512c532cdc027f9e5ab8b517aab8acf061999ac36570f8d6d0967282 Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.615105 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d5ks5"] Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.616586 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d5ks5" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.626436 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.666539 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.666809 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bf3315d-3d2f-4aeb-b925-c3832e102e85-utilities\") pod \"certified-operators-rk4vw\" (UID: \"3bf3315d-3d2f-4aeb-b925-c3832e102e85\") " pod="openshift-marketplace/certified-operators-rk4vw" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.666847 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqzns\" (UniqueName: \"kubernetes.io/projected/3bf3315d-3d2f-4aeb-b925-c3832e102e85-kube-api-access-fqzns\") pod \"certified-operators-rk4vw\" (UID: \"3bf3315d-3d2f-4aeb-b925-c3832e102e85\") " pod="openshift-marketplace/certified-operators-rk4vw" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.666883 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bf3315d-3d2f-4aeb-b925-c3832e102e85-catalog-content\") pod \"certified-operators-rk4vw\" (UID: \"3bf3315d-3d2f-4aeb-b925-c3832e102e85\") " pod="openshift-marketplace/certified-operators-rk4vw" 
Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.667394 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bf3315d-3d2f-4aeb-b925-c3832e102e85-catalog-content\") pod \"certified-operators-rk4vw\" (UID: \"3bf3315d-3d2f-4aeb-b925-c3832e102e85\") " pod="openshift-marketplace/certified-operators-rk4vw" Feb 19 15:12:00 crc kubenswrapper[4810]: E0219 15:12:00.667483 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:12:01.167463911 +0000 UTC m=+150.649494025 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.667696 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bf3315d-3d2f-4aeb-b925-c3832e102e85-utilities\") pod \"certified-operators-rk4vw\" (UID: \"3bf3315d-3d2f-4aeb-b925-c3832e102e85\") " pod="openshift-marketplace/certified-operators-rk4vw" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.684601 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d5ks5"] Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.724573 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqzns\" (UniqueName: \"kubernetes.io/projected/3bf3315d-3d2f-4aeb-b925-c3832e102e85-kube-api-access-fqzns\") pod \"certified-operators-rk4vw\" (UID: \"3bf3315d-3d2f-4aeb-b925-c3832e102e85\") " pod="openshift-marketplace/certified-operators-rk4vw" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.769862 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a3d6b1f-2011-4f7f-bea0-1d303007fe41-utilities\") pod \"community-operators-d5ks5\" (UID: \"9a3d6b1f-2011-4f7f-bea0-1d303007fe41\") " pod="openshift-marketplace/community-operators-d5ks5" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.769936 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.769964 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqz6g\" (UniqueName: \"kubernetes.io/projected/9a3d6b1f-2011-4f7f-bea0-1d303007fe41-kube-api-access-cqz6g\") pod \"community-operators-d5ks5\" (UID: \"9a3d6b1f-2011-4f7f-bea0-1d303007fe41\") " pod="openshift-marketplace/community-operators-d5ks5" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.770008 4810 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a3d6b1f-2011-4f7f-bea0-1d303007fe41-catalog-content\") pod \"community-operators-d5ks5\" (UID: \"9a3d6b1f-2011-4f7f-bea0-1d303007fe41\") " pod="openshift-marketplace/community-operators-d5ks5" Feb 19 15:12:00 crc kubenswrapper[4810]: E0219 15:12:00.770352 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:12:01.270338211 +0000 UTC m=+150.752368335 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:12:00 crc kubenswrapper[4810]: W0219 15:12:00.791265 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-baad19be76aa3ec4b98ca06f5e19af7486a086e59048c2e0cba45eebcb02a1c0 WatchSource:0}: Error finding container baad19be76aa3ec4b98ca06f5e19af7486a086e59048c2e0cba45eebcb02a1c0: Status 404 returned error can't find the container with id baad19be76aa3ec4b98ca06f5e19af7486a086e59048c2e0cba45eebcb02a1c0 Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.802512 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-blpmq"] Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.806816 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-blpmq" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.836353 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-blpmq"] Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.868261 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rk4vw" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.873316 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.873593 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqz6g\" (UniqueName: \"kubernetes.io/projected/9a3d6b1f-2011-4f7f-bea0-1d303007fe41-kube-api-access-cqz6g\") pod \"community-operators-d5ks5\" (UID: \"9a3d6b1f-2011-4f7f-bea0-1d303007fe41\") " pod="openshift-marketplace/community-operators-d5ks5" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.873622 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4127fef2-ef2b-4cc4-967d-d52dac26f314-utilities\") pod \"certified-operators-blpmq\" (UID: \"4127fef2-ef2b-4cc4-967d-d52dac26f314\") " pod="openshift-marketplace/certified-operators-blpmq" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.873650 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4127fef2-ef2b-4cc4-967d-d52dac26f314-catalog-content\") pod \"certified-operators-blpmq\" (UID: \"4127fef2-ef2b-4cc4-967d-d52dac26f314\") " pod="openshift-marketplace/certified-operators-blpmq" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.873684 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a3d6b1f-2011-4f7f-bea0-1d303007fe41-catalog-content\") pod \"community-operators-d5ks5\" (UID: \"9a3d6b1f-2011-4f7f-bea0-1d303007fe41\") " pod="openshift-marketplace/community-operators-d5ks5" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.873721 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a3d6b1f-2011-4f7f-bea0-1d303007fe41-utilities\") pod \"community-operators-d5ks5\" (UID: \"9a3d6b1f-2011-4f7f-bea0-1d303007fe41\") " pod="openshift-marketplace/community-operators-d5ks5" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.873740 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thvr6\" (UniqueName: \"kubernetes.io/projected/4127fef2-ef2b-4cc4-967d-d52dac26f314-kube-api-access-thvr6\") pod \"certified-operators-blpmq\" (UID: \"4127fef2-ef2b-4cc4-967d-d52dac26f314\") " pod="openshift-marketplace/certified-operators-blpmq" Feb 19 15:12:00 crc kubenswrapper[4810]: E0219 15:12:00.873851 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:12:01.373832598 +0000 UTC m=+150.855862722 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.874494 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a3d6b1f-2011-4f7f-bea0-1d303007fe41-catalog-content\") pod \"community-operators-d5ks5\" (UID: \"9a3d6b1f-2011-4f7f-bea0-1d303007fe41\") " pod="openshift-marketplace/community-operators-d5ks5" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.874694 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a3d6b1f-2011-4f7f-bea0-1d303007fe41-utilities\") pod \"community-operators-d5ks5\" (UID: \"9a3d6b1f-2011-4f7f-bea0-1d303007fe41\") " pod="openshift-marketplace/community-operators-d5ks5" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.921473 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqz6g\" (UniqueName: \"kubernetes.io/projected/9a3d6b1f-2011-4f7f-bea0-1d303007fe41-kube-api-access-cqz6g\") pod \"community-operators-d5ks5\" (UID: \"9a3d6b1f-2011-4f7f-bea0-1d303007fe41\") " pod="openshift-marketplace/community-operators-d5ks5" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.928800 4810 patch_prober.go:28] interesting pod/router-default-5444994796-hvw7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 15:12:00 crc kubenswrapper[4810]: [-]has-synced failed: reason withheld Feb 19 15:12:00 crc kubenswrapper[4810]: [+]process-running ok Feb 19 15:12:00 crc kubenswrapper[4810]: healthz check failed Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.928856 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hvw7f" podUID="76ffdcba-57d6-4636-8373-f088926a716d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.976506 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thvr6\" (UniqueName: \"kubernetes.io/projected/4127fef2-ef2b-4cc4-967d-d52dac26f314-kube-api-access-thvr6\") pod \"certified-operators-blpmq\" (UID: \"4127fef2-ef2b-4cc4-967d-d52dac26f314\") " pod="openshift-marketplace/certified-operators-blpmq" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.976571 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.976599 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4127fef2-ef2b-4cc4-967d-d52dac26f314-utilities\") 
pod \"certified-operators-blpmq\" (UID: \"4127fef2-ef2b-4cc4-967d-d52dac26f314\") " pod="openshift-marketplace/certified-operators-blpmq" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.976631 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4127fef2-ef2b-4cc4-967d-d52dac26f314-catalog-content\") pod \"certified-operators-blpmq\" (UID: \"4127fef2-ef2b-4cc4-967d-d52dac26f314\") " pod="openshift-marketplace/certified-operators-blpmq" Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.977017 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4127fef2-ef2b-4cc4-967d-d52dac26f314-catalog-content\") pod \"certified-operators-blpmq\" (UID: \"4127fef2-ef2b-4cc4-967d-d52dac26f314\") " pod="openshift-marketplace/certified-operators-blpmq" Feb 19 15:12:00 crc kubenswrapper[4810]: E0219 15:12:00.977026 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:12:01.477006917 +0000 UTC m=+150.959037041 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:12:00 crc kubenswrapper[4810]: I0219 15:12:00.977271 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4127fef2-ef2b-4cc4-967d-d52dac26f314-utilities\") pod \"certified-operators-blpmq\" (UID: \"4127fef2-ef2b-4cc4-967d-d52dac26f314\") " pod="openshift-marketplace/certified-operators-blpmq" Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.000575 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x8sn2"] Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.001962 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x8sn2" Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.019783 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d5ks5" Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.052905 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thvr6\" (UniqueName: \"kubernetes.io/projected/4127fef2-ef2b-4cc4-967d-d52dac26f314-kube-api-access-thvr6\") pod \"certified-operators-blpmq\" (UID: \"4127fef2-ef2b-4cc4-967d-d52dac26f314\") " pod="openshift-marketplace/certified-operators-blpmq" Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.053236 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x8sn2"] Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.089900 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.090141 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc8ce195-1fe1-4684-8172-e710b3552fb5-catalog-content\") pod \"community-operators-x8sn2\" (UID: \"cc8ce195-1fe1-4684-8172-e710b3552fb5\") " pod="openshift-marketplace/community-operators-x8sn2" Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.090173 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc8ce195-1fe1-4684-8172-e710b3552fb5-utilities\") pod \"community-operators-x8sn2\" (UID: \"cc8ce195-1fe1-4684-8172-e710b3552fb5\") " pod="openshift-marketplace/community-operators-x8sn2" Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.090205 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xzxn\" (UniqueName: \"kubernetes.io/projected/cc8ce195-1fe1-4684-8172-e710b3552fb5-kube-api-access-4xzxn\") pod \"community-operators-x8sn2\" (UID: \"cc8ce195-1fe1-4684-8172-e710b3552fb5\") " pod="openshift-marketplace/community-operators-x8sn2" Feb 19 15:12:01 crc kubenswrapper[4810]: E0219 15:12:01.090311 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:12:01.590291593 +0000 UTC m=+151.072321717 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.135587 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-blpmq" Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.192011 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc8ce195-1fe1-4684-8172-e710b3552fb5-catalog-content\") pod \"community-operators-x8sn2\" (UID: \"cc8ce195-1fe1-4684-8172-e710b3552fb5\") " pod="openshift-marketplace/community-operators-x8sn2" Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.192062 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc8ce195-1fe1-4684-8172-e710b3552fb5-utilities\") pod \"community-operators-x8sn2\" (UID: \"cc8ce195-1fe1-4684-8172-e710b3552fb5\") " pod="openshift-marketplace/community-operators-x8sn2" Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.192086 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.192113 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xzxn\" (UniqueName: \"kubernetes.io/projected/cc8ce195-1fe1-4684-8172-e710b3552fb5-kube-api-access-4xzxn\") pod \"community-operators-x8sn2\" (UID: \"cc8ce195-1fe1-4684-8172-e710b3552fb5\") " pod="openshift-marketplace/community-operators-x8sn2" Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.192818 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc8ce195-1fe1-4684-8172-e710b3552fb5-catalog-content\") pod \"community-operators-x8sn2\" (UID: \"cc8ce195-1fe1-4684-8172-e710b3552fb5\") " pod="openshift-marketplace/community-operators-x8sn2" Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.193032 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc8ce195-1fe1-4684-8172-e710b3552fb5-utilities\") pod \"community-operators-x8sn2\" (UID: \"cc8ce195-1fe1-4684-8172-e710b3552fb5\") " pod="openshift-marketplace/community-operators-x8sn2" Feb 19 15:12:01 crc kubenswrapper[4810]: E0219 15:12:01.193261 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:12:01.693251075 +0000 UTC m=+151.175281199 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.194547 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"93d461b585c01439340c8c9488368a4a7839b7c75ac33dcc961a4492e523cd98"} Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.194610 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"570a8297512c532cdc027f9e5ab8b517aab8acf061999ac36570f8d6d0967282"} Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.208126 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"baad19be76aa3ec4b98ca06f5e19af7486a086e59048c2e0cba45eebcb02a1c0"} Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.209551 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4nds4" event={"ID":"51a1f271-446d-42d2-b946-ad816257e990","Type":"ContainerStarted","Data":"d0072ff36543b17d469c813991b0fb32b7a39ed74b27ca4f0397f845c9c9b41b"} Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.214468 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"07af0fda8448a35fda3d865973906afaea74b6956c2198bcbe319d672854e33f"} Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.214515 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"aa410fc7de0d5cf567d6d79f919b256f04936d22b1fff2a8b2883c5a0cba291b"} Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.215038 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.246067 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xzxn\" (UniqueName: \"kubernetes.io/projected/cc8ce195-1fe1-4684-8172-e710b3552fb5-kube-api-access-4xzxn\") pod \"community-operators-x8sn2\" (UID: \"cc8ce195-1fe1-4684-8172-e710b3552fb5\") " pod="openshift-marketplace/community-operators-x8sn2" Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.294968 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:12:01 crc kubenswrapper[4810]: E0219 15:12:01.296475 4810 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:12:01.796451045 +0000 UTC m=+151.278481169 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.352861 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x8sn2" Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.397882 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:12:01 crc kubenswrapper[4810]: E0219 15:12:01.398201 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:12:01.898189083 +0000 UTC m=+151.380219207 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.500876 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:12:01 crc kubenswrapper[4810]: E0219 15:12:01.501703 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:12:02.001685321 +0000 UTC m=+151.483715445 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.588848 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rk4vw"] Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.603853 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:12:01 crc kubenswrapper[4810]: E0219 15:12:01.604201 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:12:02.10418845 +0000 UTC m=+151.586218574 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.704762 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:12:01 crc kubenswrapper[4810]: E0219 15:12:01.705233 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:12:02.20521396 +0000 UTC m=+151.687244084 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.744129 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d5ks5"] Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.808577 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:12:01 crc kubenswrapper[4810]: E0219 15:12:01.809096 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:12:02.309077077 +0000 UTC m=+151.791107211 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.874592 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-blpmq"] Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.900663 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.901717 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.909346 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:12:01 crc kubenswrapper[4810]: E0219 15:12:01.909580 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:12:02.40955116 +0000 UTC m=+151.891581284 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.909703 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:12:01 crc kubenswrapper[4810]: E0219 15:12:01.910060 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:12:02.410048464 +0000 UTC m=+151.892078588 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.910550 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.911230 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.915464 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.927485 4810 patch_prober.go:28] interesting pod/router-default-5444994796-hvw7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 15:12:01 crc kubenswrapper[4810]: [-]has-synced failed: reason withheld Feb 19 15:12:01 crc kubenswrapper[4810]: [+]process-running ok Feb 19 15:12:01 crc kubenswrapper[4810]: healthz check failed Feb 19 15:12:01 crc kubenswrapper[4810]: I0219 15:12:01.927535 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hvw7f" podUID="76ffdcba-57d6-4636-8373-f088926a716d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.010528 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:12:02 crc kubenswrapper[4810]: 
I0219 15:12:02.010818 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9bf4faab-dab6-498d-bf73-7a740448c64b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9bf4faab-dab6-498d-bf73-7a740448c64b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.010859 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9bf4faab-dab6-498d-bf73-7a740448c64b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9bf4faab-dab6-498d-bf73-7a740448c64b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 15:12:02 crc kubenswrapper[4810]: E0219 15:12:02.011007 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:12:02.510989211 +0000 UTC m=+151.993019335 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.113892 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9bf4faab-dab6-498d-bf73-7a740448c64b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9bf4faab-dab6-498d-bf73-7a740448c64b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.114415 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.114438 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9bf4faab-dab6-498d-bf73-7a740448c64b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9bf4faab-dab6-498d-bf73-7a740448c64b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.114886 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9bf4faab-dab6-498d-bf73-7a740448c64b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9bf4faab-dab6-498d-bf73-7a740448c64b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 15:12:02 crc kubenswrapper[4810]: E0219 15:12:02.115221 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-19 15:12:02.615207608 +0000 UTC m=+152.097237732 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.170001 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9bf4faab-dab6-498d-bf73-7a740448c64b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9bf4faab-dab6-498d-bf73-7a740448c64b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.217931 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:12:02 crc kubenswrapper[4810]: E0219 15:12:02.218259 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:12:02.718242232 +0000 UTC m=+152.200272356 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.234950 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4nds4" event={"ID":"51a1f271-446d-42d2-b946-ad816257e990","Type":"ContainerStarted","Data":"399d89cabfd6cf6cada79838bfbcfea0a2a7903c17e611bc093418cf9fe0ed03"} Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.236984 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5ks5" event={"ID":"9a3d6b1f-2011-4f7f-bea0-1d303007fe41","Type":"ContainerStarted","Data":"1962f43ea8735830650ac9b311ce674cd5cebcb42c9922bc390ae19775d9f9f0"} Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.237023 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5ks5" event={"ID":"9a3d6b1f-2011-4f7f-bea0-1d303007fe41","Type":"ContainerStarted","Data":"c3d08bc3ddaa041e0392052ef7f026d6557f47271ad22913b60948a058b74b85"} Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.240774 4810 generic.go:334] "Generic (PLEG): container finished" podID="8383a9e3-149b-4512-a9fd-12cd0b65e370" containerID="c19d21352b758655d40944eafe4d1d6cfee80125c13e6f74424f93ccd9aee7cf" exitCode=0 Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.240826 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525220-wqnml" event={"ID":"8383a9e3-149b-4512-a9fd-12cd0b65e370","Type":"ContainerDied","Data":"c19d21352b758655d40944eafe4d1d6cfee80125c13e6f74424f93ccd9aee7cf"} Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.242091 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-blpmq" event={"ID":"4127fef2-ef2b-4cc4-967d-d52dac26f314","Type":"ContainerStarted","Data":"4442c4b83b02a776b63ed28285fde96beca3df86a20a033d8feb27311a4298e1"} Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.243105 4810 generic.go:334] "Generic (PLEG): container finished" podID="3bf3315d-3d2f-4aeb-b925-c3832e102e85" containerID="dca6d9a99a30ff4bef03f7e86c179f1a1309a876ad27e10ee78ace680fa82510" exitCode=0 Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.243146 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rk4vw" event={"ID":"3bf3315d-3d2f-4aeb-b925-c3832e102e85","Type":"ContainerDied","Data":"dca6d9a99a30ff4bef03f7e86c179f1a1309a876ad27e10ee78ace680fa82510"} Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.243174 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rk4vw" event={"ID":"3bf3315d-3d2f-4aeb-b925-c3832e102e85","Type":"ContainerStarted","Data":"2a93b7168fafbe84b16d4aeee817860063438d02590293e9edf6bad1699c168a"} Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.244442 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.257276 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"72d421e29d3b69f3ec6be0bc92465003b9eef087291dccffe7e971f2ede9604f"} Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.268689 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.306506 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x8sn2"] Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.324082 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:12:02 crc kubenswrapper[4810]: E0219 15:12:02.327705 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:12:02.827668342 +0000 UTC m=+152.309698466 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.396272 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ptbh9"] Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.398893 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ptbh9" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.404926 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.418491 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptbh9"] Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.447386 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.447653 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbxf5\" (UniqueName: \"kubernetes.io/projected/7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53-kube-api-access-gbxf5\") pod \"redhat-marketplace-ptbh9\" (UID: \"7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53\") " pod="openshift-marketplace/redhat-marketplace-ptbh9" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.447694 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53-utilities\") pod \"redhat-marketplace-ptbh9\" (UID: \"7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53\") " pod="openshift-marketplace/redhat-marketplace-ptbh9" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.447732 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53-catalog-content\") pod \"redhat-marketplace-ptbh9\" (UID: \"7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53\") " pod="openshift-marketplace/redhat-marketplace-ptbh9" Feb 19 15:12:02 crc kubenswrapper[4810]: E0219 15:12:02.447847 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 15:12:02.947831448 +0000 UTC m=+152.429861572 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.548872 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbxf5\" (UniqueName: \"kubernetes.io/projected/7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53-kube-api-access-gbxf5\") pod \"redhat-marketplace-ptbh9\" (UID: \"7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53\") " pod="openshift-marketplace/redhat-marketplace-ptbh9" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.549221 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53-utilities\") pod \"redhat-marketplace-ptbh9\" (UID: \"7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53\") " pod="openshift-marketplace/redhat-marketplace-ptbh9" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.549258 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53-catalog-content\") pod \"redhat-marketplace-ptbh9\" (UID: \"7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53\") " pod="openshift-marketplace/redhat-marketplace-ptbh9" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.549304 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:12:02 crc kubenswrapper[4810]: E0219 15:12:02.549663 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 15:12:03.049642229 +0000 UTC m=+152.531672353 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7bnq2" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.550026 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53-utilities\") pod \"redhat-marketplace-ptbh9\" (UID: \"7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53\") " pod="openshift-marketplace/redhat-marketplace-ptbh9" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.550373 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53-catalog-content\") pod \"redhat-marketplace-ptbh9\" (UID: \"7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53\") " pod="openshift-marketplace/redhat-marketplace-ptbh9" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.572987 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbxf5\" (UniqueName: \"kubernetes.io/projected/7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53-kube-api-access-gbxf5\") pod \"redhat-marketplace-ptbh9\" (UID: \"7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53\") " pod="openshift-marketplace/redhat-marketplace-ptbh9" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.603406 4810 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.622902 4810 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-19T15:12:02.603444679Z","Handler":null,"Name":""} Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.639584 4810 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.639971 4810 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.651986 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.673672 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.725663 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-25cmt" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.726830 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ptbh9" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.730604 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-5d4rp" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.739029 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-5d4rp" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.762632 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.831418 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.832870 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dfqg8"] Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.868220 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dfqg8" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.903161 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dfqg8"] Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.945210 4810 patch_prober.go:28] interesting pod/router-default-5444994796-hvw7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 15:12:02 crc kubenswrapper[4810]: [-]has-synced failed: reason withheld Feb 19 15:12:02 crc kubenswrapper[4810]: [+]process-running ok Feb 19 15:12:02 crc kubenswrapper[4810]: healthz check failed Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.945276 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hvw7f" podUID="76ffdcba-57d6-4636-8373-f088926a716d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.969058 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4nld\" (UniqueName: \"kubernetes.io/projected/5c654206-f2d0-4b40-9df0-577dbf27e5e4-kube-api-access-r4nld\") pod \"redhat-marketplace-dfqg8\" (UID: \"5c654206-f2d0-4b40-9df0-577dbf27e5e4\") " pod="openshift-marketplace/redhat-marketplace-dfqg8" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.969141 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c654206-f2d0-4b40-9df0-577dbf27e5e4-catalog-content\") pod 
\"redhat-marketplace-dfqg8\" (UID: \"5c654206-f2d0-4b40-9df0-577dbf27e5e4\") " pod="openshift-marketplace/redhat-marketplace-dfqg8" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.969196 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c654206-f2d0-4b40-9df0-577dbf27e5e4-utilities\") pod \"redhat-marketplace-dfqg8\" (UID: \"5c654206-f2d0-4b40-9df0-577dbf27e5e4\") " pod="openshift-marketplace/redhat-marketplace-dfqg8" Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.976344 4810 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 15:12:02 crc kubenswrapper[4810]: I0219 15:12:02.976384 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.072044 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c654206-f2d0-4b40-9df0-577dbf27e5e4-utilities\") pod \"redhat-marketplace-dfqg8\" (UID: \"5c654206-f2d0-4b40-9df0-577dbf27e5e4\") " pod="openshift-marketplace/redhat-marketplace-dfqg8" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.072111 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4nld\" (UniqueName: \"kubernetes.io/projected/5c654206-f2d0-4b40-9df0-577dbf27e5e4-kube-api-access-r4nld\") pod \"redhat-marketplace-dfqg8\" (UID: \"5c654206-f2d0-4b40-9df0-577dbf27e5e4\") " pod="openshift-marketplace/redhat-marketplace-dfqg8" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.072155 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c654206-f2d0-4b40-9df0-577dbf27e5e4-catalog-content\") pod \"redhat-marketplace-dfqg8\" (UID: \"5c654206-f2d0-4b40-9df0-577dbf27e5e4\") " pod="openshift-marketplace/redhat-marketplace-dfqg8" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.072686 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c654206-f2d0-4b40-9df0-577dbf27e5e4-catalog-content\") pod \"redhat-marketplace-dfqg8\" (UID: \"5c654206-f2d0-4b40-9df0-577dbf27e5e4\") " pod="openshift-marketplace/redhat-marketplace-dfqg8" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.072919 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c654206-f2d0-4b40-9df0-577dbf27e5e4-utilities\") pod \"redhat-marketplace-dfqg8\" (UID: \"5c654206-f2d0-4b40-9df0-577dbf27e5e4\") " pod="openshift-marketplace/redhat-marketplace-dfqg8" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.125409 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4nld\" (UniqueName: 
\"kubernetes.io/projected/5c654206-f2d0-4b40-9df0-577dbf27e5e4-kube-api-access-r4nld\") pod \"redhat-marketplace-dfqg8\" (UID: \"5c654206-f2d0-4b40-9df0-577dbf27e5e4\") " pod="openshift-marketplace/redhat-marketplace-dfqg8" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.183876 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7bnq2\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.219836 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dfqg8" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.283529 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptbh9"] Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.303762 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.304838 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.320172 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.320250 4810 generic.go:334] "Generic (PLEG): container finished" podID="cc8ce195-1fe1-4684-8172-e710b3552fb5" containerID="5c922635ea487f2a0968c5eace1227e4dcace0fce70a7ca4e66511c946968284" exitCode=0 Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.320393 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x8sn2" event={"ID":"cc8ce195-1fe1-4684-8172-e710b3552fb5","Type":"ContainerDied","Data":"5c922635ea487f2a0968c5eace1227e4dcace0fce70a7ca4e66511c946968284"} Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.320473 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x8sn2" event={"ID":"cc8ce195-1fe1-4684-8172-e710b3552fb5","Type":"ContainerStarted","Data":"b765fd7f2fdc89df238064ee08672584bc5ff6e6cd24e9a1fe430dad87064297"} Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.325150 4810 generic.go:334] "Generic (PLEG): container finished" podID="4127fef2-ef2b-4cc4-967d-d52dac26f314" containerID="01e828c8531030a2de34cc2f43410bf4a69a3af4e19f1adb2dbd098f1f78eca6" exitCode=0 Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.325370 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-blpmq" event={"ID":"4127fef2-ef2b-4cc4-967d-d52dac26f314","Type":"ContainerDied","Data":"01e828c8531030a2de34cc2f43410bf4a69a3af4e19f1adb2dbd098f1f78eca6"} Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.337140 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9bf4faab-dab6-498d-bf73-7a740448c64b","Type":"ContainerStarted","Data":"b81cbe17b0c85b33afea4e73bfe0c252ec94240f8fe691abb4c277e1104d95b3"} Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.340176 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-4nds4" event={"ID":"51a1f271-446d-42d2-b946-ad816257e990","Type":"ContainerStarted","Data":"efb67ae02881bb15abecd46f79301f6c12b8a8783efc2371e4daa8fc6157156e"} Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.352253 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.373454 4810 generic.go:334] "Generic (PLEG): container finished" podID="9a3d6b1f-2011-4f7f-bea0-1d303007fe41" containerID="1962f43ea8735830650ac9b311ce674cd5cebcb42c9922bc390ae19775d9f9f0" exitCode=0 Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.374823 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5ks5" event={"ID":"9a3d6b1f-2011-4f7f-bea0-1d303007fe41","Type":"ContainerDied","Data":"1962f43ea8735830650ac9b311ce674cd5cebcb42c9922bc390ae19775d9f9f0"} Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.390541 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-4nds4" podStartSLOduration=12.390520801 podStartE2EDuration="12.390520801s" podCreationTimestamp="2026-02-19 15:11:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:12:03.371704084 +0000 UTC m=+152.853734208" watchObservedRunningTime="2026-02-19 15:12:03.390520801 +0000 UTC m=+152.872550925" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.532046 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.582947 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-4hddt" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.583475 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-4hddt" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.585609 4810 patch_prober.go:28] interesting pod/console-f9d7485db-4hddt container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.585734 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-4hddt" podUID="362cd55c-b576-44bd-843c-078bf26b3b1e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.586949 4810 patch_prober.go:28] interesting pod/downloads-7954f5f757-dkppn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.586987 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dkppn" podUID="7a29951a-027e-49b4-a7ea-a8e363942414" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection 
refused" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.587070 4810 patch_prober.go:28] interesting pod/downloads-7954f5f757-dkppn container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.587086 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dkppn" podUID="7a29951a-027e-49b4-a7ea-a8e363942414" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.610792 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gp8sg"] Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.613413 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gp8sg" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.619974 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.623208 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gp8sg"] Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.673197 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dfqg8"] Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.686435 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3146bc9a-c4fc-4aa1-acae-032db4aa0582-catalog-content\") pod \"redhat-operators-gp8sg\" (UID: \"3146bc9a-c4fc-4aa1-acae-032db4aa0582\") " pod="openshift-marketplace/redhat-operators-gp8sg" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.686535 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b45t8\" (UniqueName: \"kubernetes.io/projected/3146bc9a-c4fc-4aa1-acae-032db4aa0582-kube-api-access-b45t8\") pod \"redhat-operators-gp8sg\" (UID: \"3146bc9a-c4fc-4aa1-acae-032db4aa0582\") " pod="openshift-marketplace/redhat-operators-gp8sg" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.686561 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3146bc9a-c4fc-4aa1-acae-032db4aa0582-utilities\") pod \"redhat-operators-gp8sg\" (UID: \"3146bc9a-c4fc-4aa1-acae-032db4aa0582\") " pod="openshift-marketplace/redhat-operators-gp8sg" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.789162 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3146bc9a-c4fc-4aa1-acae-032db4aa0582-catalog-content\") pod \"redhat-operators-gp8sg\" (UID: \"3146bc9a-c4fc-4aa1-acae-032db4aa0582\") " pod="openshift-marketplace/redhat-operators-gp8sg" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.789724 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b45t8\" (UniqueName: \"kubernetes.io/projected/3146bc9a-c4fc-4aa1-acae-032db4aa0582-kube-api-access-b45t8\") pod \"redhat-operators-gp8sg\" (UID: 
\"3146bc9a-c4fc-4aa1-acae-032db4aa0582\") " pod="openshift-marketplace/redhat-operators-gp8sg" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.790963 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3146bc9a-c4fc-4aa1-acae-032db4aa0582-utilities\") pod \"redhat-operators-gp8sg\" (UID: \"3146bc9a-c4fc-4aa1-acae-032db4aa0582\") " pod="openshift-marketplace/redhat-operators-gp8sg" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.792011 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3146bc9a-c4fc-4aa1-acae-032db4aa0582-utilities\") pod \"redhat-operators-gp8sg\" (UID: \"3146bc9a-c4fc-4aa1-acae-032db4aa0582\") " pod="openshift-marketplace/redhat-operators-gp8sg" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.792728 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3146bc9a-c4fc-4aa1-acae-032db4aa0582-catalog-content\") pod \"redhat-operators-gp8sg\" (UID: \"3146bc9a-c4fc-4aa1-acae-032db4aa0582\") " pod="openshift-marketplace/redhat-operators-gp8sg" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.851956 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b45t8\" (UniqueName: \"kubernetes.io/projected/3146bc9a-c4fc-4aa1-acae-032db4aa0582-kube-api-access-b45t8\") pod \"redhat-operators-gp8sg\" (UID: \"3146bc9a-c4fc-4aa1-acae-032db4aa0582\") " pod="openshift-marketplace/redhat-operators-gp8sg" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.916363 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-hvw7f" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.923370 4810 patch_prober.go:28] interesting pod/router-default-5444994796-hvw7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 15:12:03 crc kubenswrapper[4810]: [-]has-synced failed: reason withheld Feb 19 15:12:03 crc kubenswrapper[4810]: [+]process-running ok Feb 19 15:12:03 crc kubenswrapper[4810]: healthz check failed Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.923423 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hvw7f" podUID="76ffdcba-57d6-4636-8373-f088926a716d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.933951 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525220-wqnml" Feb 19 15:12:03 crc kubenswrapper[4810]: I0219 15:12:03.979304 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gp8sg" Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.014071 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8383a9e3-149b-4512-a9fd-12cd0b65e370-secret-volume\") pod \"8383a9e3-149b-4512-a9fd-12cd0b65e370\" (UID: \"8383a9e3-149b-4512-a9fd-12cd0b65e370\") " Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.014124 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdm7r\" (UniqueName: \"kubernetes.io/projected/8383a9e3-149b-4512-a9fd-12cd0b65e370-kube-api-access-jdm7r\") pod \"8383a9e3-149b-4512-a9fd-12cd0b65e370\" (UID: \"8383a9e3-149b-4512-a9fd-12cd0b65e370\") " Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.014151 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8383a9e3-149b-4512-a9fd-12cd0b65e370-config-volume\") pod \"8383a9e3-149b-4512-a9fd-12cd0b65e370\" (UID: \"8383a9e3-149b-4512-a9fd-12cd0b65e370\") " Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.024639 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8383a9e3-149b-4512-a9fd-12cd0b65e370-kube-api-access-jdm7r" (OuterVolumeSpecName: "kube-api-access-jdm7r") pod "8383a9e3-149b-4512-a9fd-12cd0b65e370" (UID: "8383a9e3-149b-4512-a9fd-12cd0b65e370"). InnerVolumeSpecName "kube-api-access-jdm7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.024727 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8383a9e3-149b-4512-a9fd-12cd0b65e370-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8383a9e3-149b-4512-a9fd-12cd0b65e370" (UID: "8383a9e3-149b-4512-a9fd-12cd0b65e370"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.027255 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8383a9e3-149b-4512-a9fd-12cd0b65e370-config-volume" (OuterVolumeSpecName: "config-volume") pod "8383a9e3-149b-4512-a9fd-12cd0b65e370" (UID: "8383a9e3-149b-4512-a9fd-12cd0b65e370"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.028577 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v87ss"] Feb 19 15:12:04 crc kubenswrapper[4810]: E0219 15:12:04.028868 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8383a9e3-149b-4512-a9fd-12cd0b65e370" containerName="collect-profiles" Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.028890 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8383a9e3-149b-4512-a9fd-12cd0b65e370" containerName="collect-profiles" Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.029027 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8383a9e3-149b-4512-a9fd-12cd0b65e370" containerName="collect-profiles" Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.030001 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v87ss" Feb 19 15:12:04 crc kubenswrapper[4810]: W0219 15:12:04.046180 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6cb5092_5f01_4dd9_a940_804d88907744.slice/crio-958cccf7fb34847e1b79d6753f3c0b3ac89a306a0da6aa21a70c7c233870987b WatchSource:0}: Error finding container 958cccf7fb34847e1b79d6753f3c0b3ac89a306a0da6aa21a70c7c233870987b: Status 404 returned error can't find the container with id 958cccf7fb34847e1b79d6753f3c0b3ac89a306a0da6aa21a70c7c233870987b Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.055564 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7bnq2"] Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.061425 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v87ss"] Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.115414 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbjkj\" (UniqueName: \"kubernetes.io/projected/ee54de34-1c90-401d-8102-2cc1e4116661-kube-api-access-fbjkj\") pod \"redhat-operators-v87ss\" (UID: \"ee54de34-1c90-401d-8102-2cc1e4116661\") " pod="openshift-marketplace/redhat-operators-v87ss" Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.115500 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee54de34-1c90-401d-8102-2cc1e4116661-catalog-content\") pod \"redhat-operators-v87ss\" (UID: \"ee54de34-1c90-401d-8102-2cc1e4116661\") " pod="openshift-marketplace/redhat-operators-v87ss" Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.115537 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee54de34-1c90-401d-8102-2cc1e4116661-utilities\") pod \"redhat-operators-v87ss\" (UID: \"ee54de34-1c90-401d-8102-2cc1e4116661\") " pod="openshift-marketplace/redhat-operators-v87ss" Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.115578 4810 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8383a9e3-149b-4512-a9fd-12cd0b65e370-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.115592 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdm7r\" (UniqueName: \"kubernetes.io/projected/8383a9e3-149b-4512-a9fd-12cd0b65e370-kube-api-access-jdm7r\") on node \"crc\" DevicePath \"\"" Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.115600 4810 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8383a9e3-149b-4512-a9fd-12cd0b65e370-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.216743 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbjkj\" (UniqueName: \"kubernetes.io/projected/ee54de34-1c90-401d-8102-2cc1e4116661-kube-api-access-fbjkj\") pod \"redhat-operators-v87ss\" (UID: \"ee54de34-1c90-401d-8102-2cc1e4116661\") " pod="openshift-marketplace/redhat-operators-v87ss" Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.216843 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee54de34-1c90-401d-8102-2cc1e4116661-catalog-content\") pod \"redhat-operators-v87ss\" (UID: \"ee54de34-1c90-401d-8102-2cc1e4116661\") " pod="openshift-marketplace/redhat-operators-v87ss" Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.217021 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee54de34-1c90-401d-8102-2cc1e4116661-utilities\") pod \"redhat-operators-v87ss\" (UID: \"ee54de34-1c90-401d-8102-2cc1e4116661\") " pod="openshift-marketplace/redhat-operators-v87ss" Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.217916 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee54de34-1c90-401d-8102-2cc1e4116661-utilities\") pod \"redhat-operators-v87ss\" (UID: \"ee54de34-1c90-401d-8102-2cc1e4116661\") " pod="openshift-marketplace/redhat-operators-v87ss" Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.218179 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee54de34-1c90-401d-8102-2cc1e4116661-catalog-content\") pod \"redhat-operators-v87ss\" (UID: \"ee54de34-1c90-401d-8102-2cc1e4116661\") " pod="openshift-marketplace/redhat-operators-v87ss" Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.246436 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbjkj\" (UniqueName: \"kubernetes.io/projected/ee54de34-1c90-401d-8102-2cc1e4116661-kube-api-access-fbjkj\") pod \"redhat-operators-v87ss\" (UID: \"ee54de34-1c90-401d-8102-2cc1e4116661\") " pod="openshift-marketplace/redhat-operators-v87ss" Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.320594 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gp8sg"] Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.352156 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v87ss" Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.378612 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-r2tqm" Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.393404 4810 generic.go:334] "Generic (PLEG): container finished" podID="7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53" containerID="50a196ba034be9702770fc3245e22281a78913a08bd60f8b507db74c90490792" exitCode=0 Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.393521 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptbh9" event={"ID":"7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53","Type":"ContainerDied","Data":"50a196ba034be9702770fc3245e22281a78913a08bd60f8b507db74c90490792"} Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.393550 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptbh9" event={"ID":"7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53","Type":"ContainerStarted","Data":"585fb3dde24254534020203ea59c98681809ba91054fff093daf819845736af6"} Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.399316 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525220-wqnml" event={"ID":"8383a9e3-149b-4512-a9fd-12cd0b65e370","Type":"ContainerDied","Data":"28491df923b9f9d021efc8a5dd0472dd72e0e7e179d81a8ba24dae22fa983519"} Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.399360 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28491df923b9f9d021efc8a5dd0472dd72e0e7e179d81a8ba24dae22fa983519" Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.399377 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525220-wqnml" Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.419630 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-h68pj" Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.439399 4810 generic.go:334] "Generic (PLEG): container finished" podID="5c654206-f2d0-4b40-9df0-577dbf27e5e4" containerID="ae8bc998d14d3fe5b46f631e4cd6f287af277e6334648d3823ae6a448b5c6c06" exitCode=0 Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.439459 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfqg8" event={"ID":"5c654206-f2d0-4b40-9df0-577dbf27e5e4","Type":"ContainerDied","Data":"ae8bc998d14d3fe5b46f631e4cd6f287af277e6334648d3823ae6a448b5c6c06"} Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.439477 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfqg8" event={"ID":"5c654206-f2d0-4b40-9df0-577dbf27e5e4","Type":"ContainerStarted","Data":"673ebc55d1a6b447cd1eff3908534a0236e7f9bed8a79855c901abb60e30a35e"} Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.466202 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp8sg" event={"ID":"3146bc9a-c4fc-4aa1-acae-032db4aa0582","Type":"ContainerStarted","Data":"8f98c34b7848371876e8226d32a5b9a72fad3efa01f7370b3de8b257667df91e"} Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.489404 4810 generic.go:334] "Generic (PLEG): container finished" podID="9bf4faab-dab6-498d-bf73-7a740448c64b" containerID="24c3c0e67f6cb8d737307870275884530deb2d7f9328c8dbf07a497146ef635e" exitCode=0 Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.489846 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9bf4faab-dab6-498d-bf73-7a740448c64b","Type":"ContainerDied","Data":"24c3c0e67f6cb8d737307870275884530deb2d7f9328c8dbf07a497146ef635e"} Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.513298 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" event={"ID":"a6cb5092-5f01-4dd9-a940-804d88907744","Type":"ContainerStarted","Data":"5251301af1540af7d5ebbbedf95175c9c2ed8c2deb1460dfe0ea2da0c4971799"} Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.513368 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" event={"ID":"a6cb5092-5f01-4dd9-a940-804d88907744","Type":"ContainerStarted","Data":"958cccf7fb34847e1b79d6753f3c0b3ac89a306a0da6aa21a70c7c233870987b"} Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.515233 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.527143 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6zwc" Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.560098 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" podStartSLOduration=131.560078886 podStartE2EDuration="2m11.560078886s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:12:04.555451958 +0000 UTC m=+154.037482082" watchObservedRunningTime="2026-02-19 15:12:04.560078886 +0000 UTC m=+154.042109000"
Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.802494 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v87ss"]
Feb 19 15:12:04 crc kubenswrapper[4810]: W0219 15:12:04.853634 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee54de34_1c90_401d_8102_2cc1e4116661.slice/crio-e9e0d068483405f5413181ac3893890a639ed339e87615e73aeb8a60ceb19c52 WatchSource:0}: Error finding container e9e0d068483405f5413181ac3893890a639ed339e87615e73aeb8a60ceb19c52: Status 404 returned error can't find the container with id e9e0d068483405f5413181ac3893890a639ed339e87615e73aeb8a60ceb19c52
Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.920084 4810 patch_prober.go:28] interesting pod/router-default-5444994796-hvw7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 19 15:12:04 crc kubenswrapper[4810]: [-]has-synced failed: reason withheld
Feb 19 15:12:04 crc kubenswrapper[4810]: [+]process-running ok
Feb 19 15:12:04 crc kubenswrapper[4810]: healthz check failed
Feb 19 15:12:04 crc kubenswrapper[4810]: I0219 15:12:04.920175 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hvw7f" podUID="76ffdcba-57d6-4636-8373-f088926a716d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 19 15:12:05 crc kubenswrapper[4810]: I0219 15:12:05.532507 4810 generic.go:334] "Generic (PLEG): container finished" podID="3146bc9a-c4fc-4aa1-acae-032db4aa0582" containerID="db4a068bb20ce6903e18758cc6f38e8bff29bd023c7f06ec3db18c434439f0c7" exitCode=0
Feb 19 15:12:05 crc kubenswrapper[4810]: I0219 15:12:05.532721 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp8sg" event={"ID":"3146bc9a-c4fc-4aa1-acae-032db4aa0582","Type":"ContainerDied","Data":"db4a068bb20ce6903e18758cc6f38e8bff29bd023c7f06ec3db18c434439f0c7"}
Feb 19 15:12:05 crc kubenswrapper[4810]: I0219 15:12:05.540039 4810 generic.go:334] "Generic (PLEG): container finished" podID="ee54de34-1c90-401d-8102-2cc1e4116661" containerID="14425fa062e8aa4022f788165f140217518c2e7e2510e6d081b098762154d7a3" exitCode=0
Feb 19 15:12:05 crc kubenswrapper[4810]: I0219 15:12:05.540954 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v87ss" event={"ID":"ee54de34-1c90-401d-8102-2cc1e4116661","Type":"ContainerDied","Data":"14425fa062e8aa4022f788165f140217518c2e7e2510e6d081b098762154d7a3"}
Feb 19 15:12:05 crc kubenswrapper[4810]: I0219 15:12:05.540979 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v87ss" event={"ID":"ee54de34-1c90-401d-8102-2cc1e4116661","Type":"ContainerStarted","Data":"e9e0d068483405f5413181ac3893890a639ed339e87615e73aeb8a60ceb19c52"}
Feb 19 15:12:05 crc kubenswrapper[4810]: I0219 15:12:05.942884 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-hvw7f"
Feb 19 15:12:05 crc kubenswrapper[4810]: I0219 15:12:05.954515 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-hvw7f"
Feb 19 15:12:06 crc kubenswrapper[4810]: I0219 15:12:06.234777 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 19 15:12:06 crc kubenswrapper[4810]: I0219 15:12:06.372462 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9bf4faab-dab6-498d-bf73-7a740448c64b-kubelet-dir\") pod \"9bf4faab-dab6-498d-bf73-7a740448c64b\" (UID: \"9bf4faab-dab6-498d-bf73-7a740448c64b\") "
Feb 19 15:12:06 crc kubenswrapper[4810]: I0219 15:12:06.372543 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9bf4faab-dab6-498d-bf73-7a740448c64b-kube-api-access\") pod \"9bf4faab-dab6-498d-bf73-7a740448c64b\" (UID: \"9bf4faab-dab6-498d-bf73-7a740448c64b\") "
Feb 19 15:12:06 crc kubenswrapper[4810]: I0219 15:12:06.372609 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9bf4faab-dab6-498d-bf73-7a740448c64b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9bf4faab-dab6-498d-bf73-7a740448c64b" (UID: "9bf4faab-dab6-498d-bf73-7a740448c64b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 15:12:06 crc kubenswrapper[4810]: I0219 15:12:06.372870 4810 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9bf4faab-dab6-498d-bf73-7a740448c64b-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:06 crc kubenswrapper[4810]: I0219 15:12:06.388534 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bf4faab-dab6-498d-bf73-7a740448c64b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9bf4faab-dab6-498d-bf73-7a740448c64b" (UID: "9bf4faab-dab6-498d-bf73-7a740448c64b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:12:06 crc kubenswrapper[4810]: I0219 15:12:06.474881 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9bf4faab-dab6-498d-bf73-7a740448c64b-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:06 crc kubenswrapper[4810]: I0219 15:12:06.562899 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9bf4faab-dab6-498d-bf73-7a740448c64b","Type":"ContainerDied","Data":"b81cbe17b0c85b33afea4e73bfe0c252ec94240f8fe691abb4c277e1104d95b3"}
Feb 19 15:12:06 crc kubenswrapper[4810]: I0219 15:12:06.562986 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 19 15:12:06 crc kubenswrapper[4810]: I0219 15:12:06.562998 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b81cbe17b0c85b33afea4e73bfe0c252ec94240f8fe691abb4c277e1104d95b3"
Feb 19 15:12:07 crc kubenswrapper[4810]: I0219 15:12:07.023707 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 19 15:12:07 crc kubenswrapper[4810]: E0219 15:12:07.024072 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bf4faab-dab6-498d-bf73-7a740448c64b" containerName="pruner"
Feb 19 15:12:07 crc kubenswrapper[4810]: I0219 15:12:07.024094 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bf4faab-dab6-498d-bf73-7a740448c64b" containerName="pruner"
Feb 19 15:12:07 crc kubenswrapper[4810]: I0219 15:12:07.024242 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bf4faab-dab6-498d-bf73-7a740448c64b" containerName="pruner"
Feb 19 15:12:07 crc kubenswrapper[4810]: I0219 15:12:07.025202 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 19 15:12:07 crc kubenswrapper[4810]: I0219 15:12:07.027594 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 19 15:12:07 crc kubenswrapper[4810]: I0219 15:12:07.028069 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 19 15:12:07 crc kubenswrapper[4810]: I0219 15:12:07.041466 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 19 15:12:07 crc kubenswrapper[4810]: I0219 15:12:07.188354 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d2dbc717-bc2f-458b-9fbd-0082602a0e6d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d2dbc717-bc2f-458b-9fbd-0082602a0e6d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 19 15:12:07 crc kubenswrapper[4810]: I0219 15:12:07.188497 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d2dbc717-bc2f-458b-9fbd-0082602a0e6d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d2dbc717-bc2f-458b-9fbd-0082602a0e6d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 19 15:12:07 crc kubenswrapper[4810]: I0219 15:12:07.290279 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d2dbc717-bc2f-458b-9fbd-0082602a0e6d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d2dbc717-bc2f-458b-9fbd-0082602a0e6d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 19 15:12:07 crc kubenswrapper[4810]: I0219 15:12:07.290455 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d2dbc717-bc2f-458b-9fbd-0082602a0e6d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d2dbc717-bc2f-458b-9fbd-0082602a0e6d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 19 15:12:07 crc kubenswrapper[4810]: I0219 15:12:07.290565 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d2dbc717-bc2f-458b-9fbd-0082602a0e6d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d2dbc717-bc2f-458b-9fbd-0082602a0e6d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 19 15:12:07 crc kubenswrapper[4810]: I0219 15:12:07.330518 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d2dbc717-bc2f-458b-9fbd-0082602a0e6d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d2dbc717-bc2f-458b-9fbd-0082602a0e6d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 19 15:12:07 crc kubenswrapper[4810]: I0219 15:12:07.375795 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 19 15:12:07 crc kubenswrapper[4810]: I0219 15:12:07.830844 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 19 15:12:08 crc kubenswrapper[4810]: I0219 15:12:08.653476 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d2dbc717-bc2f-458b-9fbd-0082602a0e6d","Type":"ContainerStarted","Data":"779d684e378a82c58f3e61eadda27c4746a790326de1db0560d46b865a873e75"}
Feb 19 15:12:09 crc kubenswrapper[4810]: I0219 15:12:09.413061 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zgcrd"
Feb 19 15:12:09 crc kubenswrapper[4810]: I0219 15:12:09.686064 4810 generic.go:334] "Generic (PLEG): container finished" podID="d2dbc717-bc2f-458b-9fbd-0082602a0e6d" containerID="df11bfb2abf6b4d6666c47f84367d2443ac85c6c12fe6f9c1ef0751d1d9da3bb" exitCode=0
Feb 19 15:12:09 crc kubenswrapper[4810]: I0219 15:12:09.686138 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d2dbc717-bc2f-458b-9fbd-0082602a0e6d","Type":"ContainerDied","Data":"df11bfb2abf6b4d6666c47f84367d2443ac85c6c12fe6f9c1ef0751d1d9da3bb"}
Feb 19 15:12:11 crc kubenswrapper[4810]: I0219 15:12:11.107042 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 19 15:12:11 crc kubenswrapper[4810]: I0219 15:12:11.194906 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d2dbc717-bc2f-458b-9fbd-0082602a0e6d-kube-api-access\") pod \"d2dbc717-bc2f-458b-9fbd-0082602a0e6d\" (UID: \"d2dbc717-bc2f-458b-9fbd-0082602a0e6d\") "
Feb 19 15:12:11 crc kubenswrapper[4810]: I0219 15:12:11.194976 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d2dbc717-bc2f-458b-9fbd-0082602a0e6d-kubelet-dir\") pod \"d2dbc717-bc2f-458b-9fbd-0082602a0e6d\" (UID: \"d2dbc717-bc2f-458b-9fbd-0082602a0e6d\") "
Feb 19 15:12:11 crc kubenswrapper[4810]: I0219 15:12:11.195384 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2dbc717-bc2f-458b-9fbd-0082602a0e6d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d2dbc717-bc2f-458b-9fbd-0082602a0e6d" (UID: "d2dbc717-bc2f-458b-9fbd-0082602a0e6d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 15:12:11 crc kubenswrapper[4810]: I0219 15:12:11.215621 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2dbc717-bc2f-458b-9fbd-0082602a0e6d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d2dbc717-bc2f-458b-9fbd-0082602a0e6d" (UID: "d2dbc717-bc2f-458b-9fbd-0082602a0e6d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:12:11 crc kubenswrapper[4810]: I0219 15:12:11.301372 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d2dbc717-bc2f-458b-9fbd-0082602a0e6d-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:11 crc kubenswrapper[4810]: I0219 15:12:11.301420 4810 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d2dbc717-bc2f-458b-9fbd-0082602a0e6d-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:11 crc kubenswrapper[4810]: I0219 15:12:11.731824 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d2dbc717-bc2f-458b-9fbd-0082602a0e6d","Type":"ContainerDied","Data":"779d684e378a82c58f3e61eadda27c4746a790326de1db0560d46b865a873e75"}
Feb 19 15:12:11 crc kubenswrapper[4810]: I0219 15:12:11.732230 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="779d684e378a82c58f3e61eadda27c4746a790326de1db0560d46b865a873e75"
Feb 19 15:12:11 crc kubenswrapper[4810]: I0219 15:12:11.731890 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 19 15:12:13 crc kubenswrapper[4810]: I0219 15:12:13.586465 4810 patch_prober.go:28] interesting pod/downloads-7954f5f757-dkppn container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body=
Feb 19 15:12:13 crc kubenswrapper[4810]: I0219 15:12:13.586985 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dkppn" podUID="7a29951a-027e-49b4-a7ea-a8e363942414" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused"
Feb 19 15:12:13 crc kubenswrapper[4810]: I0219 15:12:13.586465 4810 patch_prober.go:28] interesting pod/downloads-7954f5f757-dkppn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body=
Feb 19 15:12:13 crc kubenswrapper[4810]: I0219 15:12:13.587147 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dkppn" podUID="7a29951a-027e-49b4-a7ea-a8e363942414" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused"
Feb 19 15:12:13 crc kubenswrapper[4810]: I0219 15:12:13.650299 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-4hddt"
Feb 19 15:12:13 crc kubenswrapper[4810]: I0219 15:12:13.659696 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-4hddt"
Feb 19 15:12:15 crc kubenswrapper[4810]: I0219 15:12:15.573801 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b72d3f7a-e418-4a21-af73-6a43ce3358c1-metrics-certs\") pod \"network-metrics-daemon-2x9v9\" (UID: \"b72d3f7a-e418-4a21-af73-6a43ce3358c1\") " pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:12:15 crc kubenswrapper[4810]: I0219 15:12:15.596472 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b72d3f7a-e418-4a21-af73-6a43ce3358c1-metrics-certs\") pod \"network-metrics-daemon-2x9v9\" (UID: \"b72d3f7a-e418-4a21-af73-6a43ce3358c1\") " pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:12:15 crc kubenswrapper[4810]: I0219 15:12:15.764810 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x9v9"
Feb 19 15:12:16 crc kubenswrapper[4810]: I0219 15:12:16.864689 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sl5p9"]
Feb 19 15:12:16 crc kubenswrapper[4810]: I0219 15:12:16.865854 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9" podUID="6f0a78b8-77b3-47dd-9dbe-e578b0374cf6" containerName="controller-manager" containerID="cri-o://dc986c76de8632d646d80420e0f140b9a738f5647ff326454ca42eca27431ca5" gracePeriod=30
Feb 19 15:12:16 crc kubenswrapper[4810]: I0219 15:12:16.903947 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth"]
Feb 19 15:12:16 crc kubenswrapper[4810]: I0219 15:12:16.906658 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth" podUID="ad43df2c-4944-45e2-919f-0c297f4092d4" containerName="route-controller-manager" containerID="cri-o://b43634a16e9cdfb66105b0eff470452a3406e103e0c90e02477b5f9de0072e03" gracePeriod=30
Feb 19 15:12:18 crc kubenswrapper[4810]: I0219 15:12:18.813173 4810 generic.go:334] "Generic (PLEG): container finished" podID="6f0a78b8-77b3-47dd-9dbe-e578b0374cf6" containerID="dc986c76de8632d646d80420e0f140b9a738f5647ff326454ca42eca27431ca5" exitCode=0
Feb 19 15:12:18 crc kubenswrapper[4810]: I0219 15:12:18.813300 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9" event={"ID":"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6","Type":"ContainerDied","Data":"dc986c76de8632d646d80420e0f140b9a738f5647ff326454ca42eca27431ca5"}
Feb 19 15:12:19 crc kubenswrapper[4810]: I0219 15:12:19.537787 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 15:12:19 crc kubenswrapper[4810]: I0219 15:12:19.537881 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 15:12:23 crc kubenswrapper[4810]: I0219 15:12:23.358672 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2"
Feb 19 15:12:23 crc kubenswrapper[4810]: I0219 15:12:23.547249 4810 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-n8zth container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body=
Feb 19 15:12:23 crc kubenswrapper[4810]: I0219 15:12:23.547771 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth" podUID="ad43df2c-4944-45e2-919f-0c297f4092d4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused"
Feb 19 15:12:23 crc kubenswrapper[4810]: I0219 15:12:23.587541 4810 patch_prober.go:28] interesting pod/downloads-7954f5f757-dkppn container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body=
Feb 19 15:12:23 crc kubenswrapper[4810]: I0219 15:12:23.587605 4810 patch_prober.go:28] interesting pod/downloads-7954f5f757-dkppn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body=
Feb 19 15:12:23 crc kubenswrapper[4810]: I0219 15:12:23.587662 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dkppn" podUID="7a29951a-027e-49b4-a7ea-a8e363942414" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused"
Feb 19 15:12:23 crc kubenswrapper[4810]: I0219 15:12:23.587686 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dkppn" podUID="7a29951a-027e-49b4-a7ea-a8e363942414" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused"
Feb 19 15:12:23 crc kubenswrapper[4810]: I0219 15:12:23.587774 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-dkppn"
Feb 19 15:12:23 crc kubenswrapper[4810]: I0219 15:12:23.588566 4810 patch_prober.go:28] interesting pod/downloads-7954f5f757-dkppn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body=
Feb 19 15:12:23 crc kubenswrapper[4810]: I0219 15:12:23.588655 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dkppn" podUID="7a29951a-027e-49b4-a7ea-a8e363942414" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused"
Feb 19 15:12:23 crc kubenswrapper[4810]: I0219 15:12:23.589008 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"248f3396a274a7c02cca57cafc68effaca4124e54f5008022c7525d6c545f088"} pod="openshift-console/downloads-7954f5f757-dkppn" containerMessage="Container download-server failed liveness probe, will be restarted"
Feb 19 15:12:23 crc kubenswrapper[4810]: I0219 15:12:23.589225 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-dkppn" podUID="7a29951a-027e-49b4-a7ea-a8e363942414" containerName="download-server" containerID="cri-o://248f3396a274a7c02cca57cafc68effaca4124e54f5008022c7525d6c545f088" gracePeriod=2
Feb 19 15:12:23 crc kubenswrapper[4810]: I0219 15:12:23.646756 4810 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-sl5p9 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Feb 19 15:12:23 crc kubenswrapper[4810]: I0219 15:12:23.646833 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9" podUID="6f0a78b8-77b3-47dd-9dbe-e578b0374cf6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused"
Feb 19 15:12:28 crc kubenswrapper[4810]: I0219 15:12:28.896963 4810 generic.go:334] "Generic (PLEG): container finished" podID="ad43df2c-4944-45e2-919f-0c297f4092d4" containerID="b43634a16e9cdfb66105b0eff470452a3406e103e0c90e02477b5f9de0072e03" exitCode=0
Feb 19 15:12:28 crc kubenswrapper[4810]: I0219 15:12:28.897193 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth" event={"ID":"ad43df2c-4944-45e2-919f-0c297f4092d4","Type":"ContainerDied","Data":"b43634a16e9cdfb66105b0eff470452a3406e103e0c90e02477b5f9de0072e03"}
Feb 19 15:12:29 crc kubenswrapper[4810]: I0219 15:12:29.908381 4810 generic.go:334] "Generic (PLEG): container finished" podID="7a29951a-027e-49b4-a7ea-a8e363942414" containerID="248f3396a274a7c02cca57cafc68effaca4124e54f5008022c7525d6c545f088" exitCode=0
Feb 19 15:12:29 crc kubenswrapper[4810]: I0219 15:12:29.908710 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dkppn" event={"ID":"7a29951a-027e-49b4-a7ea-a8e363942414","Type":"ContainerDied","Data":"248f3396a274a7c02cca57cafc68effaca4124e54f5008022c7525d6c545f088"}
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.704262 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth"
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.716035 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9"
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.744903 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl"]
Feb 19 15:12:31 crc kubenswrapper[4810]: E0219 15:12:31.745283 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad43df2c-4944-45e2-919f-0c297f4092d4" containerName="route-controller-manager"
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.745305 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad43df2c-4944-45e2-919f-0c297f4092d4" containerName="route-controller-manager"
Feb 19 15:12:31 crc kubenswrapper[4810]: E0219 15:12:31.745364 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f0a78b8-77b3-47dd-9dbe-e578b0374cf6" containerName="controller-manager"
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.745379 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f0a78b8-77b3-47dd-9dbe-e578b0374cf6" containerName="controller-manager"
Feb 19 15:12:31 crc kubenswrapper[4810]: E0219 15:12:31.745401 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2dbc717-bc2f-458b-9fbd-0082602a0e6d" containerName="pruner"
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.745415 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2dbc717-bc2f-458b-9fbd-0082602a0e6d" containerName="pruner"
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.745606 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2dbc717-bc2f-458b-9fbd-0082602a0e6d" containerName="pruner"
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.745630 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad43df2c-4944-45e2-919f-0c297f4092d4" containerName="route-controller-manager"
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.745655 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f0a78b8-77b3-47dd-9dbe-e578b0374cf6" containerName="controller-manager"
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.746371 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl"
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.747722 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l478x\" (UniqueName: \"kubernetes.io/projected/ad43df2c-4944-45e2-919f-0c297f4092d4-kube-api-access-l478x\") pod \"ad43df2c-4944-45e2-919f-0c297f4092d4\" (UID: \"ad43df2c-4944-45e2-919f-0c297f4092d4\") "
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.747778 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad43df2c-4944-45e2-919f-0c297f4092d4-config\") pod \"ad43df2c-4944-45e2-919f-0c297f4092d4\" (UID: \"ad43df2c-4944-45e2-919f-0c297f4092d4\") "
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.747957 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-proxy-ca-bundles\") pod \"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6\" (UID: \"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6\") "
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.748114 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-serving-cert\") pod \"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6\" (UID: \"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6\") "
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.748194 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad43df2c-4944-45e2-919f-0c297f4092d4-serving-cert\") pod \"ad43df2c-4944-45e2-919f-0c297f4092d4\" (UID: \"ad43df2c-4944-45e2-919f-0c297f4092d4\") "
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.748234 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-client-ca\") pod \"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6\" (UID: \"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6\") "
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.748300 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k7k5\" (UniqueName: \"kubernetes.io/projected/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-kube-api-access-8k7k5\") pod \"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6\" (UID: \"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6\") "
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.748371 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad43df2c-4944-45e2-919f-0c297f4092d4-client-ca\") pod \"ad43df2c-4944-45e2-919f-0c297f4092d4\" (UID: \"ad43df2c-4944-45e2-919f-0c297f4092d4\") "
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.748462 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-config\") pod \"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6\" (UID: \"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6\") "
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.750036 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-config" (OuterVolumeSpecName: "config") pod "6f0a78b8-77b3-47dd-9dbe-e578b0374cf6" (UID: "6f0a78b8-77b3-47dd-9dbe-e578b0374cf6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.756516 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6f0a78b8-77b3-47dd-9dbe-e578b0374cf6" (UID: "6f0a78b8-77b3-47dd-9dbe-e578b0374cf6"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.757154 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad43df2c-4944-45e2-919f-0c297f4092d4-config" (OuterVolumeSpecName: "config") pod "ad43df2c-4944-45e2-919f-0c297f4092d4" (UID: "ad43df2c-4944-45e2-919f-0c297f4092d4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.758103 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad43df2c-4944-45e2-919f-0c297f4092d4-client-ca" (OuterVolumeSpecName: "client-ca") pod "ad43df2c-4944-45e2-919f-0c297f4092d4" (UID: "ad43df2c-4944-45e2-919f-0c297f4092d4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.758133 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-client-ca" (OuterVolumeSpecName: "client-ca") pod "6f0a78b8-77b3-47dd-9dbe-e578b0374cf6" (UID: "6f0a78b8-77b3-47dd-9dbe-e578b0374cf6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.759348 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl"]
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.760920 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad43df2c-4944-45e2-919f-0c297f4092d4-kube-api-access-l478x" (OuterVolumeSpecName: "kube-api-access-l478x") pod "ad43df2c-4944-45e2-919f-0c297f4092d4" (UID: "ad43df2c-4944-45e2-919f-0c297f4092d4"). InnerVolumeSpecName "kube-api-access-l478x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.762045 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad43df2c-4944-45e2-919f-0c297f4092d4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ad43df2c-4944-45e2-919f-0c297f4092d4" (UID: "ad43df2c-4944-45e2-919f-0c297f4092d4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.779357 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6f0a78b8-77b3-47dd-9dbe-e578b0374cf6" (UID: "6f0a78b8-77b3-47dd-9dbe-e578b0374cf6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.786160 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-kube-api-access-8k7k5" (OuterVolumeSpecName: "kube-api-access-8k7k5") pod "6f0a78b8-77b3-47dd-9dbe-e578b0374cf6" (UID: "6f0a78b8-77b3-47dd-9dbe-e578b0374cf6"). InnerVolumeSpecName "kube-api-access-8k7k5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.849704 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a9911b7-9775-45a3-8eba-2418c6a8c7da-client-ca\") pod \"route-controller-manager-58bf9f9fdf-sfcvl\" (UID: \"3a9911b7-9775-45a3-8eba-2418c6a8c7da\") " pod="openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl"
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.849772 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvn9b\" (UniqueName: \"kubernetes.io/projected/3a9911b7-9775-45a3-8eba-2418c6a8c7da-kube-api-access-dvn9b\") pod \"route-controller-manager-58bf9f9fdf-sfcvl\" (UID: \"3a9911b7-9775-45a3-8eba-2418c6a8c7da\") " pod="openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl"
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.849820 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a9911b7-9775-45a3-8eba-2418c6a8c7da-serving-cert\") pod \"route-controller-manager-58bf9f9fdf-sfcvl\" (UID: \"3a9911b7-9775-45a3-8eba-2418c6a8c7da\") " pod="openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl"
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.849910 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a9911b7-9775-45a3-8eba-2418c6a8c7da-config\") pod \"route-controller-manager-58bf9f9fdf-sfcvl\" (UID: \"3a9911b7-9775-45a3-8eba-2418c6a8c7da\") " pod="openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl"
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.849957 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l478x\" (UniqueName: \"kubernetes.io/projected/ad43df2c-4944-45e2-919f-0c297f4092d4-kube-api-access-l478x\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.849969 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad43df2c-4944-45e2-919f-0c297f4092d4-config\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.849980 4810 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.849989 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.850030 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad43df2c-4944-45e2-919f-0c297f4092d4-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.850039 4810 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-client-ca\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.850048 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k7k5\" (UniqueName: \"kubernetes.io/projected/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-kube-api-access-8k7k5\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.850056 4810 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad43df2c-4944-45e2-919f-0c297f4092d4-client-ca\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.850064 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6-config\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.924417 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9" event={"ID":"6f0a78b8-77b3-47dd-9dbe-e578b0374cf6","Type":"ContainerDied","Data":"918504ecdc8f526f00360e8dd63cb3df985da70843c55db6ce3d06ae23d89251"}
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.924460 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sl5p9"
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.924501 4810 scope.go:117] "RemoveContainer" containerID="dc986c76de8632d646d80420e0f140b9a738f5647ff326454ca42eca27431ca5"
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.930154 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth" event={"ID":"ad43df2c-4944-45e2-919f-0c297f4092d4","Type":"ContainerDied","Data":"8973ecaad02b7aeec4f31a47d961686dc669236fd8c026776f4e494af608cf1b"}
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.930313 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth"
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.950857 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvn9b\" (UniqueName: \"kubernetes.io/projected/3a9911b7-9775-45a3-8eba-2418c6a8c7da-kube-api-access-dvn9b\") pod \"route-controller-manager-58bf9f9fdf-sfcvl\" (UID: \"3a9911b7-9775-45a3-8eba-2418c6a8c7da\") " pod="openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl"
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.950915 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a9911b7-9775-45a3-8eba-2418c6a8c7da-serving-cert\") pod \"route-controller-manager-58bf9f9fdf-sfcvl\" (UID: \"3a9911b7-9775-45a3-8eba-2418c6a8c7da\") " pod="openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl"
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.950981 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a9911b7-9775-45a3-8eba-2418c6a8c7da-config\") pod \"route-controller-manager-58bf9f9fdf-sfcvl\" (UID: \"3a9911b7-9775-45a3-8eba-2418c6a8c7da\") " pod="openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl"
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.951029 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a9911b7-9775-45a3-8eba-2418c6a8c7da-client-ca\") pod \"route-controller-manager-58bf9f9fdf-sfcvl\" (UID: \"3a9911b7-9775-45a3-8eba-2418c6a8c7da\") " pod="openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl"
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.952116 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a9911b7-9775-45a3-8eba-2418c6a8c7da-client-ca\") pod \"route-controller-manager-58bf9f9fdf-sfcvl\" (UID: \"3a9911b7-9775-45a3-8eba-2418c6a8c7da\") " pod="openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl"
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.952367 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a9911b7-9775-45a3-8eba-2418c6a8c7da-config\") pod \"route-controller-manager-58bf9f9fdf-sfcvl\" (UID: \"3a9911b7-9775-45a3-8eba-2418c6a8c7da\") " pod="openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl"
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.961179 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a9911b7-9775-45a3-8eba-2418c6a8c7da-serving-cert\") pod \"route-controller-manager-58bf9f9fdf-sfcvl\" (UID: \"3a9911b7-9775-45a3-8eba-2418c6a8c7da\") " pod="openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl"
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.961830 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sl5p9"]
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.972577 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sl5p9"]
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.972853 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvn9b\" (UniqueName: \"kubernetes.io/projected/3a9911b7-9775-45a3-8eba-2418c6a8c7da-kube-api-access-dvn9b\") pod \"route-controller-manager-58bf9f9fdf-sfcvl\" (UID: \"3a9911b7-9775-45a3-8eba-2418c6a8c7da\") " pod="openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl"
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.976432 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth"]
Feb 19 15:12:31 crc kubenswrapper[4810]: I0219 15:12:31.979516 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8zth"]
Feb 19 15:12:32 crc kubenswrapper[4810]: I0219 15:12:32.112545 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl"
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.447634 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f0a78b8-77b3-47dd-9dbe-e578b0374cf6" path="/var/lib/kubelet/pods/6f0a78b8-77b3-47dd-9dbe-e578b0374cf6/volumes"
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.448615 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad43df2c-4944-45e2-919f-0c297f4092d4" path="/var/lib/kubelet/pods/ad43df2c-4944-45e2-919f-0c297f4092d4/volumes"
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.586341 4810 patch_prober.go:28] interesting pod/downloads-7954f5f757-dkppn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body=
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.586399 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dkppn" podUID="7a29951a-027e-49b4-a7ea-a8e363942414" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused"
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.815540 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-bf8444975-mbpl8"]
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.816594 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bf8444975-mbpl8"
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.820878 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.821148 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.821501 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.821853 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.823978 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.825659 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.834271 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bf8444975-mbpl8"]
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.841715 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.874907 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptwvr\" (UniqueName: \"kubernetes.io/projected/f3507810-2d89-4c4c-bb78-f824d27a6b33-kube-api-access-ptwvr\") pod \"controller-manager-bf8444975-mbpl8\" (UID: \"f3507810-2d89-4c4c-bb78-f824d27a6b33\") " pod="openshift-controller-manager/controller-manager-bf8444975-mbpl8"
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.874983 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f3507810-2d89-4c4c-bb78-f824d27a6b33-proxy-ca-bundles\") pod \"controller-manager-bf8444975-mbpl8\" (UID: \"f3507810-2d89-4c4c-bb78-f824d27a6b33\") " pod="openshift-controller-manager/controller-manager-bf8444975-mbpl8"
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.875064 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3507810-2d89-4c4c-bb78-f824d27a6b33-config\") pod \"controller-manager-bf8444975-mbpl8\" (UID: \"f3507810-2d89-4c4c-bb78-f824d27a6b33\") " pod="openshift-controller-manager/controller-manager-bf8444975-mbpl8"
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.875116 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3507810-2d89-4c4c-bb78-f824d27a6b33-serving-cert\") pod \"controller-manager-bf8444975-mbpl8\" (UID: \"f3507810-2d89-4c4c-bb78-f824d27a6b33\") " pod="openshift-controller-manager/controller-manager-bf8444975-mbpl8"
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.875150 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f3507810-2d89-4c4c-bb78-f824d27a6b33-client-ca\") pod \"controller-manager-bf8444975-mbpl8\" (UID: \"f3507810-2d89-4c4c-bb78-f824d27a6b33\") " pod="openshift-controller-manager/controller-manager-bf8444975-mbpl8"
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.919257 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2nn8b"
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.976084 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3507810-2d89-4c4c-bb78-f824d27a6b33-config\") pod \"controller-manager-bf8444975-mbpl8\" (UID: \"f3507810-2d89-4c4c-bb78-f824d27a6b33\") " pod="openshift-controller-manager/controller-manager-bf8444975-mbpl8"
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.976158 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3507810-2d89-4c4c-bb78-f824d27a6b33-serving-cert\") pod \"controller-manager-bf8444975-mbpl8\" (UID: \"f3507810-2d89-4c4c-bb78-f824d27a6b33\") " pod="openshift-controller-manager/controller-manager-bf8444975-mbpl8"
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.976185 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f3507810-2d89-4c4c-bb78-f824d27a6b33-client-ca\") pod \"controller-manager-bf8444975-mbpl8\" (UID: \"f3507810-2d89-4c4c-bb78-f824d27a6b33\") " pod="openshift-controller-manager/controller-manager-bf8444975-mbpl8"
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.976254 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptwvr\" (UniqueName: \"kubernetes.io/projected/f3507810-2d89-4c4c-bb78-f824d27a6b33-kube-api-access-ptwvr\") pod \"controller-manager-bf8444975-mbpl8\" (UID: \"f3507810-2d89-4c4c-bb78-f824d27a6b33\") " pod="openshift-controller-manager/controller-manager-bf8444975-mbpl8"
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.976281 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f3507810-2d89-4c4c-bb78-f824d27a6b33-proxy-ca-bundles\") pod \"controller-manager-bf8444975-mbpl8\" (UID: \"f3507810-2d89-4c4c-bb78-f824d27a6b33\") " pod="openshift-controller-manager/controller-manager-bf8444975-mbpl8"
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.978614 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f3507810-2d89-4c4c-bb78-f824d27a6b33-proxy-ca-bundles\") pod \"controller-manager-bf8444975-mbpl8\" (UID: \"f3507810-2d89-4c4c-bb78-f824d27a6b33\") " pod="openshift-controller-manager/controller-manager-bf8444975-mbpl8"
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.978720 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3507810-2d89-4c4c-bb78-f824d27a6b33-config\") pod \"controller-manager-bf8444975-mbpl8\" (UID: \"f3507810-2d89-4c4c-bb78-f824d27a6b33\") " pod="openshift-controller-manager/controller-manager-bf8444975-mbpl8"
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.979299 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f3507810-2d89-4c4c-bb78-f824d27a6b33-client-ca\") pod \"controller-manager-bf8444975-mbpl8\" (UID: \"f3507810-2d89-4c4c-bb78-f824d27a6b33\") " pod="openshift-controller-manager/controller-manager-bf8444975-mbpl8"
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.985775 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3507810-2d89-4c4c-bb78-f824d27a6b33-serving-cert\") pod \"controller-manager-bf8444975-mbpl8\" (UID: \"f3507810-2d89-4c4c-bb78-f824d27a6b33\") " pod="openshift-controller-manager/controller-manager-bf8444975-mbpl8"
Feb 19 15:12:33 crc kubenswrapper[4810]: I0219 15:12:33.998076 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptwvr\" (UniqueName: \"kubernetes.io/projected/f3507810-2d89-4c4c-bb78-f824d27a6b33-kube-api-access-ptwvr\") pod \"controller-manager-bf8444975-mbpl8\" (UID: \"f3507810-2d89-4c4c-bb78-f824d27a6b33\") " pod="openshift-controller-manager/controller-manager-bf8444975-mbpl8"
Feb 19 15:12:34 crc kubenswrapper[4810]: I0219 15:12:34.154531 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bf8444975-mbpl8"
Feb 19 15:12:36 crc kubenswrapper[4810]: I0219 15:12:36.843770 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-bf8444975-mbpl8"]
Feb 19 15:12:36 crc kubenswrapper[4810]: I0219 15:12:36.935364 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl"]
Feb 19 15:12:39 crc kubenswrapper[4810]: I0219 15:12:39.532408 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 15:12:39 crc kubenswrapper[4810]: E0219 15:12:39.648591 4810 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Feb 19 15:12:39 crc kubenswrapper[4810]: E0219 15:12:39.648783 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cqz6g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-d5ks5_openshift-marketplace(9a3d6b1f-2011-4f7f-bea0-1d303007fe41): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 19 15:12:39 crc kubenswrapper[4810]: E0219 15:12:39.649957 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-d5ks5" podUID="9a3d6b1f-2011-4f7f-bea0-1d303007fe41"
Feb 19 15:12:39 crc kubenswrapper[4810]: E0219 15:12:39.681869 4810 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Feb 19 15:12:39 crc kubenswrapper[4810]: E0219 15:12:39.682083 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4xzxn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-x8sn2_openshift-marketplace(cc8ce195-1fe1-4684-8172-e710b3552fb5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 19 15:12:39 crc kubenswrapper[4810]: E0219 15:12:39.683809 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-x8sn2" podUID="cc8ce195-1fe1-4684-8172-e710b3552fb5"
Feb 19 15:12:41 crc kubenswrapper[4810]: E0219 15:12:40.999654 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-x8sn2" podUID="cc8ce195-1fe1-4684-8172-e710b3552fb5"
Feb 19 15:12:41 crc kubenswrapper[4810]: E0219 15:12:41.000754 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-d5ks5" podUID="9a3d6b1f-2011-4f7f-bea0-1d303007fe41"
Feb 19 15:12:41 crc kubenswrapper[4810]: I0219 15:12:41.019585 4810 scope.go:117] "RemoveContainer" containerID="b43634a16e9cdfb66105b0eff470452a3406e103e0c90e02477b5f9de0072e03"
Feb 19 15:12:41 crc kubenswrapper[4810]: E0219 15:12:41.062647 4810 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Feb 19 15:12:41 crc kubenswrapper[4810]: E0219 15:12:41.063141 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r4nld,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-dfqg8_openshift-marketplace(5c654206-f2d0-4b40-9df0-577dbf27e5e4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 19 15:12:41 crc kubenswrapper[4810]: E0219 15:12:41.064384 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-dfqg8" podUID="5c654206-f2d0-4b40-9df0-577dbf27e5e4"
Feb 19 15:12:41 crc kubenswrapper[4810]: E0219 15:12:41.148465 4810 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Feb 19 15:12:41 crc kubenswrapper[4810]: E0219 15:12:41.148661 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gbxf5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-ptbh9_openshift-marketplace(7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 19 15:12:41 crc kubenswrapper[4810]: E0219 15:12:41.149899 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-ptbh9" podUID="7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53"
Feb 19 15:12:41 crc kubenswrapper[4810]: E0219 15:12:41.211573 4810 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Feb 19 15:12:41 crc kubenswrapper[4810]: E0219 15:12:41.211712 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-thvr6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-blpmq_openshift-marketplace(4127fef2-ef2b-4cc4-967d-d52dac26f314): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 15:12:41 crc kubenswrapper[4810]: E0219 15:12:41.217186 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-blpmq" podUID="4127fef2-ef2b-4cc4-967d-d52dac26f314" Feb 19 15:12:41 crc kubenswrapper[4810]: I0219 15:12:41.405281 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-bf8444975-mbpl8"] Feb 19 15:12:41 crc kubenswrapper[4810]: I0219 15:12:41.412300 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2x9v9"] Feb 19 15:12:41 crc kubenswrapper[4810]: W0219 15:12:41.430873 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb72d3f7a_e418_4a21_af73_6a43ce3358c1.slice/crio-c7be3375c01757ff909a664a4f5366321aa5a0fc9a5b4c30ad042d60df9dfd32 WatchSource:0}: Error finding container c7be3375c01757ff909a664a4f5366321aa5a0fc9a5b4c30ad042d60df9dfd32: Status 404 returned error can't find the container with id c7be3375c01757ff909a664a4f5366321aa5a0fc9a5b4c30ad042d60df9dfd32 Feb 19 15:12:41 crc kubenswrapper[4810]: I0219 15:12:41.539137 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl"] Feb 19 15:12:41 crc kubenswrapper[4810]: W0219 15:12:41.547252 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a9911b7_9775_45a3_8eba_2418c6a8c7da.slice/crio-ec358f68ac8aafcdf69584f6ce0aea9148f028a0a525c1f72e78fe0953401995 WatchSource:0}: Error finding container ec358f68ac8aafcdf69584f6ce0aea9148f028a0a525c1f72e78fe0953401995: Status 404 returned error can't find the container with id 
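The cluster of "Unhandled Error" entries above is the kubelet dumping the full init-container spec each time a catalog-image pull is cancelled; the same failure then surfaces once as ErrImagePull and, on retry, as ImagePullBackOff. Both reasons are reported through the pod's container statuses, which is where a client would look for them. A minimal Go sketch of that check, assuming the k8s.io/api/core/v1 types; the function name podPullFailures is mine, not the kubelet's:

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

// podPullFailures lists (init) containers stuck in an image-pull failure,
// read from the same status fields the kubelet logs in the entries above.
func podPullFailures(pod *corev1.Pod) []string {
	var failed []string
	for _, list := range [][]corev1.ContainerStatus{pod.Status.InitContainerStatuses, pod.Status.ContainerStatuses} {
		for _, cs := range list {
			if w := cs.State.Waiting; w != nil && (w.Reason == "ErrImagePull" || w.Reason == "ImagePullBackOff") {
				failed = append(failed, fmt.Sprintf("%s: %s (%s)", cs.Name, cs.Image, w.Reason))
			}
		}
	}
	return failed
}

func main() {
	// A pod status mirroring community-operators-d5ks5 from the log.
	pod := &corev1.Pod{}
	pod.Status.InitContainerStatuses = []corev1.ContainerStatus{{
		Name:  "extract-content",
		Image: "registry.redhat.io/redhat/community-operator-index:v4.18",
		State: corev1.ContainerState{Waiting: &corev1.ContainerStateWaiting{
			Reason:  "ImagePullBackOff",
			Message: "Back-off pulling image ...",
		}},
	}}
	fmt.Println(podPullFailures(pod))
}
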
Feb 19 15:12:41 crc kubenswrapper[4810]: I0219 15:12:41.405281 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-bf8444975-mbpl8"]
Feb 19 15:12:41 crc kubenswrapper[4810]: I0219 15:12:41.412300 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2x9v9"]
Feb 19 15:12:41 crc kubenswrapper[4810]: W0219 15:12:41.430873 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb72d3f7a_e418_4a21_af73_6a43ce3358c1.slice/crio-c7be3375c01757ff909a664a4f5366321aa5a0fc9a5b4c30ad042d60df9dfd32 WatchSource:0}: Error finding container c7be3375c01757ff909a664a4f5366321aa5a0fc9a5b4c30ad042d60df9dfd32: Status 404 returned error can't find the container with id c7be3375c01757ff909a664a4f5366321aa5a0fc9a5b4c30ad042d60df9dfd32
Feb 19 15:12:41 crc kubenswrapper[4810]: I0219 15:12:41.539137 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl"]
Feb 19 15:12:41 crc kubenswrapper[4810]: W0219 15:12:41.547252 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a9911b7_9775_45a3_8eba_2418c6a8c7da.slice/crio-ec358f68ac8aafcdf69584f6ce0aea9148f028a0a525c1f72e78fe0953401995 WatchSource:0}: Error finding container ec358f68ac8aafcdf69584f6ce0aea9148f028a0a525c1f72e78fe0953401995: Status 404 returned error can't find the container with id ec358f68ac8aafcdf69584f6ce0aea9148f028a0a525c1f72e78fe0953401995
Feb 19 15:12:41 crc kubenswrapper[4810]: I0219 15:12:41.796993 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 19 15:12:41 crc kubenswrapper[4810]: I0219 15:12:41.798104 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 15:12:41 crc kubenswrapper[4810]: I0219 15:12:41.800992 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 19 15:12:41 crc kubenswrapper[4810]: I0219 15:12:41.801077 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 19 15:12:41 crc kubenswrapper[4810]: I0219 15:12:41.810932 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 19 15:12:41 crc kubenswrapper[4810]: I0219 15:12:41.920955 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b0ed7553-c229-4444-9e9d-53a16d271385-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b0ed7553-c229-4444-9e9d-53a16d271385\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 15:12:41 crc kubenswrapper[4810]: I0219 15:12:41.921043 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b0ed7553-c229-4444-9e9d-53a16d271385-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b0ed7553-c229-4444-9e9d-53a16d271385\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 15:12:42 crc kubenswrapper[4810]: I0219 15:12:42.016770 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl" event={"ID":"3a9911b7-9775-45a3-8eba-2418c6a8c7da","Type":"ContainerStarted","Data":"05732258b05d3841e954e41368cf1974cbd08c6655bd488dd5b6669cff2eff10"}
Feb 19 15:12:42 crc kubenswrapper[4810]: I0219 15:12:42.016821 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl" event={"ID":"3a9911b7-9775-45a3-8eba-2418c6a8c7da","Type":"ContainerStarted","Data":"ec358f68ac8aafcdf69584f6ce0aea9148f028a0a525c1f72e78fe0953401995"}
Feb 19 15:12:42 crc kubenswrapper[4810]: I0219 15:12:42.019204 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bf8444975-mbpl8" event={"ID":"f3507810-2d89-4c4c-bb78-f824d27a6b33","Type":"ContainerStarted","Data":"a9f9e44ebaf4a998457e0db44c51717fbe64d4e0dd456c466ed47a6c88be4c47"}
Feb 19 15:12:42 crc kubenswrapper[4810]: I0219 15:12:42.019272 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bf8444975-mbpl8" event={"ID":"f3507810-2d89-4c4c-bb78-f824d27a6b33","Type":"ContainerStarted","Data":"ae4ec9dd615a7b98a2515fa91901e518b04ab1ddd1e2dc3384fefaeb39a9e580"}
Feb 19 15:12:42 crc kubenswrapper[4810]: I0219 15:12:42.019643 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-bf8444975-mbpl8" podUID="f3507810-2d89-4c4c-bb78-f824d27a6b33" containerName="controller-manager" containerID="cri-o://a9f9e44ebaf4a998457e0db44c51717fbe64d4e0dd456c466ed47a6c88be4c47" gracePeriod=30
Feb 19 15:12:42 crc kubenswrapper[4810]: I0219 15:12:42.019918 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-bf8444975-mbpl8"
Feb 19 15:12:42 crc kubenswrapper[4810]: I0219 15:12:42.021335 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2x9v9" event={"ID":"b72d3f7a-e418-4a21-af73-6a43ce3358c1","Type":"ContainerStarted","Data":"c7be3375c01757ff909a664a4f5366321aa5a0fc9a5b4c30ad042d60df9dfd32"}
Feb 19 15:12:42 crc kubenswrapper[4810]: I0219 15:12:42.021799 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b0ed7553-c229-4444-9e9d-53a16d271385-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b0ed7553-c229-4444-9e9d-53a16d271385\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 15:12:42 crc kubenswrapper[4810]: I0219 15:12:42.021856 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b0ed7553-c229-4444-9e9d-53a16d271385-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b0ed7553-c229-4444-9e9d-53a16d271385\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 15:12:42 crc kubenswrapper[4810]: I0219 15:12:42.022195 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b0ed7553-c229-4444-9e9d-53a16d271385-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b0ed7553-c229-4444-9e9d-53a16d271385\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 15:12:42 crc kubenswrapper[4810]: I0219 15:12:42.024442 4810 generic.go:334] "Generic (PLEG): container finished" podID="3bf3315d-3d2f-4aeb-b925-c3832e102e85" containerID="2338631bd769d72dd75e6311ec37172a35cf219ae1ae08bee9e394a35d599110" exitCode=0
Feb 19 15:12:42 crc kubenswrapper[4810]: I0219 15:12:42.024512 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rk4vw" event={"ID":"3bf3315d-3d2f-4aeb-b925-c3832e102e85","Type":"ContainerDied","Data":"2338631bd769d72dd75e6311ec37172a35cf219ae1ae08bee9e394a35d599110"}
Feb 19 15:12:42 crc kubenswrapper[4810]: I0219 15:12:42.030608 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp8sg" event={"ID":"3146bc9a-c4fc-4aa1-acae-032db4aa0582","Type":"ContainerStarted","Data":"678210fe6fd1c3abf47690dcbbfba0fc503dc30b786177c25e50c7cce621be3d"}
Feb 19 15:12:42 crc kubenswrapper[4810]: I0219 15:12:42.038286 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v87ss" event={"ID":"ee54de34-1c90-401d-8102-2cc1e4116661","Type":"ContainerStarted","Data":"d8d3ab6d086deece4cdefec9736b01ba8f86997b96e710debe90df076af13fd1"}
Feb 19 15:12:42 crc kubenswrapper[4810]: I0219 15:12:42.040365 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dkppn" event={"ID":"7a29951a-027e-49b4-a7ea-a8e363942414","Type":"ContainerStarted","Data":"a924c306a0172ac4f69e59d668d26991f48bbb4e8ddce89d336a0e0fb6dfb9e5"}
Feb 19 15:12:42 crc kubenswrapper[4810]: I0219 15:12:42.041371 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-dkppn"
Feb 19 15:12:42 crc kubenswrapper[4810]: I0219 15:12:42.041455 4810 patch_prober.go:28] interesting pod/downloads-7954f5f757-dkppn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body=
Feb 19 15:12:42 crc kubenswrapper[4810]: I0219 15:12:42.041508 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dkppn" podUID="7a29951a-027e-49b4-a7ea-a8e363942414" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused"
Feb 19 15:12:42 crc kubenswrapper[4810]: I0219 15:12:42.052594 4810 patch_prober.go:28] interesting pod/controller-manager-bf8444975-mbpl8 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": read tcp 10.217.0.2:59546->10.217.0.55:8443: read: connection reset by peer" start-of-body=
Feb 19 15:12:42 crc kubenswrapper[4810]: I0219 15:12:42.052679 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-bf8444975-mbpl8" podUID="f3507810-2d89-4c4c-bb78-f824d27a6b33" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": read tcp 10.217.0.2:59546->10.217.0.55:8443: read: connection reset by peer"
Feb 19 15:12:42 crc kubenswrapper[4810]: I0219 15:12:42.054225 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b0ed7553-c229-4444-9e9d-53a16d271385-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b0ed7553-c229-4444-9e9d-53a16d271385\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 15:12:42 crc kubenswrapper[4810]: E0219 15:12:42.056853 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-ptbh9" podUID="7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53"
Feb 19 15:12:42 crc kubenswrapper[4810]: E0219 15:12:42.056926 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-dfqg8" podUID="5c654206-f2d0-4b40-9df0-577dbf27e5e4"
Feb 19 15:12:42 crc kubenswrapper[4810]: E0219 15:12:42.058204 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-blpmq" podUID="4127fef2-ef2b-4cc4-967d-d52dac26f314"
Feb 19 15:12:42 crc kubenswrapper[4810]: I0219 15:12:42.072694 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-bf8444975-mbpl8" podStartSLOduration=26.072673906 podStartE2EDuration="26.072673906s" podCreationTimestamp="2026-02-19 15:12:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:12:42.054110575 +0000 UTC m=+191.536140699" watchObservedRunningTime="2026-02-19 15:12:42.072673906 +0000 UTC m=+191.554704030"
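The patch_prober/prober pairs above record readiness checks failing with "connection refused" or "connection reset by peer" while the target containers restart or have not yet begun listening; an HTTP probe is, at bottom, a GET judged by status code, with transport errors counted as failures. A minimal Go sketch of one such check, assuming a plain GET with a per-probe timeout (the kubelet's real prober additionally sets headers, resolves pod IPs, and applies failure thresholds; probeHTTP is my name for it):

package main

import (
	"fmt"
	"net/http"
	"time"
)

// probeHTTP performs one readiness-style check against url, treating any
// 2xx/3xx status as success and transport errors (e.g. "dial tcp ...:
// connect: connection refused", as in the entries above) as failure.
func probeHTTP(url string, timeout time.Duration) (ok bool, detail string) {
	client := &http.Client{Timeout: timeout}
	resp, err := client.Get(url)
	if err != nil {
		return false, err.Error()
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		return true, resp.Status
	}
	return false, resp.Status
}

func main() {
	// The downloads pod probe target from the log; this fails until the
	// download-server container is actually listening on 10.217.0.19:8080.
	ok, detail := probeHTTP("http://10.217.0.19:8080/", 1*time.Second)
	fmt.Printf("ready=%v (%s)\n", ok, detail)
}
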
Feb 19 15:12:42 crc kubenswrapper[4810]: I0219 15:12:42.121760 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 15:12:42 crc kubenswrapper[4810]: I0219 15:12:42.341021 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.051450 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bf8444975-mbpl8"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.061767 4810 generic.go:334] "Generic (PLEG): container finished" podID="3146bc9a-c4fc-4aa1-acae-032db4aa0582" containerID="678210fe6fd1c3abf47690dcbbfba0fc503dc30b786177c25e50c7cce621be3d" exitCode=0
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.061853 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp8sg" event={"ID":"3146bc9a-c4fc-4aa1-acae-032db4aa0582","Type":"ContainerDied","Data":"678210fe6fd1c3abf47690dcbbfba0fc503dc30b786177c25e50c7cce621be3d"}
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.063237 4810 generic.go:334] "Generic (PLEG): container finished" podID="f3507810-2d89-4c4c-bb78-f824d27a6b33" containerID="a9f9e44ebaf4a998457e0db44c51717fbe64d4e0dd456c466ed47a6c88be4c47" exitCode=0
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.063303 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bf8444975-mbpl8"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.063349 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bf8444975-mbpl8" event={"ID":"f3507810-2d89-4c4c-bb78-f824d27a6b33","Type":"ContainerDied","Data":"a9f9e44ebaf4a998457e0db44c51717fbe64d4e0dd456c466ed47a6c88be4c47"}
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.063402 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bf8444975-mbpl8" event={"ID":"f3507810-2d89-4c4c-bb78-f824d27a6b33","Type":"ContainerDied","Data":"ae4ec9dd615a7b98a2515fa91901e518b04ab1ddd1e2dc3384fefaeb39a9e580"}
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.063431 4810 scope.go:117] "RemoveContainer" containerID="a9f9e44ebaf4a998457e0db44c51717fbe64d4e0dd456c466ed47a6c88be4c47"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.065181 4810 generic.go:334] "Generic (PLEG): container finished" podID="ee54de34-1c90-401d-8102-2cc1e4116661" containerID="d8d3ab6d086deece4cdefec9736b01ba8f86997b96e710debe90df076af13fd1" exitCode=0
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.065210 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v87ss" event={"ID":"ee54de34-1c90-401d-8102-2cc1e4116661","Type":"ContainerDied","Data":"d8d3ab6d086deece4cdefec9736b01ba8f86997b96e710debe90df076af13fd1"}
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.067653 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2x9v9" event={"ID":"b72d3f7a-e418-4a21-af73-6a43ce3358c1","Type":"ContainerStarted","Data":"d44f15231af8cb21f7ccee9cd51d7426e3bca1e6b8b1f3e479c05ee5769791bb"}
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.067689 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2x9v9" event={"ID":"b72d3f7a-e418-4a21-af73-6a43ce3358c1","Type":"ContainerStarted","Data":"938fef66a55244a280a168ee18a5bb33fcb5ba9de655a49d6c5ffcf64eef4843"}
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.086229 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b0ed7553-c229-4444-9e9d-53a16d271385","Type":"ContainerStarted","Data":"cbf5f2fe685e0232708247bcf761dd389dbfe07942605aa730596520aea1f4e5"}
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.086272 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b0ed7553-c229-4444-9e9d-53a16d271385","Type":"ContainerStarted","Data":"ca55f8aae7ec39babf30ac9cbecf6ad25f43591f6457b091b32a87ba31a0bf92"}
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.086386 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl" podUID="3a9911b7-9775-45a3-8eba-2418c6a8c7da" containerName="route-controller-manager" containerID="cri-o://05732258b05d3841e954e41368cf1974cbd08c6655bd488dd5b6669cff2eff10" gracePeriod=30
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.086814 4810 patch_prober.go:28] interesting pod/downloads-7954f5f757-dkppn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body=
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.086847 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.086881 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dkppn" podUID="7a29951a-027e-49b4-a7ea-a8e363942414" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.094625 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.096090 4810 scope.go:117] "RemoveContainer" containerID="a9f9e44ebaf4a998457e0db44c51717fbe64d4e0dd456c466ed47a6c88be4c47"
Feb 19 15:12:43 crc kubenswrapper[4810]: E0219 15:12:43.096607 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9f9e44ebaf4a998457e0db44c51717fbe64d4e0dd456c466ed47a6c88be4c47\": container with ID starting with a9f9e44ebaf4a998457e0db44c51717fbe64d4e0dd456c466ed47a6c88be4c47 not found: ID does not exist" containerID="a9f9e44ebaf4a998457e0db44c51717fbe64d4e0dd456c466ed47a6c88be4c47"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.096648 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9f9e44ebaf4a998457e0db44c51717fbe64d4e0dd456c466ed47a6c88be4c47"} err="failed to get container status \"a9f9e44ebaf4a998457e0db44c51717fbe64d4e0dd456c466ed47a6c88be4c47\": rpc error: code = NotFound desc = could not find container \"a9f9e44ebaf4a998457e0db44c51717fbe64d4e0dd456c466ed47a6c88be4c47\": container with ID starting with a9f9e44ebaf4a998457e0db44c51717fbe64d4e0dd456c466ed47a6c88be4c47 not found: ID does not exist"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.106749 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b7784947b-gg4f2"]
Feb 19 15:12:43 crc kubenswrapper[4810]: E0219 15:12:43.107076 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3507810-2d89-4c4c-bb78-f824d27a6b33" containerName="controller-manager"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.107096 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3507810-2d89-4c4c-bb78-f824d27a6b33" containerName="controller-manager"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.107243 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3507810-2d89-4c4c-bb78-f824d27a6b33" containerName="controller-manager"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.107674 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b7784947b-gg4f2"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.110084 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b7784947b-gg4f2"]
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.152167 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl" podStartSLOduration=27.152150362 podStartE2EDuration="27.152150362s" podCreationTimestamp="2026-02-19 15:12:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:12:43.148297096 +0000 UTC m=+192.630327220" watchObservedRunningTime="2026-02-19 15:12:43.152150362 +0000 UTC m=+192.634180486"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.237172 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3507810-2d89-4c4c-bb78-f824d27a6b33-config\") pod \"f3507810-2d89-4c4c-bb78-f824d27a6b33\" (UID: \"f3507810-2d89-4c4c-bb78-f824d27a6b33\") "
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.237233 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f3507810-2d89-4c4c-bb78-f824d27a6b33-proxy-ca-bundles\") pod \"f3507810-2d89-4c4c-bb78-f824d27a6b33\" (UID: \"f3507810-2d89-4c4c-bb78-f824d27a6b33\") "
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.237272 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f3507810-2d89-4c4c-bb78-f824d27a6b33-client-ca\") pod \"f3507810-2d89-4c4c-bb78-f824d27a6b33\" (UID: \"f3507810-2d89-4c4c-bb78-f824d27a6b33\") "
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.237346 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3507810-2d89-4c4c-bb78-f824d27a6b33-serving-cert\") pod \"f3507810-2d89-4c4c-bb78-f824d27a6b33\" (UID: \"f3507810-2d89-4c4c-bb78-f824d27a6b33\") "
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.237406 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptwvr\" (UniqueName: \"kubernetes.io/projected/f3507810-2d89-4c4c-bb78-f824d27a6b33-kube-api-access-ptwvr\") pod \"f3507810-2d89-4c4c-bb78-f824d27a6b33\" (UID: \"f3507810-2d89-4c4c-bb78-f824d27a6b33\") "
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.237586 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b5a4ac5-f403-4492-93a9-eca271fc69cf-config\") pod \"controller-manager-5b7784947b-gg4f2\" (UID: \"0b5a4ac5-f403-4492-93a9-eca271fc69cf\") " pod="openshift-controller-manager/controller-manager-5b7784947b-gg4f2"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.237630 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b5a4ac5-f403-4492-93a9-eca271fc69cf-client-ca\") pod \"controller-manager-5b7784947b-gg4f2\" (UID: \"0b5a4ac5-f403-4492-93a9-eca271fc69cf\") " pod="openshift-controller-manager/controller-manager-5b7784947b-gg4f2"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.237753 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b5a4ac5-f403-4492-93a9-eca271fc69cf-proxy-ca-bundles\") pod \"controller-manager-5b7784947b-gg4f2\" (UID: \"0b5a4ac5-f403-4492-93a9-eca271fc69cf\") " pod="openshift-controller-manager/controller-manager-5b7784947b-gg4f2"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.237780 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpstf\" (UniqueName: \"kubernetes.io/projected/0b5a4ac5-f403-4492-93a9-eca271fc69cf-kube-api-access-zpstf\") pod \"controller-manager-5b7784947b-gg4f2\" (UID: \"0b5a4ac5-f403-4492-93a9-eca271fc69cf\") " pod="openshift-controller-manager/controller-manager-5b7784947b-gg4f2"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.237824 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b5a4ac5-f403-4492-93a9-eca271fc69cf-serving-cert\") pod \"controller-manager-5b7784947b-gg4f2\" (UID: \"0b5a4ac5-f403-4492-93a9-eca271fc69cf\") " pod="openshift-controller-manager/controller-manager-5b7784947b-gg4f2"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.238966 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3507810-2d89-4c4c-bb78-f824d27a6b33-config" (OuterVolumeSpecName: "config") pod "f3507810-2d89-4c4c-bb78-f824d27a6b33" (UID: "f3507810-2d89-4c4c-bb78-f824d27a6b33"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.240265 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3507810-2d89-4c4c-bb78-f824d27a6b33-client-ca" (OuterVolumeSpecName: "client-ca") pod "f3507810-2d89-4c4c-bb78-f824d27a6b33" (UID: "f3507810-2d89-4c4c-bb78-f824d27a6b33"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.240998 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3507810-2d89-4c4c-bb78-f824d27a6b33-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f3507810-2d89-4c4c-bb78-f824d27a6b33" (UID: "f3507810-2d89-4c4c-bb78-f824d27a6b33"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.248117 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3507810-2d89-4c4c-bb78-f824d27a6b33-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f3507810-2d89-4c4c-bb78-f824d27a6b33" (UID: "f3507810-2d89-4c4c-bb78-f824d27a6b33"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.248976 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3507810-2d89-4c4c-bb78-f824d27a6b33-kube-api-access-ptwvr" (OuterVolumeSpecName: "kube-api-access-ptwvr") pod "f3507810-2d89-4c4c-bb78-f824d27a6b33" (UID: "f3507810-2d89-4c4c-bb78-f824d27a6b33"). InnerVolumeSpecName "kube-api-access-ptwvr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.339567 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b5a4ac5-f403-4492-93a9-eca271fc69cf-config\") pod \"controller-manager-5b7784947b-gg4f2\" (UID: \"0b5a4ac5-f403-4492-93a9-eca271fc69cf\") " pod="openshift-controller-manager/controller-manager-5b7784947b-gg4f2"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.339638 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b5a4ac5-f403-4492-93a9-eca271fc69cf-client-ca\") pod \"controller-manager-5b7784947b-gg4f2\" (UID: \"0b5a4ac5-f403-4492-93a9-eca271fc69cf\") " pod="openshift-controller-manager/controller-manager-5b7784947b-gg4f2"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.339706 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b5a4ac5-f403-4492-93a9-eca271fc69cf-proxy-ca-bundles\") pod \"controller-manager-5b7784947b-gg4f2\" (UID: \"0b5a4ac5-f403-4492-93a9-eca271fc69cf\") " pod="openshift-controller-manager/controller-manager-5b7784947b-gg4f2"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.339736 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpstf\" (UniqueName: \"kubernetes.io/projected/0b5a4ac5-f403-4492-93a9-eca271fc69cf-kube-api-access-zpstf\") pod \"controller-manager-5b7784947b-gg4f2\" (UID: \"0b5a4ac5-f403-4492-93a9-eca271fc69cf\") " pod="openshift-controller-manager/controller-manager-5b7784947b-gg4f2"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.339775 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b5a4ac5-f403-4492-93a9-eca271fc69cf-serving-cert\") pod \"controller-manager-5b7784947b-gg4f2\" (UID: \"0b5a4ac5-f403-4492-93a9-eca271fc69cf\") " pod="openshift-controller-manager/controller-manager-5b7784947b-gg4f2"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.339823 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3507810-2d89-4c4c-bb78-f824d27a6b33-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.339839 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptwvr\" (UniqueName: \"kubernetes.io/projected/f3507810-2d89-4c4c-bb78-f824d27a6b33-kube-api-access-ptwvr\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.339851 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3507810-2d89-4c4c-bb78-f824d27a6b33-config\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.339860 4810 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f3507810-2d89-4c4c-bb78-f824d27a6b33-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.339870 4810 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f3507810-2d89-4c4c-bb78-f824d27a6b33-client-ca\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.341715 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b5a4ac5-f403-4492-93a9-eca271fc69cf-proxy-ca-bundles\") pod \"controller-manager-5b7784947b-gg4f2\" (UID: \"0b5a4ac5-f403-4492-93a9-eca271fc69cf\") " pod="openshift-controller-manager/controller-manager-5b7784947b-gg4f2"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.341837 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b5a4ac5-f403-4492-93a9-eca271fc69cf-client-ca\") pod \"controller-manager-5b7784947b-gg4f2\" (UID: \"0b5a4ac5-f403-4492-93a9-eca271fc69cf\") " pod="openshift-controller-manager/controller-manager-5b7784947b-gg4f2"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.343481 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b5a4ac5-f403-4492-93a9-eca271fc69cf-serving-cert\") pod \"controller-manager-5b7784947b-gg4f2\" (UID: \"0b5a4ac5-f403-4492-93a9-eca271fc69cf\") " pod="openshift-controller-manager/controller-manager-5b7784947b-gg4f2"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.350621 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b5a4ac5-f403-4492-93a9-eca271fc69cf-config\") pod \"controller-manager-5b7784947b-gg4f2\" (UID: \"0b5a4ac5-f403-4492-93a9-eca271fc69cf\") " pod="openshift-controller-manager/controller-manager-5b7784947b-gg4f2"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.357804 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpstf\" (UniqueName: \"kubernetes.io/projected/0b5a4ac5-f403-4492-93a9-eca271fc69cf-kube-api-access-zpstf\") pod \"controller-manager-5b7784947b-gg4f2\" (UID: \"0b5a4ac5-f403-4492-93a9-eca271fc69cf\") " pod="openshift-controller-manager/controller-manager-5b7784947b-gg4f2"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.399584 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-bf8444975-mbpl8"]
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.402731 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-bf8444975-mbpl8"]
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.425353 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b7784947b-gg4f2"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.464473 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3507810-2d89-4c4c-bb78-f824d27a6b33" path="/var/lib/kubelet/pods/f3507810-2d89-4c4c-bb78-f824d27a6b33/volumes"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.585811 4810 patch_prober.go:28] interesting pod/downloads-7954f5f757-dkppn container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body=
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.585877 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dkppn" podUID="7a29951a-027e-49b4-a7ea-a8e363942414" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.586034 4810 patch_prober.go:28] interesting pod/downloads-7954f5f757-dkppn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body=
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.586097 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dkppn" podUID="7a29951a-027e-49b4-a7ea-a8e363942414" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused"
Feb 19 15:12:43 crc kubenswrapper[4810]: I0219 15:12:43.655591 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b7784947b-gg4f2"]
Feb 19 15:12:43 crc kubenswrapper[4810]: W0219 15:12:43.665849 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b5a4ac5_f403_4492_93a9_eca271fc69cf.slice/crio-f33a645c1b493c26e6d3ed1478c77f60dcd1bd0b429730e5b33e0b76bf136c78 WatchSource:0}: Error finding container f33a645c1b493c26e6d3ed1478c77f60dcd1bd0b429730e5b33e0b76bf136c78: Status 404 returned error can't find the container with id f33a645c1b493c26e6d3ed1478c77f60dcd1bd0b429730e5b33e0b76bf136c78
Feb 19 15:12:44 crc kubenswrapper[4810]: I0219 15:12:44.097699 4810 generic.go:334] "Generic (PLEG): container finished" podID="3a9911b7-9775-45a3-8eba-2418c6a8c7da" containerID="05732258b05d3841e954e41368cf1974cbd08c6655bd488dd5b6669cff2eff10" exitCode=0
Feb 19 15:12:44 crc kubenswrapper[4810]: I0219 15:12:44.097826 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl" event={"ID":"3a9911b7-9775-45a3-8eba-2418c6a8c7da","Type":"ContainerDied","Data":"05732258b05d3841e954e41368cf1974cbd08c6655bd488dd5b6669cff2eff10"}
Feb 19 15:12:44 crc kubenswrapper[4810]: I0219 15:12:44.101665 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b7784947b-gg4f2" event={"ID":"0b5a4ac5-f403-4492-93a9-eca271fc69cf","Type":"ContainerStarted","Data":"f33a645c1b493c26e6d3ed1478c77f60dcd1bd0b429730e5b33e0b76bf136c78"}
Feb 19 15:12:44 crc kubenswrapper[4810]: I0219 15:12:44.103918 4810 generic.go:334] "Generic (PLEG): container finished" podID="b0ed7553-c229-4444-9e9d-53a16d271385" containerID="cbf5f2fe685e0232708247bcf761dd389dbfe07942605aa730596520aea1f4e5" exitCode=0
Feb 19 15:12:44 crc kubenswrapper[4810]: I0219 15:12:44.103966 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b0ed7553-c229-4444-9e9d-53a16d271385","Type":"ContainerDied","Data":"cbf5f2fe685e0232708247bcf761dd389dbfe07942605aa730596520aea1f4e5"}
Feb 19 15:12:44 crc kubenswrapper[4810]: I0219 15:12:44.105075 4810 patch_prober.go:28] interesting pod/downloads-7954f5f757-dkppn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body=
Feb 19 15:12:44 crc kubenswrapper[4810]: I0219 15:12:44.105372 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dkppn" podUID="7a29951a-027e-49b4-a7ea-a8e363942414" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused"
Feb 19 15:12:44 crc kubenswrapper[4810]: I0219 15:12:44.122796 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2x9v9" podStartSLOduration=171.122775384 podStartE2EDuration="2m51.122775384s" podCreationTimestamp="2026-02-19 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:12:44.119207056 +0000 UTC m=+193.601237056" watchObservedRunningTime="2026-02-19 15:12:44.122775384 +0000 UTC m=+193.604805518"
Feb 19 15:12:44 crc kubenswrapper[4810]: I0219 15:12:44.323585 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl"
Feb 19 15:12:44 crc kubenswrapper[4810]: I0219 15:12:44.455550 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a9911b7-9775-45a3-8eba-2418c6a8c7da-serving-cert\") pod \"3a9911b7-9775-45a3-8eba-2418c6a8c7da\" (UID: \"3a9911b7-9775-45a3-8eba-2418c6a8c7da\") "
Feb 19 15:12:44 crc kubenswrapper[4810]: I0219 15:12:44.455621 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a9911b7-9775-45a3-8eba-2418c6a8c7da-config\") pod \"3a9911b7-9775-45a3-8eba-2418c6a8c7da\" (UID: \"3a9911b7-9775-45a3-8eba-2418c6a8c7da\") "
Feb 19 15:12:44 crc kubenswrapper[4810]: I0219 15:12:44.455653 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvn9b\" (UniqueName: \"kubernetes.io/projected/3a9911b7-9775-45a3-8eba-2418c6a8c7da-kube-api-access-dvn9b\") pod \"3a9911b7-9775-45a3-8eba-2418c6a8c7da\" (UID: \"3a9911b7-9775-45a3-8eba-2418c6a8c7da\") "
Feb 19 15:12:44 crc kubenswrapper[4810]: I0219 15:12:44.455729 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a9911b7-9775-45a3-8eba-2418c6a8c7da-client-ca\") pod \"3a9911b7-9775-45a3-8eba-2418c6a8c7da\" (UID: \"3a9911b7-9775-45a3-8eba-2418c6a8c7da\") "
Feb 19 15:12:44 crc kubenswrapper[4810]: I0219 15:12:44.456457 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a9911b7-9775-45a3-8eba-2418c6a8c7da-client-ca" (OuterVolumeSpecName: "client-ca") pod "3a9911b7-9775-45a3-8eba-2418c6a8c7da" (UID: "3a9911b7-9775-45a3-8eba-2418c6a8c7da"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 15:12:44 crc kubenswrapper[4810]: I0219 15:12:44.456463 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a9911b7-9775-45a3-8eba-2418c6a8c7da-config" (OuterVolumeSpecName: "config") pod "3a9911b7-9775-45a3-8eba-2418c6a8c7da" (UID: "3a9911b7-9775-45a3-8eba-2418c6a8c7da"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 15:12:44 crc kubenswrapper[4810]: I0219 15:12:44.462901 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a9911b7-9775-45a3-8eba-2418c6a8c7da-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3a9911b7-9775-45a3-8eba-2418c6a8c7da" (UID: "3a9911b7-9775-45a3-8eba-2418c6a8c7da"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:12:44 crc kubenswrapper[4810]: I0219 15:12:44.463003 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a9911b7-9775-45a3-8eba-2418c6a8c7da-kube-api-access-dvn9b" (OuterVolumeSpecName: "kube-api-access-dvn9b") pod "3a9911b7-9775-45a3-8eba-2418c6a8c7da" (UID: "3a9911b7-9775-45a3-8eba-2418c6a8c7da"). InnerVolumeSpecName "kube-api-access-dvn9b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:12:44 crc kubenswrapper[4810]: I0219 15:12:44.556927 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a9911b7-9775-45a3-8eba-2418c6a8c7da-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:44 crc kubenswrapper[4810]: I0219 15:12:44.556970 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a9911b7-9775-45a3-8eba-2418c6a8c7da-config\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:44 crc kubenswrapper[4810]: I0219 15:12:44.556983 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvn9b\" (UniqueName: \"kubernetes.io/projected/3a9911b7-9775-45a3-8eba-2418c6a8c7da-kube-api-access-dvn9b\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:44 crc kubenswrapper[4810]: I0219 15:12:44.556993 4810 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a9911b7-9775-45a3-8eba-2418c6a8c7da-client-ca\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.113714 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b7784947b-gg4f2" event={"ID":"0b5a4ac5-f403-4492-93a9-eca271fc69cf","Type":"ContainerStarted","Data":"42468dd1baefcb60fe30e73c95955a3bb2eaf741c06d1f46bf0941fa5d8419da"}
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.114176 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5b7784947b-gg4f2"
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.117800 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl"
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.120580 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl" event={"ID":"3a9911b7-9775-45a3-8eba-2418c6a8c7da","Type":"ContainerDied","Data":"ec358f68ac8aafcdf69584f6ce0aea9148f028a0a525c1f72e78fe0953401995"}
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.120638 4810 scope.go:117] "RemoveContainer" containerID="05732258b05d3841e954e41368cf1974cbd08c6655bd488dd5b6669cff2eff10"
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.125652 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b7784947b-gg4f2"
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.141915 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5b7784947b-gg4f2" podStartSLOduration=9.141891669 podStartE2EDuration="9.141891669s" podCreationTimestamp="2026-02-19 15:12:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:12:45.132294985 +0000 UTC m=+194.614325119" watchObservedRunningTime="2026-02-19 15:12:45.141891669 +0000 UTC m=+194.623921793"
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.181248 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl"]
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.185002 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58bf9f9fdf-sfcvl"]
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.438364 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.449953 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a9911b7-9775-45a3-8eba-2418c6a8c7da" path="/var/lib/kubelet/pods/3a9911b7-9775-45a3-8eba-2418c6a8c7da/volumes"
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.568045 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b0ed7553-c229-4444-9e9d-53a16d271385-kube-api-access\") pod \"b0ed7553-c229-4444-9e9d-53a16d271385\" (UID: \"b0ed7553-c229-4444-9e9d-53a16d271385\") "
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.568203 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b0ed7553-c229-4444-9e9d-53a16d271385-kubelet-dir\") pod \"b0ed7553-c229-4444-9e9d-53a16d271385\" (UID: \"b0ed7553-c229-4444-9e9d-53a16d271385\") "
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.568517 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0ed7553-c229-4444-9e9d-53a16d271385-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b0ed7553-c229-4444-9e9d-53a16d271385" (UID: "b0ed7553-c229-4444-9e9d-53a16d271385"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.579110 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0ed7553-c229-4444-9e9d-53a16d271385-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b0ed7553-c229-4444-9e9d-53a16d271385" (UID: "b0ed7553-c229-4444-9e9d-53a16d271385"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.670272 4810 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b0ed7553-c229-4444-9e9d-53a16d271385-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.670319 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b0ed7553-c229-4444-9e9d-53a16d271385-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.822426 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c"]
Feb 19 15:12:45 crc kubenswrapper[4810]: E0219 15:12:45.823226 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ed7553-c229-4444-9e9d-53a16d271385" containerName="pruner"
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.823241 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ed7553-c229-4444-9e9d-53a16d271385" containerName="pruner"
Feb 19 15:12:45 crc kubenswrapper[4810]: E0219 15:12:45.823253 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a9911b7-9775-45a3-8eba-2418c6a8c7da" containerName="route-controller-manager"
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.823261 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a9911b7-9775-45a3-8eba-2418c6a8c7da" containerName="route-controller-manager"
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.823423 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a9911b7-9775-45a3-8eba-2418c6a8c7da" containerName="route-controller-manager"
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.823437 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0ed7553-c229-4444-9e9d-53a16d271385" containerName="pruner"
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.824116 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c"
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.826738 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.826743 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.833257 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c"]
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.836230 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.836232 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.836376 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.836814 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.973912 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4526d27-e1ba-4a55-b017-2f7f003521b4-client-ca\") pod \"route-controller-manager-57ffd7999b-9kf5c\" (UID: \"f4526d27-e1ba-4a55-b017-2f7f003521b4\") " pod="openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c"
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.973950 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4526d27-e1ba-4a55-b017-2f7f003521b4-serving-cert\") pod \"route-controller-manager-57ffd7999b-9kf5c\" (UID: \"f4526d27-e1ba-4a55-b017-2f7f003521b4\") " pod="openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c"
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.974459 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4526d27-e1ba-4a55-b017-2f7f003521b4-config\") pod \"route-controller-manager-57ffd7999b-9kf5c\" (UID: \"f4526d27-e1ba-4a55-b017-2f7f003521b4\") " pod="openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c"
Feb 19 15:12:45 crc kubenswrapper[4810]: I0219 15:12:45.974703 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb2f2\" (UniqueName: \"kubernetes.io/projected/f4526d27-e1ba-4a55-b017-2f7f003521b4-kube-api-access-kb2f2\") pod \"route-controller-manager-57ffd7999b-9kf5c\" (UID: \"f4526d27-e1ba-4a55-b017-2f7f003521b4\") " pod="openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c"
Feb 19 15:12:46 crc kubenswrapper[4810]: I0219 15:12:46.076121 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4526d27-e1ba-4a55-b017-2f7f003521b4-config\") pod \"route-controller-manager-57ffd7999b-9kf5c\" (UID: \"f4526d27-e1ba-4a55-b017-2f7f003521b4\") " pod="openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c"
Feb 19 15:12:46 crc kubenswrapper[4810]: I0219 15:12:46.076199 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb2f2\" (UniqueName: \"kubernetes.io/projected/f4526d27-e1ba-4a55-b017-2f7f003521b4-kube-api-access-kb2f2\") pod \"route-controller-manager-57ffd7999b-9kf5c\" (UID: \"f4526d27-e1ba-4a55-b017-2f7f003521b4\") " pod="openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c"
Feb 19 15:12:46 crc kubenswrapper[4810]: I0219 15:12:46.076241 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4526d27-e1ba-4a55-b017-2f7f003521b4-client-ca\") pod \"route-controller-manager-57ffd7999b-9kf5c\" (UID: \"f4526d27-e1ba-4a55-b017-2f7f003521b4\") " pod="openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c"
Feb 19 15:12:46 crc kubenswrapper[4810]: I0219 15:12:46.076266 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4526d27-e1ba-4a55-b017-2f7f003521b4-serving-cert\") pod \"route-controller-manager-57ffd7999b-9kf5c\" (UID: \"f4526d27-e1ba-4a55-b017-2f7f003521b4\") " pod="openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c"
Feb 19 15:12:46 crc kubenswrapper[4810]: I0219 15:12:46.077209 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4526d27-e1ba-4a55-b017-2f7f003521b4-client-ca\") pod \"route-controller-manager-57ffd7999b-9kf5c\" (UID: \"f4526d27-e1ba-4a55-b017-2f7f003521b4\") " pod="openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c"
Feb 19 15:12:46 crc kubenswrapper[4810]: I0219 15:12:46.078011 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4526d27-e1ba-4a55-b017-2f7f003521b4-config\") pod \"route-controller-manager-57ffd7999b-9kf5c\" (UID: \"f4526d27-e1ba-4a55-b017-2f7f003521b4\") " pod="openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c"
Feb 19 15:12:46 crc kubenswrapper[4810]: I0219 15:12:46.080930 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4526d27-e1ba-4a55-b017-2f7f003521b4-serving-cert\") pod \"route-controller-manager-57ffd7999b-9kf5c\" (UID: \"f4526d27-e1ba-4a55-b017-2f7f003521b4\") " pod="openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c"
Feb 19 15:12:46 crc kubenswrapper[4810]: I0219 15:12:46.096715 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb2f2\" (UniqueName: \"kubernetes.io/projected/f4526d27-e1ba-4a55-b017-2f7f003521b4-kube-api-access-kb2f2\") pod \"route-controller-manager-57ffd7999b-9kf5c\" (UID: \"f4526d27-e1ba-4a55-b017-2f7f003521b4\") " pod="openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c"
Feb 19 15:12:46 crc kubenswrapper[4810]: I0219 15:12:46.126355 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rk4vw" event={"ID":"3bf3315d-3d2f-4aeb-b925-c3832e102e85","Type":"ContainerStarted","Data":"077475bcb0b82366e9846158038585fb73fc53528d8e624159315394b6489745"}
Feb 19 15:12:46 crc kubenswrapper[4810]: I0219 15:12:46.130729 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b0ed7553-c229-4444-9e9d-53a16d271385","Type":"ContainerDied","Data":"ca55f8aae7ec39babf30ac9cbecf6ad25f43591f6457b091b32a87ba31a0bf92"}
Feb 19 15:12:46 crc kubenswrapper[4810]: I0219 15:12:46.130907 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca55f8aae7ec39babf30ac9cbecf6ad25f43591f6457b091b32a87ba31a0bf92"
Feb 19 15:12:46 crc kubenswrapper[4810]: I0219 15:12:46.130968 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 15:12:46 crc kubenswrapper[4810]: I0219 15:12:46.138772 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c"
Feb 19 15:12:47 crc kubenswrapper[4810]: I0219 15:12:47.162417 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rk4vw" podStartSLOduration=3.787131675 podStartE2EDuration="47.162392236s" podCreationTimestamp="2026-02-19 15:12:00 +0000 UTC" firstStartedPulling="2026-02-19 15:12:02.244116614 +0000 UTC m=+151.726146738" lastFinishedPulling="2026-02-19 15:12:45.619377175 +0000 UTC m=+195.101407299" observedRunningTime="2026-02-19 15:12:47.16139175 +0000 UTC m=+196.643421874" watchObservedRunningTime="2026-02-19 15:12:47.162392236 +0000 UTC m=+196.644422360"
Feb 19 15:12:47 crc kubenswrapper[4810]: I0219 15:12:47.224054 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c"]
Feb 19 15:12:47 crc kubenswrapper[4810]: W0219 15:12:47.231458 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4526d27_e1ba_4a55_b017_2f7f003521b4.slice/crio-0fb16d39f37e006c3ff91fbde25cbf07205195d0c458afc9aea3b63ed377d307 WatchSource:0}: Error finding container 0fb16d39f37e006c3ff91fbde25cbf07205195d0c458afc9aea3b63ed377d307: Status 404 returned error can't find the container with id 0fb16d39f37e006c3ff91fbde25cbf07205195d0c458afc9aea3b63ed377d307
Feb 19 15:12:48 crc kubenswrapper[4810]: I0219 15:12:48.143991 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v87ss" event={"ID":"ee54de34-1c90-401d-8102-2cc1e4116661","Type":"ContainerStarted","Data":"e76c00c840c072923d95803a179729bb3992d5d640b8cfc79a009aafae36db32"}
Feb 19 15:12:48 crc kubenswrapper[4810]: I0219 15:12:48.146677 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c" event={"ID":"f4526d27-e1ba-4a55-b017-2f7f003521b4","Type":"ContainerStarted","Data":"0fb16d39f37e006c3ff91fbde25cbf07205195d0c458afc9aea3b63ed377d307"}
Feb 19 15:12:48 crc kubenswrapper[4810]: I0219 15:12:48.161039 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v87ss" podStartSLOduration=3.7826496069999997 podStartE2EDuration="45.161022307s" podCreationTimestamp="2026-02-19 15:12:03 +0000 UTC" firstStartedPulling="2026-02-19 15:12:05.541928805 +0000 UTC m=+155.023958919" lastFinishedPulling="2026-02-19 15:12:46.920301495 +0000 UTC m=+196.402331619" observedRunningTime="2026-02-19 15:12:48.160300699 +0000 UTC m=+197.642330823"
watchObservedRunningTime="2026-02-19 15:12:48.161022307 +0000 UTC m=+197.643052431" Feb 19 15:12:48 crc kubenswrapper[4810]: I0219 15:12:48.206015 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 19 15:12:48 crc kubenswrapper[4810]: I0219 15:12:48.206681 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 15:12:48 crc kubenswrapper[4810]: I0219 15:12:48.212113 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 19 15:12:48 crc kubenswrapper[4810]: I0219 15:12:48.212555 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 19 15:12:48 crc kubenswrapper[4810]: I0219 15:12:48.213491 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 19 15:12:48 crc kubenswrapper[4810]: I0219 15:12:48.309944 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3788d870-2889-4190-9675-e4da44f69a71-kubelet-dir\") pod \"installer-9-crc\" (UID: \"3788d870-2889-4190-9675-e4da44f69a71\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 15:12:48 crc kubenswrapper[4810]: I0219 15:12:48.309998 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3788d870-2889-4190-9675-e4da44f69a71-var-lock\") pod \"installer-9-crc\" (UID: \"3788d870-2889-4190-9675-e4da44f69a71\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 15:12:48 crc kubenswrapper[4810]: I0219 15:12:48.310051 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3788d870-2889-4190-9675-e4da44f69a71-kube-api-access\") pod \"installer-9-crc\" (UID: \"3788d870-2889-4190-9675-e4da44f69a71\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 15:12:48 crc kubenswrapper[4810]: I0219 15:12:48.411500 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3788d870-2889-4190-9675-e4da44f69a71-kubelet-dir\") pod \"installer-9-crc\" (UID: \"3788d870-2889-4190-9675-e4da44f69a71\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 15:12:48 crc kubenswrapper[4810]: I0219 15:12:48.411577 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3788d870-2889-4190-9675-e4da44f69a71-var-lock\") pod \"installer-9-crc\" (UID: \"3788d870-2889-4190-9675-e4da44f69a71\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 15:12:48 crc kubenswrapper[4810]: I0219 15:12:48.411655 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3788d870-2889-4190-9675-e4da44f69a71-kube-api-access\") pod \"installer-9-crc\" (UID: \"3788d870-2889-4190-9675-e4da44f69a71\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 15:12:48 crc kubenswrapper[4810]: I0219 15:12:48.411659 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3788d870-2889-4190-9675-e4da44f69a71-kubelet-dir\") pod \"installer-9-crc\" (UID: 
\"3788d870-2889-4190-9675-e4da44f69a71\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 15:12:48 crc kubenswrapper[4810]: I0219 15:12:48.411815 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3788d870-2889-4190-9675-e4da44f69a71-var-lock\") pod \"installer-9-crc\" (UID: \"3788d870-2889-4190-9675-e4da44f69a71\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 15:12:48 crc kubenswrapper[4810]: I0219 15:12:48.434670 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3788d870-2889-4190-9675-e4da44f69a71-kube-api-access\") pod \"installer-9-crc\" (UID: \"3788d870-2889-4190-9675-e4da44f69a71\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 15:12:48 crc kubenswrapper[4810]: I0219 15:12:48.526796 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 15:12:48 crc kubenswrapper[4810]: I0219 15:12:48.957362 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 19 15:12:48 crc kubenswrapper[4810]: W0219 15:12:48.967456 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3788d870_2889_4190_9675_e4da44f69a71.slice/crio-edeb9d3a80185cfad5f2d137a73e03675b14c1a6f2b489fec04d11b503684967 WatchSource:0}: Error finding container edeb9d3a80185cfad5f2d137a73e03675b14c1a6f2b489fec04d11b503684967: Status 404 returned error can't find the container with id edeb9d3a80185cfad5f2d137a73e03675b14c1a6f2b489fec04d11b503684967 Feb 19 15:12:49 crc kubenswrapper[4810]: I0219 15:12:49.152684 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3788d870-2889-4190-9675-e4da44f69a71","Type":"ContainerStarted","Data":"edeb9d3a80185cfad5f2d137a73e03675b14c1a6f2b489fec04d11b503684967"} Feb 19 15:12:49 crc kubenswrapper[4810]: I0219 15:12:49.155220 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp8sg" event={"ID":"3146bc9a-c4fc-4aa1-acae-032db4aa0582","Type":"ContainerStarted","Data":"f7e09ee181e03814c9bffa4e23b2c1f95c88075e138b0f5ea2eafb7e14efd0cc"} Feb 19 15:12:49 crc kubenswrapper[4810]: I0219 15:12:49.157236 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c" event={"ID":"f4526d27-e1ba-4a55-b017-2f7f003521b4","Type":"ContainerStarted","Data":"7f9ff27afa865c538f30b8e556f3e9f6b3f4147d3617810bab36ea8dda821ac5"} Feb 19 15:12:49 crc kubenswrapper[4810]: I0219 15:12:49.171640 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gp8sg" podStartSLOduration=3.381662295 podStartE2EDuration="46.171626385s" podCreationTimestamp="2026-02-19 15:12:03 +0000 UTC" firstStartedPulling="2026-02-19 15:12:05.537626557 +0000 UTC m=+155.019656681" lastFinishedPulling="2026-02-19 15:12:48.327590647 +0000 UTC m=+197.809620771" observedRunningTime="2026-02-19 15:12:49.168519595 +0000 UTC m=+198.650549709" watchObservedRunningTime="2026-02-19 15:12:49.171626385 +0000 UTC m=+198.653656509" Feb 19 15:12:49 crc kubenswrapper[4810]: I0219 15:12:49.188979 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c" podStartSLOduration=13.18895462 
podStartE2EDuration="13.18895462s" podCreationTimestamp="2026-02-19 15:12:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:12:49.18467298 +0000 UTC m=+198.666703104" watchObservedRunningTime="2026-02-19 15:12:49.18895462 +0000 UTC m=+198.670984744" Feb 19 15:12:49 crc kubenswrapper[4810]: I0219 15:12:49.537541 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:12:49 crc kubenswrapper[4810]: I0219 15:12:49.537599 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:12:50 crc kubenswrapper[4810]: I0219 15:12:50.164337 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3788d870-2889-4190-9675-e4da44f69a71","Type":"ContainerStarted","Data":"df933f73e8a108c6e7ee4ac92776be1551ecd1fa292247fa01da1f7c1d90d486"} Feb 19 15:12:50 crc kubenswrapper[4810]: I0219 15:12:50.164639 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c" Feb 19 15:12:50 crc kubenswrapper[4810]: I0219 15:12:50.174015 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c" Feb 19 15:12:50 crc kubenswrapper[4810]: I0219 15:12:50.179051 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.179038082 podStartE2EDuration="2.179038082s" podCreationTimestamp="2026-02-19 15:12:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:12:50.177492212 +0000 UTC m=+199.659522326" watchObservedRunningTime="2026-02-19 15:12:50.179038082 +0000 UTC m=+199.661068206" Feb 19 15:12:50 crc kubenswrapper[4810]: I0219 15:12:50.869743 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rk4vw" Feb 19 15:12:50 crc kubenswrapper[4810]: I0219 15:12:50.869803 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rk4vw" Feb 19 15:12:51 crc kubenswrapper[4810]: I0219 15:12:51.436269 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rk4vw" Feb 19 15:12:51 crc kubenswrapper[4810]: I0219 15:12:51.475546 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rk4vw" Feb 19 15:12:53 crc kubenswrapper[4810]: I0219 15:12:53.596186 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-dkppn" Feb 19 15:12:53 crc kubenswrapper[4810]: I0219 15:12:53.979705 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gp8sg" 
Feb 19 15:12:53 crc kubenswrapper[4810]: I0219 15:12:53.981116 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gp8sg"
Feb 19 15:12:54 crc kubenswrapper[4810]: I0219 15:12:54.353135 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v87ss"
Feb 19 15:12:54 crc kubenswrapper[4810]: I0219 15:12:54.353466 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v87ss"
Feb 19 15:12:55 crc kubenswrapper[4810]: I0219 15:12:55.054039 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gp8sg" podUID="3146bc9a-c4fc-4aa1-acae-032db4aa0582" containerName="registry-server" probeResult="failure" output=<
Feb 19 15:12:55 crc kubenswrapper[4810]: timeout: failed to connect service ":50051" within 1s
Feb 19 15:12:55 crc kubenswrapper[4810]: >
Feb 19 15:12:55 crc kubenswrapper[4810]: I0219 15:12:55.392315 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v87ss" podUID="ee54de34-1c90-401d-8102-2cc1e4116661" containerName="registry-server" probeResult="failure" output=<
Feb 19 15:12:55 crc kubenswrapper[4810]: timeout: failed to connect service ":50051" within 1s
Feb 19 15:12:55 crc kubenswrapper[4810]: >
Feb 19 15:12:56 crc kubenswrapper[4810]: I0219 15:12:56.843932 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b7784947b-gg4f2"]
Feb 19 15:12:56 crc kubenswrapper[4810]: I0219 15:12:56.844396 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5b7784947b-gg4f2" podUID="0b5a4ac5-f403-4492-93a9-eca271fc69cf" containerName="controller-manager" containerID="cri-o://42468dd1baefcb60fe30e73c95955a3bb2eaf741c06d1f46bf0941fa5d8419da" gracePeriod=30
Feb 19 15:12:56 crc kubenswrapper[4810]: I0219 15:12:56.859412 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c"]
Feb 19 15:12:56 crc kubenswrapper[4810]: I0219 15:12:56.859654 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c" podUID="f4526d27-e1ba-4a55-b017-2f7f003521b4" containerName="route-controller-manager" containerID="cri-o://7f9ff27afa865c538f30b8e556f3e9f6b3f4147d3617810bab36ea8dda821ac5" gracePeriod=30
Feb 19 15:12:57 crc kubenswrapper[4810]: I0219 15:12:57.968075 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c"
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.010900 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw"]
Feb 19 15:12:58 crc kubenswrapper[4810]: E0219 15:12:58.011164 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4526d27-e1ba-4a55-b017-2f7f003521b4" containerName="route-controller-manager"
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.012522 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4526d27-e1ba-4a55-b017-2f7f003521b4" containerName="route-controller-manager"
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.012764 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4526d27-e1ba-4a55-b017-2f7f003521b4" containerName="route-controller-manager"
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.013364 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw"
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.017548 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw"]
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.106012 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b7784947b-gg4f2"
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.150983 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4526d27-e1ba-4a55-b017-2f7f003521b4-config\") pod \"f4526d27-e1ba-4a55-b017-2f7f003521b4\" (UID: \"f4526d27-e1ba-4a55-b017-2f7f003521b4\") "
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.151169 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb2f2\" (UniqueName: \"kubernetes.io/projected/f4526d27-e1ba-4a55-b017-2f7f003521b4-kube-api-access-kb2f2\") pod \"f4526d27-e1ba-4a55-b017-2f7f003521b4\" (UID: \"f4526d27-e1ba-4a55-b017-2f7f003521b4\") "
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.151290 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4526d27-e1ba-4a55-b017-2f7f003521b4-client-ca\") pod \"f4526d27-e1ba-4a55-b017-2f7f003521b4\" (UID: \"f4526d27-e1ba-4a55-b017-2f7f003521b4\") "
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.151310 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4526d27-e1ba-4a55-b017-2f7f003521b4-serving-cert\") pod \"f4526d27-e1ba-4a55-b017-2f7f003521b4\" (UID: \"f4526d27-e1ba-4a55-b017-2f7f003521b4\") "
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.152019 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4526d27-e1ba-4a55-b017-2f7f003521b4-client-ca" (OuterVolumeSpecName: "client-ca") pod "f4526d27-e1ba-4a55-b017-2f7f003521b4" (UID: "f4526d27-e1ba-4a55-b017-2f7f003521b4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.152036 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4526d27-e1ba-4a55-b017-2f7f003521b4-config" (OuterVolumeSpecName: "config") pod "f4526d27-e1ba-4a55-b017-2f7f003521b4" (UID: "f4526d27-e1ba-4a55-b017-2f7f003521b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.152473 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d474cf7-0ff3-43d0-88e8-19100a658851-client-ca\") pod \"route-controller-manager-6fb4659b76-cmjpw\" (UID: \"9d474cf7-0ff3-43d0-88e8-19100a658851\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw"
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.152527 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d474cf7-0ff3-43d0-88e8-19100a658851-serving-cert\") pod \"route-controller-manager-6fb4659b76-cmjpw\" (UID: \"9d474cf7-0ff3-43d0-88e8-19100a658851\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw"
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.152580 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d474cf7-0ff3-43d0-88e8-19100a658851-config\") pod \"route-controller-manager-6fb4659b76-cmjpw\" (UID: \"9d474cf7-0ff3-43d0-88e8-19100a658851\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw"
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.152608 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxmxw\" (UniqueName: \"kubernetes.io/projected/9d474cf7-0ff3-43d0-88e8-19100a658851-kube-api-access-nxmxw\") pod \"route-controller-manager-6fb4659b76-cmjpw\" (UID: \"9d474cf7-0ff3-43d0-88e8-19100a658851\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw"
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.152655 4810 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4526d27-e1ba-4a55-b017-2f7f003521b4-client-ca\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.152665 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4526d27-e1ba-4a55-b017-2f7f003521b4-config\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.158470 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4526d27-e1ba-4a55-b017-2f7f003521b4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f4526d27-e1ba-4a55-b017-2f7f003521b4" (UID: "f4526d27-e1ba-4a55-b017-2f7f003521b4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.161013 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4526d27-e1ba-4a55-b017-2f7f003521b4-kube-api-access-kb2f2" (OuterVolumeSpecName: "kube-api-access-kb2f2") pod "f4526d27-e1ba-4a55-b017-2f7f003521b4" (UID: "f4526d27-e1ba-4a55-b017-2f7f003521b4"). InnerVolumeSpecName "kube-api-access-kb2f2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.211967 4810 generic.go:334] "Generic (PLEG): container finished" podID="7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53" containerID="2ba8cb7edb1b0d7ff70efe1c621d14ae3cd3fc1499b0b814053d764332922921" exitCode=0
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.212058 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptbh9" event={"ID":"7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53","Type":"ContainerDied","Data":"2ba8cb7edb1b0d7ff70efe1c621d14ae3cd3fc1499b0b814053d764332922921"}
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.214687 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x8sn2" event={"ID":"cc8ce195-1fe1-4684-8172-e710b3552fb5","Type":"ContainerStarted","Data":"9ab517ec22370626097c33f80a892ee93279a0d966337802ec729d7271a4e233"}
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.216918 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-blpmq" event={"ID":"4127fef2-ef2b-4cc4-967d-d52dac26f314","Type":"ContainerStarted","Data":"b6b451b278f70ee6f85bbefa846a37e02c57266a0a01e80c05241319be84555e"}
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.224964 4810 generic.go:334] "Generic (PLEG): container finished" podID="5c654206-f2d0-4b40-9df0-577dbf27e5e4" containerID="471901a772079c16e7d5c328b745070f868fa529208773cf4e67ab79b2945769" exitCode=0
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.225044 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfqg8" event={"ID":"5c654206-f2d0-4b40-9df0-577dbf27e5e4","Type":"ContainerDied","Data":"471901a772079c16e7d5c328b745070f868fa529208773cf4e67ab79b2945769"}
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.227215 4810 generic.go:334] "Generic (PLEG): container finished" podID="f4526d27-e1ba-4a55-b017-2f7f003521b4" containerID="7f9ff27afa865c538f30b8e556f3e9f6b3f4147d3617810bab36ea8dda821ac5" exitCode=0
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.227272 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c" event={"ID":"f4526d27-e1ba-4a55-b017-2f7f003521b4","Type":"ContainerDied","Data":"7f9ff27afa865c538f30b8e556f3e9f6b3f4147d3617810bab36ea8dda821ac5"}
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.227292 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c" event={"ID":"f4526d27-e1ba-4a55-b017-2f7f003521b4","Type":"ContainerDied","Data":"0fb16d39f37e006c3ff91fbde25cbf07205195d0c458afc9aea3b63ed377d307"}
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.227307 4810 scope.go:117] "RemoveContainer" containerID="7f9ff27afa865c538f30b8e556f3e9f6b3f4147d3617810bab36ea8dda821ac5"
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.227431 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c"
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.229486 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5ks5" event={"ID":"9a3d6b1f-2011-4f7f-bea0-1d303007fe41","Type":"ContainerStarted","Data":"cf8a20a9712326d7f7917d432dc449c8f1126f425b48ed7c1328c76b1ca7b19c"}
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.232243 4810 generic.go:334] "Generic (PLEG): container finished" podID="0b5a4ac5-f403-4492-93a9-eca271fc69cf" containerID="42468dd1baefcb60fe30e73c95955a3bb2eaf741c06d1f46bf0941fa5d8419da" exitCode=0
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.232266 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b7784947b-gg4f2" event={"ID":"0b5a4ac5-f403-4492-93a9-eca271fc69cf","Type":"ContainerDied","Data":"42468dd1baefcb60fe30e73c95955a3bb2eaf741c06d1f46bf0941fa5d8419da"}
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.232280 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b7784947b-gg4f2" event={"ID":"0b5a4ac5-f403-4492-93a9-eca271fc69cf","Type":"ContainerDied","Data":"f33a645c1b493c26e6d3ed1478c77f60dcd1bd0b429730e5b33e0b76bf136c78"}
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.232362 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b7784947b-gg4f2"
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.254048 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b5a4ac5-f403-4492-93a9-eca271fc69cf-client-ca\") pod \"0b5a4ac5-f403-4492-93a9-eca271fc69cf\" (UID: \"0b5a4ac5-f403-4492-93a9-eca271fc69cf\") "
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.254146 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b5a4ac5-f403-4492-93a9-eca271fc69cf-proxy-ca-bundles\") pod \"0b5a4ac5-f403-4492-93a9-eca271fc69cf\" (UID: \"0b5a4ac5-f403-4492-93a9-eca271fc69cf\") "
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.254187 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpstf\" (UniqueName: \"kubernetes.io/projected/0b5a4ac5-f403-4492-93a9-eca271fc69cf-kube-api-access-zpstf\") pod \"0b5a4ac5-f403-4492-93a9-eca271fc69cf\" (UID: \"0b5a4ac5-f403-4492-93a9-eca271fc69cf\") "
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.254222 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b5a4ac5-f403-4492-93a9-eca271fc69cf-config\") pod \"0b5a4ac5-f403-4492-93a9-eca271fc69cf\" (UID: \"0b5a4ac5-f403-4492-93a9-eca271fc69cf\") "
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.254961 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b5a4ac5-f403-4492-93a9-eca271fc69cf-serving-cert\") pod \"0b5a4ac5-f403-4492-93a9-eca271fc69cf\" (UID: \"0b5a4ac5-f403-4492-93a9-eca271fc69cf\") "
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.255185 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d474cf7-0ff3-43d0-88e8-19100a658851-client-ca\") pod \"route-controller-manager-6fb4659b76-cmjpw\" (UID: \"9d474cf7-0ff3-43d0-88e8-19100a658851\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw"
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.255219 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d474cf7-0ff3-43d0-88e8-19100a658851-serving-cert\") pod \"route-controller-manager-6fb4659b76-cmjpw\" (UID: \"9d474cf7-0ff3-43d0-88e8-19100a658851\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw"
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.255267 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d474cf7-0ff3-43d0-88e8-19100a658851-config\") pod \"route-controller-manager-6fb4659b76-cmjpw\" (UID: \"9d474cf7-0ff3-43d0-88e8-19100a658851\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw"
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.255299 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxmxw\" (UniqueName: \"kubernetes.io/projected/9d474cf7-0ff3-43d0-88e8-19100a658851-kube-api-access-nxmxw\") pod \"route-controller-manager-6fb4659b76-cmjpw\" (UID: \"9d474cf7-0ff3-43d0-88e8-19100a658851\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw"
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.255371 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4526d27-e1ba-4a55-b017-2f7f003521b4-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.255384 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb2f2\" (UniqueName: \"kubernetes.io/projected/f4526d27-e1ba-4a55-b017-2f7f003521b4-kube-api-access-kb2f2\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.255656 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b5a4ac5-f403-4492-93a9-eca271fc69cf-client-ca" (OuterVolumeSpecName: "client-ca") pod "0b5a4ac5-f403-4492-93a9-eca271fc69cf" (UID: "0b5a4ac5-f403-4492-93a9-eca271fc69cf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.256038 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b5a4ac5-f403-4492-93a9-eca271fc69cf-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0b5a4ac5-f403-4492-93a9-eca271fc69cf" (UID: "0b5a4ac5-f403-4492-93a9-eca271fc69cf"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.256170 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b5a4ac5-f403-4492-93a9-eca271fc69cf-config" (OuterVolumeSpecName: "config") pod "0b5a4ac5-f403-4492-93a9-eca271fc69cf" (UID: "0b5a4ac5-f403-4492-93a9-eca271fc69cf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.257268 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d474cf7-0ff3-43d0-88e8-19100a658851-client-ca\") pod \"route-controller-manager-6fb4659b76-cmjpw\" (UID: \"9d474cf7-0ff3-43d0-88e8-19100a658851\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw"
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.259675 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d474cf7-0ff3-43d0-88e8-19100a658851-config\") pod \"route-controller-manager-6fb4659b76-cmjpw\" (UID: \"9d474cf7-0ff3-43d0-88e8-19100a658851\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw"
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.259806 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b5a4ac5-f403-4492-93a9-eca271fc69cf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b5a4ac5-f403-4492-93a9-eca271fc69cf" (UID: "0b5a4ac5-f403-4492-93a9-eca271fc69cf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.260691 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b5a4ac5-f403-4492-93a9-eca271fc69cf-kube-api-access-zpstf" (OuterVolumeSpecName: "kube-api-access-zpstf") pod "0b5a4ac5-f403-4492-93a9-eca271fc69cf" (UID: "0b5a4ac5-f403-4492-93a9-eca271fc69cf"). InnerVolumeSpecName "kube-api-access-zpstf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.267972 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d474cf7-0ff3-43d0-88e8-19100a658851-serving-cert\") pod \"route-controller-manager-6fb4659b76-cmjpw\" (UID: \"9d474cf7-0ff3-43d0-88e8-19100a658851\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw"
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.286532 4810 scope.go:117] "RemoveContainer" containerID="7f9ff27afa865c538f30b8e556f3e9f6b3f4147d3617810bab36ea8dda821ac5"
Feb 19 15:12:58 crc kubenswrapper[4810]: E0219 15:12:58.287529 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f9ff27afa865c538f30b8e556f3e9f6b3f4147d3617810bab36ea8dda821ac5\": container with ID starting with 7f9ff27afa865c538f30b8e556f3e9f6b3f4147d3617810bab36ea8dda821ac5 not found: ID does not exist" containerID="7f9ff27afa865c538f30b8e556f3e9f6b3f4147d3617810bab36ea8dda821ac5"
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.287638 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f9ff27afa865c538f30b8e556f3e9f6b3f4147d3617810bab36ea8dda821ac5"} err="failed to get container status \"7f9ff27afa865c538f30b8e556f3e9f6b3f4147d3617810bab36ea8dda821ac5\": rpc error: code = NotFound desc = could not find container \"7f9ff27afa865c538f30b8e556f3e9f6b3f4147d3617810bab36ea8dda821ac5\": container with ID starting with 7f9ff27afa865c538f30b8e556f3e9f6b3f4147d3617810bab36ea8dda821ac5 not found: ID does not exist"
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.287695 4810 scope.go:117] "RemoveContainer" containerID="42468dd1baefcb60fe30e73c95955a3bb2eaf741c06d1f46bf0941fa5d8419da"
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.291734 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxmxw\" (UniqueName: \"kubernetes.io/projected/9d474cf7-0ff3-43d0-88e8-19100a658851-kube-api-access-nxmxw\") pod \"route-controller-manager-6fb4659b76-cmjpw\" (UID: \"9d474cf7-0ff3-43d0-88e8-19100a658851\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw"
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.301814 4810 scope.go:117] "RemoveContainer" containerID="42468dd1baefcb60fe30e73c95955a3bb2eaf741c06d1f46bf0941fa5d8419da"
Feb 19 15:12:58 crc kubenswrapper[4810]: E0219 15:12:58.302259 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42468dd1baefcb60fe30e73c95955a3bb2eaf741c06d1f46bf0941fa5d8419da\": container with ID starting with 42468dd1baefcb60fe30e73c95955a3bb2eaf741c06d1f46bf0941fa5d8419da not found: ID does not exist" containerID="42468dd1baefcb60fe30e73c95955a3bb2eaf741c06d1f46bf0941fa5d8419da"
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.302292 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42468dd1baefcb60fe30e73c95955a3bb2eaf741c06d1f46bf0941fa5d8419da"} err="failed to get container status \"42468dd1baefcb60fe30e73c95955a3bb2eaf741c06d1f46bf0941fa5d8419da\": rpc error: code = NotFound desc = could not find container \"42468dd1baefcb60fe30e73c95955a3bb2eaf741c06d1f46bf0941fa5d8419da\": container with ID starting with 42468dd1baefcb60fe30e73c95955a3bb2eaf741c06d1f46bf0941fa5d8419da not found: ID does not exist"
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.331489 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw"
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.358710 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c"]
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.359369 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57ffd7999b-9kf5c"]
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.360492 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b5a4ac5-f403-4492-93a9-eca271fc69cf-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.360540 4810 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b5a4ac5-f403-4492-93a9-eca271fc69cf-client-ca\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.360554 4810 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b5a4ac5-f403-4492-93a9-eca271fc69cf-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.360571 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpstf\" (UniqueName: \"kubernetes.io/projected/0b5a4ac5-f403-4492-93a9-eca271fc69cf-kube-api-access-zpstf\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.360584 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b5a4ac5-f403-4492-93a9-eca271fc69cf-config\") on node \"crc\" DevicePath \"\""
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.563472 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b7784947b-gg4f2"]
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.566151 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5b7784947b-gg4f2"]
Feb 19 15:12:58 crc kubenswrapper[4810]: I0219 15:12:58.589093 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw"]
Feb 19 15:12:58 crc kubenswrapper[4810]: W0219 15:12:58.598406 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d474cf7_0ff3_43d0_88e8_19100a658851.slice/crio-61cb0e99accc7f3521da0dd967157522b71ffbbbce8920140bb606b51134702b WatchSource:0}: Error finding container 61cb0e99accc7f3521da0dd967157522b71ffbbbce8920140bb606b51134702b: Status 404 returned error can't find the container with id 61cb0e99accc7f3521da0dd967157522b71ffbbbce8920140bb606b51134702b
Feb 19 15:12:59 crc kubenswrapper[4810]: I0219 15:12:59.248030 4810 generic.go:334] "Generic (PLEG): container finished" podID="4127fef2-ef2b-4cc4-967d-d52dac26f314" containerID="b6b451b278f70ee6f85bbefa846a37e02c57266a0a01e80c05241319be84555e" exitCode=0
Feb 19 15:12:59 crc kubenswrapper[4810]: I0219 15:12:59.248088 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-blpmq" event={"ID":"4127fef2-ef2b-4cc4-967d-d52dac26f314","Type":"ContainerDied","Data":"b6b451b278f70ee6f85bbefa846a37e02c57266a0a01e80c05241319be84555e"}
Feb 19 15:12:59 crc kubenswrapper[4810]: I0219 15:12:59.253976 4810 generic.go:334] "Generic (PLEG): container finished" podID="cc8ce195-1fe1-4684-8172-e710b3552fb5" containerID="9ab517ec22370626097c33f80a892ee93279a0d966337802ec729d7271a4e233" exitCode=0
Feb 19 15:12:59 crc kubenswrapper[4810]: I0219 15:12:59.254048 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x8sn2" event={"ID":"cc8ce195-1fe1-4684-8172-e710b3552fb5","Type":"ContainerDied","Data":"9ab517ec22370626097c33f80a892ee93279a0d966337802ec729d7271a4e233"}
Feb 19 15:12:59 crc kubenswrapper[4810]: I0219 15:12:59.269109 4810 generic.go:334] "Generic (PLEG): container finished" podID="9a3d6b1f-2011-4f7f-bea0-1d303007fe41" containerID="cf8a20a9712326d7f7917d432dc449c8f1126f425b48ed7c1328c76b1ca7b19c" exitCode=0
Feb 19 15:12:59 crc kubenswrapper[4810]: I0219 15:12:59.269229 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5ks5" event={"ID":"9a3d6b1f-2011-4f7f-bea0-1d303007fe41","Type":"ContainerDied","Data":"cf8a20a9712326d7f7917d432dc449c8f1126f425b48ed7c1328c76b1ca7b19c"}
Feb 19 15:12:59 crc kubenswrapper[4810]: I0219 15:12:59.281550 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw" event={"ID":"9d474cf7-0ff3-43d0-88e8-19100a658851","Type":"ContainerStarted","Data":"a2b82759b901660a79c47e74ce7b041d5164f2bc3f90bc7fb84fbbfb09e3a3e7"}
Feb 19 15:12:59 crc kubenswrapper[4810]: I0219 15:12:59.281714 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw" event={"ID":"9d474cf7-0ff3-43d0-88e8-19100a658851","Type":"ContainerStarted","Data":"61cb0e99accc7f3521da0dd967157522b71ffbbbce8920140bb606b51134702b"}
Feb 19 15:12:59 crc kubenswrapper[4810]: I0219 15:12:59.281839 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw"
Feb 19 15:12:59 crc kubenswrapper[4810]: I0219 15:12:59.349475 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw" podStartSLOduration=3.349440675 podStartE2EDuration="3.349440675s" podCreationTimestamp="2026-02-19 15:12:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:12:59.347280289 +0000 UTC m=+208.829310423" watchObservedRunningTime="2026-02-19 15:12:59.349440675 +0000 UTC m=+208.831470839"
Feb 19 15:12:59 crc kubenswrapper[4810]: I0219 15:12:59.450196 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b5a4ac5-f403-4492-93a9-eca271fc69cf" path="/var/lib/kubelet/pods/0b5a4ac5-f403-4492-93a9-eca271fc69cf/volumes"
Feb 19 15:12:59 crc kubenswrapper[4810]: I0219 15:12:59.451296 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4526d27-e1ba-4a55-b017-2f7f003521b4" path="/var/lib/kubelet/pods/f4526d27-e1ba-4a55-b017-2f7f003521b4/volumes"
Feb 19 15:12:59 crc kubenswrapper[4810]: I0219 15:12:59.542386 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw"
Feb 19 15:13:00 crc kubenswrapper[4810]: I0219 15:13:00.829477 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-698c8bc94d-lz9st"]
Feb 19 15:13:00 crc kubenswrapper[4810]: E0219 15:13:00.830041 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b5a4ac5-f403-4492-93a9-eca271fc69cf" containerName="controller-manager"
Feb 19 15:13:00 crc kubenswrapper[4810]: I0219 15:13:00.830057 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b5a4ac5-f403-4492-93a9-eca271fc69cf" containerName="controller-manager"
Feb 19 15:13:00 crc kubenswrapper[4810]: I0219 15:13:00.830222 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b5a4ac5-f403-4492-93a9-eca271fc69cf" containerName="controller-manager"
Feb 19 15:13:00 crc kubenswrapper[4810]: I0219 15:13:00.830792 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-698c8bc94d-lz9st"
Feb 19 15:13:00 crc kubenswrapper[4810]: I0219 15:13:00.833824 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 19 15:13:00 crc kubenswrapper[4810]: I0219 15:13:00.834275 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 19 15:13:00 crc kubenswrapper[4810]: I0219 15:13:00.834354 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 19 15:13:00 crc kubenswrapper[4810]: I0219 15:13:00.838947 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 19 15:13:00 crc kubenswrapper[4810]: I0219 15:13:00.839435 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 19 15:13:00 crc kubenswrapper[4810]: I0219 15:13:00.840193 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 19 15:13:00 crc kubenswrapper[4810]: I0219 15:13:00.840953 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-698c8bc94d-lz9st"]
Feb 19 15:13:00 crc kubenswrapper[4810]: I0219 15:13:00.844955 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 19 15:13:00 crc kubenswrapper[4810]: I0219 15:13:00.995159 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-config\") pod \"controller-manager-698c8bc94d-lz9st\" (UID: \"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1\") " pod="openshift-controller-manager/controller-manager-698c8bc94d-lz9st"
Feb 19 15:13:00 crc kubenswrapper[4810]: I0219 15:13:00.995207 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-serving-cert\") pod \"controller-manager-698c8bc94d-lz9st\" (UID: \"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1\") " pod="openshift-controller-manager/controller-manager-698c8bc94d-lz9st"
Feb 19 15:13:00 crc kubenswrapper[4810]: I0219 15:13:00.995235 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-client-ca\") pod \"controller-manager-698c8bc94d-lz9st\" (UID: \"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1\") " pod="openshift-controller-manager/controller-manager-698c8bc94d-lz9st"
Feb 19 15:13:00 crc kubenswrapper[4810]: I0219 15:13:00.995262 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-proxy-ca-bundles\") pod \"controller-manager-698c8bc94d-lz9st\" (UID: \"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1\") " pod="openshift-controller-manager/controller-manager-698c8bc94d-lz9st"
Feb 19 15:13:00 crc kubenswrapper[4810]: I0219 15:13:00.995284 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfgv7\" (UniqueName: \"kubernetes.io/projected/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-kube-api-access-bfgv7\") pod \"controller-manager-698c8bc94d-lz9st\" (UID: \"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1\") " pod="openshift-controller-manager/controller-manager-698c8bc94d-lz9st"
Feb 19 15:13:01 crc kubenswrapper[4810]: I0219 15:13:01.096412 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-config\") pod \"controller-manager-698c8bc94d-lz9st\" (UID: \"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1\") " pod="openshift-controller-manager/controller-manager-698c8bc94d-lz9st"
Feb 19 15:13:01 crc kubenswrapper[4810]: I0219 15:13:01.096477 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-serving-cert\") pod \"controller-manager-698c8bc94d-lz9st\" (UID: \"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1\") " pod="openshift-controller-manager/controller-manager-698c8bc94d-lz9st"
Feb 19 15:13:01 crc kubenswrapper[4810]: I0219 15:13:01.096500 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-client-ca\") pod \"controller-manager-698c8bc94d-lz9st\" (UID: \"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1\") " pod="openshift-controller-manager/controller-manager-698c8bc94d-lz9st"
Feb 19 15:13:01 crc kubenswrapper[4810]: I0219 15:13:01.096529 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-proxy-ca-bundles\") pod \"controller-manager-698c8bc94d-lz9st\" (UID: \"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1\") " pod="openshift-controller-manager/controller-manager-698c8bc94d-lz9st"
Feb 19 15:13:01 crc kubenswrapper[4810]: I0219 15:13:01.096557 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfgv7\" (UniqueName: \"kubernetes.io/projected/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-kube-api-access-bfgv7\") pod \"controller-manager-698c8bc94d-lz9st\" (UID: \"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1\") " pod="openshift-controller-manager/controller-manager-698c8bc94d-lz9st"
Feb 19 15:13:01 crc kubenswrapper[4810]: I0219 15:13:01.098353 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-config\") pod \"controller-manager-698c8bc94d-lz9st\" (UID: \"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1\") " pod="openshift-controller-manager/controller-manager-698c8bc94d-lz9st"
Feb 19 15:13:01 crc kubenswrapper[4810]: I0219 15:13:01.099208 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-proxy-ca-bundles\") pod \"controller-manager-698c8bc94d-lz9st\" (UID: \"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1\") " pod="openshift-controller-manager/controller-manager-698c8bc94d-lz9st"
Feb 19 15:13:01 crc kubenswrapper[4810]: I0219 15:13:01.099613 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-client-ca\") pod \"controller-manager-698c8bc94d-lz9st\" (UID: \"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1\") " pod="openshift-controller-manager/controller-manager-698c8bc94d-lz9st"
Feb 19 15:13:01 crc kubenswrapper[4810]: I0219 15:13:01.103561 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-serving-cert\") pod \"controller-manager-698c8bc94d-lz9st\" (UID: \"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1\") " pod="openshift-controller-manager/controller-manager-698c8bc94d-lz9st"
Feb 19 15:13:01 crc kubenswrapper[4810]: I0219 15:13:01.112260 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfgv7\" (UniqueName: \"kubernetes.io/projected/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-kube-api-access-bfgv7\") pod \"controller-manager-698c8bc94d-lz9st\" (UID: \"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1\") " pod="openshift-controller-manager/controller-manager-698c8bc94d-lz9st"
Feb 19 15:13:01 crc kubenswrapper[4810]: I0219 15:13:01.150395 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-698c8bc94d-lz9st"
Feb 19 15:13:02 crc kubenswrapper[4810]: I0219 15:13:02.844157 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-698c8bc94d-lz9st"]
Feb 19 15:13:03 crc kubenswrapper[4810]: I0219 15:13:03.325286 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-698c8bc94d-lz9st" event={"ID":"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1","Type":"ContainerStarted","Data":"5a6aca7188bb15f6a1934c0cbf78cc5b48de2dae39ca1f408904e66c1d41c473"}
Feb 19 15:13:04 crc kubenswrapper[4810]: I0219 15:13:04.021989 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gp8sg"
Feb 19 15:13:04 crc kubenswrapper[4810]: I0219 15:13:04.069043 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gp8sg"
Feb 19 15:13:04 crc kubenswrapper[4810]: I0219 15:13:04.333730 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-698c8bc94d-lz9st" event={"ID":"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1","Type":"ContainerStarted","Data":"26464e252c8755e254a428e223c59bad6858b683b17cb2dde5fe0926737087fb"}
Feb 19 15:13:04 crc kubenswrapper[4810]: I0219 15:13:04.334287 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-698c8bc94d-lz9st"
Feb 19 15:13:04 crc kubenswrapper[4810]: I0219 15:13:04.337573 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfqg8" event={"ID":"5c654206-f2d0-4b40-9df0-577dbf27e5e4","Type":"ContainerStarted","Data":"32a771c22d7bf5ec3610bad3dd5fe81ac2ad5a407f7d5f0293d46b10d47c380d"}
Feb 19 15:13:04 crc kubenswrapper[4810]: I0219 15:13:04.340946 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-698c8bc94d-lz9st"
Feb 19 15:13:04 crc kubenswrapper[4810]: I0219 15:13:04.363530 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-698c8bc94d-lz9st" podStartSLOduration=8.363459545 podStartE2EDuration="8.363459545s" podCreationTimestamp="2026-02-19 15:12:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:13:04.358090917 +0000 UTC m=+213.840121061" watchObservedRunningTime="2026-02-19 15:13:04.363459545 +0000 UTC m=+213.845489719"
Feb 19 15:13:04 crc kubenswrapper[4810]: I0219 15:13:04.404042 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v87ss"
Feb 19 15:13:04 crc kubenswrapper[4810]: I0219 15:13:04.407366 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dfqg8" podStartSLOduration=4.494835201 podStartE2EDuration="1m2.407345563s" podCreationTimestamp="2026-02-19 15:12:02 +0000 UTC" firstStartedPulling="2026-02-19 15:12:04.471998743 +0000 UTC m=+153.954028867" lastFinishedPulling="2026-02-19 15:13:02.384509065 +0000 UTC m=+211.866539229" observedRunningTime="2026-02-19 15:13:04.405369582 +0000 UTC m=+213.887399706" watchObservedRunningTime="2026-02-19 15:13:04.407345563 +0000 UTC m=+213.889375697"
Feb 19 15:13:04 crc kubenswrapper[4810]: I0219 15:13:04.450469 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v87ss"
Feb 19 15:13:06 crc kubenswrapper[4810]: I0219 15:13:06.872271 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v87ss"]
Feb 19 15:13:06 crc kubenswrapper[4810]: I0219 15:13:06.872905 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v87ss" podUID="ee54de34-1c90-401d-8102-2cc1e4116661" containerName="registry-server" containerID="cri-o://e76c00c840c072923d95803a179729bb3992d5d640b8cfc79a009aafae36db32" gracePeriod=2
Feb 19 15:13:07 crc kubenswrapper[4810]: I0219 15:13:07.385335 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptbh9" event={"ID":"7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53","Type":"ContainerStarted","Data":"3c4b22d47de57aea7d57602a01f5293afd50a4c1301de10fd9d0be527b9184bd"}
Feb 19 15:13:08 crc kubenswrapper[4810]: I0219 15:13:08.408882 4810 generic.go:334] "Generic (PLEG): container finished" podID="ee54de34-1c90-401d-8102-2cc1e4116661" containerID="e76c00c840c072923d95803a179729bb3992d5d640b8cfc79a009aafae36db32" exitCode=0
Feb 19 15:13:08 crc kubenswrapper[4810]: I0219 15:13:08.408957 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v87ss" event={"ID":"ee54de34-1c90-401d-8102-2cc1e4116661","Type":"ContainerDied","Data":"e76c00c840c072923d95803a179729bb3992d5d640b8cfc79a009aafae36db32"}
Feb 19 15:13:08 crc kubenswrapper[4810]: I0219 15:13:08.430396 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-marketplace/redhat-marketplace-ptbh9" podStartSLOduration=4.887551294 podStartE2EDuration="1m6.430307787s" podCreationTimestamp="2026-02-19 15:12:02 +0000 UTC" firstStartedPulling="2026-02-19 15:12:04.406403768 +0000 UTC m=+153.888433882" lastFinishedPulling="2026-02-19 15:13:05.949160241 +0000 UTC m=+215.431190375" observedRunningTime="2026-02-19 15:13:08.428880241 +0000 UTC m=+217.910910375" watchObservedRunningTime="2026-02-19 15:13:08.430307787 +0000 UTC m=+217.912337911" Feb 19 15:13:08 crc kubenswrapper[4810]: I0219 15:13:08.521273 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v87ss" Feb 19 15:13:08 crc kubenswrapper[4810]: I0219 15:13:08.704592 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbjkj\" (UniqueName: \"kubernetes.io/projected/ee54de34-1c90-401d-8102-2cc1e4116661-kube-api-access-fbjkj\") pod \"ee54de34-1c90-401d-8102-2cc1e4116661\" (UID: \"ee54de34-1c90-401d-8102-2cc1e4116661\") " Feb 19 15:13:08 crc kubenswrapper[4810]: I0219 15:13:08.705301 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee54de34-1c90-401d-8102-2cc1e4116661-utilities\") pod \"ee54de34-1c90-401d-8102-2cc1e4116661\" (UID: \"ee54de34-1c90-401d-8102-2cc1e4116661\") " Feb 19 15:13:08 crc kubenswrapper[4810]: I0219 15:13:08.705410 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee54de34-1c90-401d-8102-2cc1e4116661-catalog-content\") pod \"ee54de34-1c90-401d-8102-2cc1e4116661\" (UID: \"ee54de34-1c90-401d-8102-2cc1e4116661\") " Feb 19 15:13:08 crc kubenswrapper[4810]: I0219 15:13:08.706145 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee54de34-1c90-401d-8102-2cc1e4116661-utilities" (OuterVolumeSpecName: "utilities") pod "ee54de34-1c90-401d-8102-2cc1e4116661" (UID: "ee54de34-1c90-401d-8102-2cc1e4116661"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:13:08 crc kubenswrapper[4810]: I0219 15:13:08.713109 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee54de34-1c90-401d-8102-2cc1e4116661-kube-api-access-fbjkj" (OuterVolumeSpecName: "kube-api-access-fbjkj") pod "ee54de34-1c90-401d-8102-2cc1e4116661" (UID: "ee54de34-1c90-401d-8102-2cc1e4116661"). InnerVolumeSpecName "kube-api-access-fbjkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:13:08 crc kubenswrapper[4810]: I0219 15:13:08.808225 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbjkj\" (UniqueName: \"kubernetes.io/projected/ee54de34-1c90-401d-8102-2cc1e4116661-kube-api-access-fbjkj\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:08 crc kubenswrapper[4810]: I0219 15:13:08.808276 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee54de34-1c90-401d-8102-2cc1e4116661-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:08 crc kubenswrapper[4810]: I0219 15:13:08.839380 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee54de34-1c90-401d-8102-2cc1e4116661-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee54de34-1c90-401d-8102-2cc1e4116661" (UID: "ee54de34-1c90-401d-8102-2cc1e4116661"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:13:08 crc kubenswrapper[4810]: I0219 15:13:08.909360 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee54de34-1c90-401d-8102-2cc1e4116661-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:09 crc kubenswrapper[4810]: I0219 15:13:09.435215 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v87ss" event={"ID":"ee54de34-1c90-401d-8102-2cc1e4116661","Type":"ContainerDied","Data":"e9e0d068483405f5413181ac3893890a639ed339e87615e73aeb8a60ceb19c52"} Feb 19 15:13:09 crc kubenswrapper[4810]: I0219 15:13:09.435289 4810 scope.go:117] "RemoveContainer" containerID="e76c00c840c072923d95803a179729bb3992d5d640b8cfc79a009aafae36db32" Feb 19 15:13:09 crc kubenswrapper[4810]: I0219 15:13:09.435443 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v87ss" Feb 19 15:13:09 crc kubenswrapper[4810]: I0219 15:13:09.474314 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v87ss"] Feb 19 15:13:09 crc kubenswrapper[4810]: I0219 15:13:09.480379 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v87ss"] Feb 19 15:13:09 crc kubenswrapper[4810]: I0219 15:13:09.547160 4810 scope.go:117] "RemoveContainer" containerID="d8d3ab6d086deece4cdefec9736b01ba8f86997b96e710debe90df076af13fd1" Feb 19 15:13:10 crc kubenswrapper[4810]: I0219 15:13:10.449516 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x8sn2" event={"ID":"cc8ce195-1fe1-4684-8172-e710b3552fb5","Type":"ContainerStarted","Data":"266b7a0eecc4ea263050674e9b0fa34134a650122976b06935949729514b8110"} Feb 19 15:13:10 crc kubenswrapper[4810]: I0219 15:13:10.475509 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x8sn2" podStartSLOduration=5.383213754 podStartE2EDuration="1m10.47547664s" podCreationTimestamp="2026-02-19 15:12:00 +0000 UTC" firstStartedPulling="2026-02-19 15:12:03.323521368 +0000 UTC m=+152.805551492" lastFinishedPulling="2026-02-19 15:13:08.415784254 +0000 UTC m=+217.897814378" observedRunningTime="2026-02-19 15:13:10.472823451 +0000 UTC m=+219.954853575" watchObservedRunningTime="2026-02-19 15:13:10.47547664 +0000 UTC m=+219.957506764" Feb 19 15:13:10 crc kubenswrapper[4810]: I0219 15:13:10.644972 4810 scope.go:117] "RemoveContainer" containerID="14425fa062e8aa4022f788165f140217518c2e7e2510e6d081b098762154d7a3" Feb 19 15:13:11 crc kubenswrapper[4810]: I0219 15:13:11.354762 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x8sn2" Feb 19 15:13:11 crc kubenswrapper[4810]: I0219 15:13:11.354854 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x8sn2" Feb 19 15:13:11 crc kubenswrapper[4810]: I0219 15:13:11.450255 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee54de34-1c90-401d-8102-2cc1e4116661" path="/var/lib/kubelet/pods/ee54de34-1c90-401d-8102-2cc1e4116661/volumes" Feb 19 15:13:12 crc kubenswrapper[4810]: I0219 15:13:12.413544 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-x8sn2" 
podUID="cc8ce195-1fe1-4684-8172-e710b3552fb5" containerName="registry-server" probeResult="failure" output=< Feb 19 15:13:12 crc kubenswrapper[4810]: timeout: failed to connect service ":50051" within 1s Feb 19 15:13:12 crc kubenswrapper[4810]: > Feb 19 15:13:12 crc kubenswrapper[4810]: I0219 15:13:12.473540 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-blpmq" event={"ID":"4127fef2-ef2b-4cc4-967d-d52dac26f314","Type":"ContainerStarted","Data":"6cc469c93c48a9ab5d144e46acaa99a2b78d56e0e2fbdf938f36668cd8a62128"} Feb 19 15:13:12 crc kubenswrapper[4810]: I0219 15:13:12.477107 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5ks5" event={"ID":"9a3d6b1f-2011-4f7f-bea0-1d303007fe41","Type":"ContainerStarted","Data":"6d399395ed59b272e5ce80489e65062cb3d89e0840588f5165efb67b6a223708"} Feb 19 15:13:12 crc kubenswrapper[4810]: I0219 15:13:12.514765 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-blpmq" podStartSLOduration=3.8235967 podStartE2EDuration="1m12.514731001s" podCreationTimestamp="2026-02-19 15:12:00 +0000 UTC" firstStartedPulling="2026-02-19 15:12:03.341070351 +0000 UTC m=+152.823100465" lastFinishedPulling="2026-02-19 15:13:12.032204642 +0000 UTC m=+221.514234766" observedRunningTime="2026-02-19 15:13:12.511218451 +0000 UTC m=+221.993248585" watchObservedRunningTime="2026-02-19 15:13:12.514731001 +0000 UTC m=+221.996761125" Feb 19 15:13:12 crc kubenswrapper[4810]: I0219 15:13:12.536990 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d5ks5" podStartSLOduration=3.881865864 podStartE2EDuration="1m12.536966772s" podCreationTimestamp="2026-02-19 15:12:00 +0000 UTC" firstStartedPulling="2026-02-19 15:12:03.383647052 +0000 UTC m=+152.865677176" lastFinishedPulling="2026-02-19 15:13:12.03874794 +0000 UTC m=+221.520778084" observedRunningTime="2026-02-19 15:13:12.536441209 +0000 UTC m=+222.018471333" watchObservedRunningTime="2026-02-19 15:13:12.536966772 +0000 UTC m=+222.018996896" Feb 19 15:13:12 crc kubenswrapper[4810]: I0219 15:13:12.691158 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r74mv"] Feb 19 15:13:12 crc kubenswrapper[4810]: I0219 15:13:12.727722 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ptbh9" Feb 19 15:13:12 crc kubenswrapper[4810]: I0219 15:13:12.728202 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ptbh9" Feb 19 15:13:12 crc kubenswrapper[4810]: I0219 15:13:12.793122 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ptbh9" Feb 19 15:13:13 crc kubenswrapper[4810]: I0219 15:13:13.221483 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dfqg8" Feb 19 15:13:13 crc kubenswrapper[4810]: I0219 15:13:13.221545 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dfqg8" Feb 19 15:13:13 crc kubenswrapper[4810]: I0219 15:13:13.275529 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dfqg8" Feb 19 15:13:13 crc kubenswrapper[4810]: I0219 15:13:13.528591 4810 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ptbh9" Feb 19 15:13:13 crc kubenswrapper[4810]: I0219 15:13:13.533098 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dfqg8" Feb 19 15:13:16 crc kubenswrapper[4810]: I0219 15:13:16.878134 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dfqg8"] Feb 19 15:13:16 crc kubenswrapper[4810]: I0219 15:13:16.878632 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dfqg8" podUID="5c654206-f2d0-4b40-9df0-577dbf27e5e4" containerName="registry-server" containerID="cri-o://32a771c22d7bf5ec3610bad3dd5fe81ac2ad5a407f7d5f0293d46b10d47c380d" gracePeriod=2 Feb 19 15:13:16 crc kubenswrapper[4810]: I0219 15:13:16.913187 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-698c8bc94d-lz9st"] Feb 19 15:13:16 crc kubenswrapper[4810]: I0219 15:13:16.913549 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-698c8bc94d-lz9st" podUID="9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1" containerName="controller-manager" containerID="cri-o://26464e252c8755e254a428e223c59bad6858b683b17cb2dde5fe0926737087fb" gracePeriod=30 Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.004879 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw"] Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.005165 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw" podUID="9d474cf7-0ff3-43d0-88e8-19100a658851" containerName="route-controller-manager" containerID="cri-o://a2b82759b901660a79c47e74ce7b041d5164f2bc3f90bc7fb84fbbfb09e3a3e7" gracePeriod=30 Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.512434 4810 generic.go:334] "Generic (PLEG): container finished" podID="9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1" containerID="26464e252c8755e254a428e223c59bad6858b683b17cb2dde5fe0926737087fb" exitCode=0 Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.512534 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-698c8bc94d-lz9st" event={"ID":"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1","Type":"ContainerDied","Data":"26464e252c8755e254a428e223c59bad6858b683b17cb2dde5fe0926737087fb"} Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.515655 4810 generic.go:334] "Generic (PLEG): container finished" podID="9d474cf7-0ff3-43d0-88e8-19100a658851" containerID="a2b82759b901660a79c47e74ce7b041d5164f2bc3f90bc7fb84fbbfb09e3a3e7" exitCode=0 Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.515780 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw" event={"ID":"9d474cf7-0ff3-43d0-88e8-19100a658851","Type":"ContainerDied","Data":"a2b82759b901660a79c47e74ce7b041d5164f2bc3f90bc7fb84fbbfb09e3a3e7"} Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.515828 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw" 
event={"ID":"9d474cf7-0ff3-43d0-88e8-19100a658851","Type":"ContainerDied","Data":"61cb0e99accc7f3521da0dd967157522b71ffbbbce8920140bb606b51134702b"} Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.515851 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61cb0e99accc7f3521da0dd967157522b71ffbbbce8920140bb606b51134702b" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.518597 4810 generic.go:334] "Generic (PLEG): container finished" podID="5c654206-f2d0-4b40-9df0-577dbf27e5e4" containerID="32a771c22d7bf5ec3610bad3dd5fe81ac2ad5a407f7d5f0293d46b10d47c380d" exitCode=0 Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.518648 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfqg8" event={"ID":"5c654206-f2d0-4b40-9df0-577dbf27e5e4","Type":"ContainerDied","Data":"32a771c22d7bf5ec3610bad3dd5fe81ac2ad5a407f7d5f0293d46b10d47c380d"} Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.518678 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfqg8" event={"ID":"5c654206-f2d0-4b40-9df0-577dbf27e5e4","Type":"ContainerDied","Data":"673ebc55d1a6b447cd1eff3908534a0236e7f9bed8a79855c901abb60e30a35e"} Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.518689 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="673ebc55d1a6b447cd1eff3908534a0236e7f9bed8a79855c901abb60e30a35e" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.548984 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dfqg8" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.557591 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.562395 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4nld\" (UniqueName: \"kubernetes.io/projected/5c654206-f2d0-4b40-9df0-577dbf27e5e4-kube-api-access-r4nld\") pod \"5c654206-f2d0-4b40-9df0-577dbf27e5e4\" (UID: \"5c654206-f2d0-4b40-9df0-577dbf27e5e4\") " Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.562452 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c654206-f2d0-4b40-9df0-577dbf27e5e4-catalog-content\") pod \"5c654206-f2d0-4b40-9df0-577dbf27e5e4\" (UID: \"5c654206-f2d0-4b40-9df0-577dbf27e5e4\") " Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.562501 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c654206-f2d0-4b40-9df0-577dbf27e5e4-utilities\") pod \"5c654206-f2d0-4b40-9df0-577dbf27e5e4\" (UID: \"5c654206-f2d0-4b40-9df0-577dbf27e5e4\") " Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.562518 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxmxw\" (UniqueName: \"kubernetes.io/projected/9d474cf7-0ff3-43d0-88e8-19100a658851-kube-api-access-nxmxw\") pod \"9d474cf7-0ff3-43d0-88e8-19100a658851\" (UID: \"9d474cf7-0ff3-43d0-88e8-19100a658851\") " Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.562546 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d474cf7-0ff3-43d0-88e8-19100a658851-serving-cert\") pod \"9d474cf7-0ff3-43d0-88e8-19100a658851\" (UID: \"9d474cf7-0ff3-43d0-88e8-19100a658851\") " Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.562566 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d474cf7-0ff3-43d0-88e8-19100a658851-client-ca\") pod \"9d474cf7-0ff3-43d0-88e8-19100a658851\" (UID: \"9d474cf7-0ff3-43d0-88e8-19100a658851\") " Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.562586 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d474cf7-0ff3-43d0-88e8-19100a658851-config\") pod \"9d474cf7-0ff3-43d0-88e8-19100a658851\" (UID: \"9d474cf7-0ff3-43d0-88e8-19100a658851\") " Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.563471 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d474cf7-0ff3-43d0-88e8-19100a658851-client-ca" (OuterVolumeSpecName: "client-ca") pod "9d474cf7-0ff3-43d0-88e8-19100a658851" (UID: "9d474cf7-0ff3-43d0-88e8-19100a658851"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.563499 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d474cf7-0ff3-43d0-88e8-19100a658851-config" (OuterVolumeSpecName: "config") pod "9d474cf7-0ff3-43d0-88e8-19100a658851" (UID: "9d474cf7-0ff3-43d0-88e8-19100a658851"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.563500 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c654206-f2d0-4b40-9df0-577dbf27e5e4-utilities" (OuterVolumeSpecName: "utilities") pod "5c654206-f2d0-4b40-9df0-577dbf27e5e4" (UID: "5c654206-f2d0-4b40-9df0-577dbf27e5e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.571419 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c654206-f2d0-4b40-9df0-577dbf27e5e4-kube-api-access-r4nld" (OuterVolumeSpecName: "kube-api-access-r4nld") pod "5c654206-f2d0-4b40-9df0-577dbf27e5e4" (UID: "5c654206-f2d0-4b40-9df0-577dbf27e5e4"). InnerVolumeSpecName "kube-api-access-r4nld". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.575931 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d474cf7-0ff3-43d0-88e8-19100a658851-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d474cf7-0ff3-43d0-88e8-19100a658851" (UID: "9d474cf7-0ff3-43d0-88e8-19100a658851"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.594778 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d474cf7-0ff3-43d0-88e8-19100a658851-kube-api-access-nxmxw" (OuterVolumeSpecName: "kube-api-access-nxmxw") pod "9d474cf7-0ff3-43d0-88e8-19100a658851" (UID: "9d474cf7-0ff3-43d0-88e8-19100a658851"). InnerVolumeSpecName "kube-api-access-nxmxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.603954 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c654206-f2d0-4b40-9df0-577dbf27e5e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c654206-f2d0-4b40-9df0-577dbf27e5e4" (UID: "5c654206-f2d0-4b40-9df0-577dbf27e5e4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.662477 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-698c8bc94d-lz9st" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.663313 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c654206-f2d0-4b40-9df0-577dbf27e5e4-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.663347 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxmxw\" (UniqueName: \"kubernetes.io/projected/9d474cf7-0ff3-43d0-88e8-19100a658851-kube-api-access-nxmxw\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.663358 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d474cf7-0ff3-43d0-88e8-19100a658851-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.663366 4810 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d474cf7-0ff3-43d0-88e8-19100a658851-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.663374 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d474cf7-0ff3-43d0-88e8-19100a658851-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.663383 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4nld\" (UniqueName: \"kubernetes.io/projected/5c654206-f2d0-4b40-9df0-577dbf27e5e4-kube-api-access-r4nld\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.663390 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c654206-f2d0-4b40-9df0-577dbf27e5e4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.764797 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-client-ca\") pod \"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1\" (UID: \"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1\") " Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.764854 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfgv7\" (UniqueName: \"kubernetes.io/projected/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-kube-api-access-bfgv7\") pod \"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1\" (UID: \"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1\") " Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.764880 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-proxy-ca-bundles\") pod \"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1\" (UID: \"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1\") " Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.764926 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-serving-cert\") pod \"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1\" (UID: \"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1\") " Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.764959 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-config\") pod \"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1\" (UID: \"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1\") " Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.765672 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-client-ca" (OuterVolumeSpecName: "client-ca") pod "9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1" (UID: "9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.765765 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-config" (OuterVolumeSpecName: "config") pod "9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1" (UID: "9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.765874 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1" (UID: "9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.769797 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1" (UID: "9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.769830 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-kube-api-access-bfgv7" (OuterVolumeSpecName: "kube-api-access-bfgv7") pod "9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1" (UID: "9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1"). InnerVolumeSpecName "kube-api-access-bfgv7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.866433 4810 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.866497 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfgv7\" (UniqueName: \"kubernetes.io/projected/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-kube-api-access-bfgv7\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.866514 4810 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.866524 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:17 crc kubenswrapper[4810]: I0219 15:13:17.866535 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.528718 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-698c8bc94d-lz9st" event={"ID":"9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1","Type":"ContainerDied","Data":"5a6aca7188bb15f6a1934c0cbf78cc5b48de2dae39ca1f408904e66c1d41c473"} Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.528819 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-698c8bc94d-lz9st" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.528828 4810 scope.go:117] "RemoveContainer" containerID="26464e252c8755e254a428e223c59bad6858b683b17cb2dde5fe0926737087fb" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.528751 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dfqg8" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.530469 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.575107 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dfqg8"] Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.586600 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dfqg8"] Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.594865 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-698c8bc94d-lz9st"] Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.601412 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-698c8bc94d-lz9st"] Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.607911 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw"] Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.613086 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fb4659b76-cmjpw"] Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.844010 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d79b8558f-lfs67"] Feb 19 15:13:18 crc kubenswrapper[4810]: E0219 15:13:18.846703 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c654206-f2d0-4b40-9df0-577dbf27e5e4" containerName="registry-server" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.846751 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c654206-f2d0-4b40-9df0-577dbf27e5e4" containerName="registry-server" Feb 19 15:13:18 crc kubenswrapper[4810]: E0219 15:13:18.846782 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee54de34-1c90-401d-8102-2cc1e4116661" containerName="extract-content" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.846798 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee54de34-1c90-401d-8102-2cc1e4116661" containerName="extract-content" Feb 19 15:13:18 crc kubenswrapper[4810]: E0219 15:13:18.846816 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1" containerName="controller-manager" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.846834 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1" containerName="controller-manager" Feb 19 15:13:18 crc kubenswrapper[4810]: E0219 15:13:18.846864 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c654206-f2d0-4b40-9df0-577dbf27e5e4" containerName="extract-content" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.846883 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c654206-f2d0-4b40-9df0-577dbf27e5e4" containerName="extract-content" Feb 19 15:13:18 crc kubenswrapper[4810]: E0219 15:13:18.846918 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee54de34-1c90-401d-8102-2cc1e4116661" containerName="extract-utilities" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.846934 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee54de34-1c90-401d-8102-2cc1e4116661" containerName="extract-utilities" Feb 19 15:13:18 crc kubenswrapper[4810]: E0219 15:13:18.846956 4810 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="9d474cf7-0ff3-43d0-88e8-19100a658851" containerName="route-controller-manager" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.846972 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d474cf7-0ff3-43d0-88e8-19100a658851" containerName="route-controller-manager" Feb 19 15:13:18 crc kubenswrapper[4810]: E0219 15:13:18.847006 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c654206-f2d0-4b40-9df0-577dbf27e5e4" containerName="extract-utilities" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.847021 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c654206-f2d0-4b40-9df0-577dbf27e5e4" containerName="extract-utilities" Feb 19 15:13:18 crc kubenswrapper[4810]: E0219 15:13:18.847039 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee54de34-1c90-401d-8102-2cc1e4116661" containerName="registry-server" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.847055 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee54de34-1c90-401d-8102-2cc1e4116661" containerName="registry-server" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.847280 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee54de34-1c90-401d-8102-2cc1e4116661" containerName="registry-server" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.847321 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d474cf7-0ff3-43d0-88e8-19100a658851" containerName="route-controller-manager" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.851237 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1" containerName="controller-manager" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.851265 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c654206-f2d0-4b40-9df0-577dbf27e5e4" containerName="registry-server" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.852181 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-d6d66b55f-52rdb"] Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.852494 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d79b8558f-lfs67" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.853491 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-d6d66b55f-52rdb" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.855159 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.856027 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d6d66b55f-52rdb"] Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.856203 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.856361 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.858246 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.858295 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.858590 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.858774 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.858910 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.859085 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.859612 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.861980 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.864042 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.868947 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d79b8558f-lfs67"] Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.869872 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.982259 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xw4x\" (UniqueName: \"kubernetes.io/projected/af9230b1-4c3c-46d1-9513-6b7b0234f7f9-kube-api-access-8xw4x\") pod \"route-controller-manager-5d79b8558f-lfs67\" (UID: \"af9230b1-4c3c-46d1-9513-6b7b0234f7f9\") " pod="openshift-route-controller-manager/route-controller-manager-5d79b8558f-lfs67" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.982364 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/581f2a9a-402b-45f5-a6e1-572fe1ccd601-proxy-ca-bundles\") pod \"controller-manager-d6d66b55f-52rdb\" (UID: \"581f2a9a-402b-45f5-a6e1-572fe1ccd601\") " pod="openshift-controller-manager/controller-manager-d6d66b55f-52rdb" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.982526 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/581f2a9a-402b-45f5-a6e1-572fe1ccd601-config\") pod \"controller-manager-d6d66b55f-52rdb\" (UID: \"581f2a9a-402b-45f5-a6e1-572fe1ccd601\") " pod="openshift-controller-manager/controller-manager-d6d66b55f-52rdb" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.982707 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/581f2a9a-402b-45f5-a6e1-572fe1ccd601-serving-cert\") pod \"controller-manager-d6d66b55f-52rdb\" (UID: \"581f2a9a-402b-45f5-a6e1-572fe1ccd601\") " pod="openshift-controller-manager/controller-manager-d6d66b55f-52rdb" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.982755 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llzqr\" (UniqueName: \"kubernetes.io/projected/581f2a9a-402b-45f5-a6e1-572fe1ccd601-kube-api-access-llzqr\") pod \"controller-manager-d6d66b55f-52rdb\" (UID: \"581f2a9a-402b-45f5-a6e1-572fe1ccd601\") " pod="openshift-controller-manager/controller-manager-d6d66b55f-52rdb" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.982791 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af9230b1-4c3c-46d1-9513-6b7b0234f7f9-config\") pod \"route-controller-manager-5d79b8558f-lfs67\" (UID: \"af9230b1-4c3c-46d1-9513-6b7b0234f7f9\") " pod="openshift-route-controller-manager/route-controller-manager-5d79b8558f-lfs67" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.982823 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af9230b1-4c3c-46d1-9513-6b7b0234f7f9-serving-cert\") pod \"route-controller-manager-5d79b8558f-lfs67\" (UID: \"af9230b1-4c3c-46d1-9513-6b7b0234f7f9\") " pod="openshift-route-controller-manager/route-controller-manager-5d79b8558f-lfs67" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.982853 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/581f2a9a-402b-45f5-a6e1-572fe1ccd601-client-ca\") pod \"controller-manager-d6d66b55f-52rdb\" (UID: \"581f2a9a-402b-45f5-a6e1-572fe1ccd601\") " pod="openshift-controller-manager/controller-manager-d6d66b55f-52rdb" Feb 19 15:13:18 crc kubenswrapper[4810]: I0219 15:13:18.982900 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af9230b1-4c3c-46d1-9513-6b7b0234f7f9-client-ca\") pod \"route-controller-manager-5d79b8558f-lfs67\" (UID: \"af9230b1-4c3c-46d1-9513-6b7b0234f7f9\") " pod="openshift-route-controller-manager/route-controller-manager-5d79b8558f-lfs67" Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.083742 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xw4x\" (UniqueName: 
\"kubernetes.io/projected/af9230b1-4c3c-46d1-9513-6b7b0234f7f9-kube-api-access-8xw4x\") pod \"route-controller-manager-5d79b8558f-lfs67\" (UID: \"af9230b1-4c3c-46d1-9513-6b7b0234f7f9\") " pod="openshift-route-controller-manager/route-controller-manager-5d79b8558f-lfs67" Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.083800 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/581f2a9a-402b-45f5-a6e1-572fe1ccd601-proxy-ca-bundles\") pod \"controller-manager-d6d66b55f-52rdb\" (UID: \"581f2a9a-402b-45f5-a6e1-572fe1ccd601\") " pod="openshift-controller-manager/controller-manager-d6d66b55f-52rdb" Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.083858 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/581f2a9a-402b-45f5-a6e1-572fe1ccd601-config\") pod \"controller-manager-d6d66b55f-52rdb\" (UID: \"581f2a9a-402b-45f5-a6e1-572fe1ccd601\") " pod="openshift-controller-manager/controller-manager-d6d66b55f-52rdb" Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.083918 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/581f2a9a-402b-45f5-a6e1-572fe1ccd601-serving-cert\") pod \"controller-manager-d6d66b55f-52rdb\" (UID: \"581f2a9a-402b-45f5-a6e1-572fe1ccd601\") " pod="openshift-controller-manager/controller-manager-d6d66b55f-52rdb" Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.083940 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af9230b1-4c3c-46d1-9513-6b7b0234f7f9-config\") pod \"route-controller-manager-5d79b8558f-lfs67\" (UID: \"af9230b1-4c3c-46d1-9513-6b7b0234f7f9\") " pod="openshift-route-controller-manager/route-controller-manager-5d79b8558f-lfs67" Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.083963 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llzqr\" (UniqueName: \"kubernetes.io/projected/581f2a9a-402b-45f5-a6e1-572fe1ccd601-kube-api-access-llzqr\") pod \"controller-manager-d6d66b55f-52rdb\" (UID: \"581f2a9a-402b-45f5-a6e1-572fe1ccd601\") " pod="openshift-controller-manager/controller-manager-d6d66b55f-52rdb" Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.083988 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af9230b1-4c3c-46d1-9513-6b7b0234f7f9-serving-cert\") pod \"route-controller-manager-5d79b8558f-lfs67\" (UID: \"af9230b1-4c3c-46d1-9513-6b7b0234f7f9\") " pod="openshift-route-controller-manager/route-controller-manager-5d79b8558f-lfs67" Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.084009 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/581f2a9a-402b-45f5-a6e1-572fe1ccd601-client-ca\") pod \"controller-manager-d6d66b55f-52rdb\" (UID: \"581f2a9a-402b-45f5-a6e1-572fe1ccd601\") " pod="openshift-controller-manager/controller-manager-d6d66b55f-52rdb" Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.084030 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af9230b1-4c3c-46d1-9513-6b7b0234f7f9-client-ca\") pod \"route-controller-manager-5d79b8558f-lfs67\" (UID: 
\"af9230b1-4c3c-46d1-9513-6b7b0234f7f9\") " pod="openshift-route-controller-manager/route-controller-manager-5d79b8558f-lfs67" Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.085424 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af9230b1-4c3c-46d1-9513-6b7b0234f7f9-client-ca\") pod \"route-controller-manager-5d79b8558f-lfs67\" (UID: \"af9230b1-4c3c-46d1-9513-6b7b0234f7f9\") " pod="openshift-route-controller-manager/route-controller-manager-5d79b8558f-lfs67" Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.085530 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af9230b1-4c3c-46d1-9513-6b7b0234f7f9-config\") pod \"route-controller-manager-5d79b8558f-lfs67\" (UID: \"af9230b1-4c3c-46d1-9513-6b7b0234f7f9\") " pod="openshift-route-controller-manager/route-controller-manager-5d79b8558f-lfs67" Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.085646 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/581f2a9a-402b-45f5-a6e1-572fe1ccd601-proxy-ca-bundles\") pod \"controller-manager-d6d66b55f-52rdb\" (UID: \"581f2a9a-402b-45f5-a6e1-572fe1ccd601\") " pod="openshift-controller-manager/controller-manager-d6d66b55f-52rdb" Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.086150 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/581f2a9a-402b-45f5-a6e1-572fe1ccd601-client-ca\") pod \"controller-manager-d6d66b55f-52rdb\" (UID: \"581f2a9a-402b-45f5-a6e1-572fe1ccd601\") " pod="openshift-controller-manager/controller-manager-d6d66b55f-52rdb" Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.086309 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/581f2a9a-402b-45f5-a6e1-572fe1ccd601-config\") pod \"controller-manager-d6d66b55f-52rdb\" (UID: \"581f2a9a-402b-45f5-a6e1-572fe1ccd601\") " pod="openshift-controller-manager/controller-manager-d6d66b55f-52rdb" Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.088123 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af9230b1-4c3c-46d1-9513-6b7b0234f7f9-serving-cert\") pod \"route-controller-manager-5d79b8558f-lfs67\" (UID: \"af9230b1-4c3c-46d1-9513-6b7b0234f7f9\") " pod="openshift-route-controller-manager/route-controller-manager-5d79b8558f-lfs67" Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.088341 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/581f2a9a-402b-45f5-a6e1-572fe1ccd601-serving-cert\") pod \"controller-manager-d6d66b55f-52rdb\" (UID: \"581f2a9a-402b-45f5-a6e1-572fe1ccd601\") " pod="openshift-controller-manager/controller-manager-d6d66b55f-52rdb" Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.107967 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xw4x\" (UniqueName: \"kubernetes.io/projected/af9230b1-4c3c-46d1-9513-6b7b0234f7f9-kube-api-access-8xw4x\") pod \"route-controller-manager-5d79b8558f-lfs67\" (UID: \"af9230b1-4c3c-46d1-9513-6b7b0234f7f9\") " pod="openshift-route-controller-manager/route-controller-manager-5d79b8558f-lfs67" Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.111033 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-llzqr\" (UniqueName: \"kubernetes.io/projected/581f2a9a-402b-45f5-a6e1-572fe1ccd601-kube-api-access-llzqr\") pod \"controller-manager-d6d66b55f-52rdb\" (UID: \"581f2a9a-402b-45f5-a6e1-572fe1ccd601\") " pod="openshift-controller-manager/controller-manager-d6d66b55f-52rdb" Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.202755 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d6d66b55f-52rdb" Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.224861 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d79b8558f-lfs67" Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.447364 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c654206-f2d0-4b40-9df0-577dbf27e5e4" path="/var/lib/kubelet/pods/5c654206-f2d0-4b40-9df0-577dbf27e5e4/volumes" Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.448731 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d474cf7-0ff3-43d0-88e8-19100a658851" path="/var/lib/kubelet/pods/9d474cf7-0ff3-43d0-88e8-19100a658851/volumes" Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.449554 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1" path="/var/lib/kubelet/pods/9fab1a62-cdc3-4a2c-ab2d-dbd8f1f097f1/volumes" Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.538148 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.538232 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.538344 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.539230 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651"} pod="openshift-machine-config-operator/machine-config-daemon-t499d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.539301 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" containerID="cri-o://ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651" gracePeriod=600 Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 15:13:19.720267 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d79b8558f-lfs67"] Feb 19 15:13:19 crc kubenswrapper[4810]: I0219 
15:13:19.724517 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d6d66b55f-52rdb"] Feb 19 15:13:19 crc kubenswrapper[4810]: W0219 15:13:19.736144 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf9230b1_4c3c_46d1_9513_6b7b0234f7f9.slice/crio-10e92dccf46416517ce6b2a1fb6279117989063ab1a528462ce3408ea940dd8f WatchSource:0}: Error finding container 10e92dccf46416517ce6b2a1fb6279117989063ab1a528462ce3408ea940dd8f: Status 404 returned error can't find the container with id 10e92dccf46416517ce6b2a1fb6279117989063ab1a528462ce3408ea940dd8f Feb 19 15:13:20 crc kubenswrapper[4810]: I0219 15:13:20.543952 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d6d66b55f-52rdb" event={"ID":"581f2a9a-402b-45f5-a6e1-572fe1ccd601","Type":"ContainerStarted","Data":"666d5b4bfd03ae8416fff90e79d71d92d336b908d952756de1f2eb7e95e48118"} Feb 19 15:13:20 crc kubenswrapper[4810]: I0219 15:13:20.545704 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d6d66b55f-52rdb" event={"ID":"581f2a9a-402b-45f5-a6e1-572fe1ccd601","Type":"ContainerStarted","Data":"482664dd86c2e2adfb359f5beb6bef16aa9a095fa74e70cf41969a4b378d1e2c"} Feb 19 15:13:20 crc kubenswrapper[4810]: I0219 15:13:20.546865 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d79b8558f-lfs67" event={"ID":"af9230b1-4c3c-46d1-9513-6b7b0234f7f9","Type":"ContainerStarted","Data":"c011eb13462d97403c9e0803b312a77eecf8646ceae2b042da0cb35dcbe87d08"} Feb 19 15:13:20 crc kubenswrapper[4810]: I0219 15:13:20.546913 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-d6d66b55f-52rdb" Feb 19 15:13:20 crc kubenswrapper[4810]: I0219 15:13:20.546933 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5d79b8558f-lfs67" Feb 19 15:13:20 crc kubenswrapper[4810]: I0219 15:13:20.546944 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d79b8558f-lfs67" event={"ID":"af9230b1-4c3c-46d1-9513-6b7b0234f7f9","Type":"ContainerStarted","Data":"10e92dccf46416517ce6b2a1fb6279117989063ab1a528462ce3408ea940dd8f"} Feb 19 15:13:20 crc kubenswrapper[4810]: I0219 15:13:20.553140 4810 generic.go:334] "Generic (PLEG): container finished" podID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerID="ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651" exitCode=0 Feb 19 15:13:20 crc kubenswrapper[4810]: I0219 15:13:20.553193 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerDied","Data":"ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651"} Feb 19 15:13:20 crc kubenswrapper[4810]: I0219 15:13:20.553240 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerStarted","Data":"4a50af848183862ac97eb1221e89b46a5711d55d54c9bd96026946aef1766658"} Feb 19 15:13:20 crc kubenswrapper[4810]: I0219 15:13:20.563277 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-d6d66b55f-52rdb" Feb 19 15:13:20 crc kubenswrapper[4810]: I0219 15:13:20.569762 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5d79b8558f-lfs67" Feb 19 15:13:20 crc kubenswrapper[4810]: I0219 15:13:20.571152 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-d6d66b55f-52rdb" podStartSLOduration=4.571125458 podStartE2EDuration="4.571125458s" podCreationTimestamp="2026-02-19 15:13:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:13:20.565922825 +0000 UTC m=+230.047952969" watchObservedRunningTime="2026-02-19 15:13:20.571125458 +0000 UTC m=+230.053155582" Feb 19 15:13:20 crc kubenswrapper[4810]: I0219 15:13:20.609064 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5d79b8558f-lfs67" podStartSLOduration=3.609037712 podStartE2EDuration="3.609037712s" podCreationTimestamp="2026-02-19 15:13:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:13:20.607621056 +0000 UTC m=+230.089651180" watchObservedRunningTime="2026-02-19 15:13:20.609037712 +0000 UTC m=+230.091067836" Feb 19 15:13:21 crc kubenswrapper[4810]: I0219 15:13:21.020348 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d5ks5" Feb 19 15:13:21 crc kubenswrapper[4810]: I0219 15:13:21.020577 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d5ks5" Feb 19 15:13:21 crc kubenswrapper[4810]: I0219 15:13:21.059094 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d5ks5" Feb 19 15:13:21 crc kubenswrapper[4810]: I0219 15:13:21.136214 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-blpmq" Feb 19 15:13:21 crc kubenswrapper[4810]: I0219 15:13:21.136272 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-blpmq" Feb 19 15:13:21 crc kubenswrapper[4810]: I0219 15:13:21.176583 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-blpmq" Feb 19 15:13:21 crc kubenswrapper[4810]: I0219 15:13:21.412355 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x8sn2" Feb 19 15:13:21 crc kubenswrapper[4810]: I0219 15:13:21.458407 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x8sn2" Feb 19 15:13:21 crc kubenswrapper[4810]: I0219 15:13:21.613682 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d5ks5" Feb 19 15:13:21 crc kubenswrapper[4810]: I0219 15:13:21.631192 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-blpmq" Feb 19 15:13:23 crc kubenswrapper[4810]: I0219 15:13:23.078574 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-x8sn2"] Feb 19 15:13:23 crc kubenswrapper[4810]: I0219 15:13:23.079015 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x8sn2" podUID="cc8ce195-1fe1-4684-8172-e710b3552fb5" containerName="registry-server" containerID="cri-o://266b7a0eecc4ea263050674e9b0fa34134a650122976b06935949729514b8110" gracePeriod=2 Feb 19 15:13:23 crc kubenswrapper[4810]: I0219 15:13:23.535460 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x8sn2" Feb 19 15:13:23 crc kubenswrapper[4810]: I0219 15:13:23.576606 4810 generic.go:334] "Generic (PLEG): container finished" podID="cc8ce195-1fe1-4684-8172-e710b3552fb5" containerID="266b7a0eecc4ea263050674e9b0fa34134a650122976b06935949729514b8110" exitCode=0 Feb 19 15:13:23 crc kubenswrapper[4810]: I0219 15:13:23.576729 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x8sn2" Feb 19 15:13:23 crc kubenswrapper[4810]: I0219 15:13:23.576722 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x8sn2" event={"ID":"cc8ce195-1fe1-4684-8172-e710b3552fb5","Type":"ContainerDied","Data":"266b7a0eecc4ea263050674e9b0fa34134a650122976b06935949729514b8110"} Feb 19 15:13:23 crc kubenswrapper[4810]: I0219 15:13:23.577196 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x8sn2" event={"ID":"cc8ce195-1fe1-4684-8172-e710b3552fb5","Type":"ContainerDied","Data":"b765fd7f2fdc89df238064ee08672584bc5ff6e6cd24e9a1fe430dad87064297"} Feb 19 15:13:23 crc kubenswrapper[4810]: I0219 15:13:23.577225 4810 scope.go:117] "RemoveContainer" containerID="266b7a0eecc4ea263050674e9b0fa34134a650122976b06935949729514b8110" Feb 19 15:13:23 crc kubenswrapper[4810]: I0219 15:13:23.598675 4810 scope.go:117] "RemoveContainer" containerID="9ab517ec22370626097c33f80a892ee93279a0d966337802ec729d7271a4e233" Feb 19 15:13:23 crc kubenswrapper[4810]: I0219 15:13:23.617719 4810 scope.go:117] "RemoveContainer" containerID="5c922635ea487f2a0968c5eace1227e4dcace0fce70a7ca4e66511c946968284" Feb 19 15:13:23 crc kubenswrapper[4810]: I0219 15:13:23.640526 4810 scope.go:117] "RemoveContainer" containerID="266b7a0eecc4ea263050674e9b0fa34134a650122976b06935949729514b8110" Feb 19 15:13:23 crc kubenswrapper[4810]: E0219 15:13:23.641166 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"266b7a0eecc4ea263050674e9b0fa34134a650122976b06935949729514b8110\": container with ID starting with 266b7a0eecc4ea263050674e9b0fa34134a650122976b06935949729514b8110 not found: ID does not exist" containerID="266b7a0eecc4ea263050674e9b0fa34134a650122976b06935949729514b8110" Feb 19 15:13:23 crc kubenswrapper[4810]: I0219 15:13:23.641202 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"266b7a0eecc4ea263050674e9b0fa34134a650122976b06935949729514b8110"} err="failed to get container status \"266b7a0eecc4ea263050674e9b0fa34134a650122976b06935949729514b8110\": rpc error: code = NotFound desc = could not find container \"266b7a0eecc4ea263050674e9b0fa34134a650122976b06935949729514b8110\": container with ID starting with 266b7a0eecc4ea263050674e9b0fa34134a650122976b06935949729514b8110 not found: ID does not exist" Feb 19 15:13:23 crc kubenswrapper[4810]: I0219 
15:13:23.641226 4810 scope.go:117] "RemoveContainer" containerID="9ab517ec22370626097c33f80a892ee93279a0d966337802ec729d7271a4e233" Feb 19 15:13:23 crc kubenswrapper[4810]: E0219 15:13:23.641719 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ab517ec22370626097c33f80a892ee93279a0d966337802ec729d7271a4e233\": container with ID starting with 9ab517ec22370626097c33f80a892ee93279a0d966337802ec729d7271a4e233 not found: ID does not exist" containerID="9ab517ec22370626097c33f80a892ee93279a0d966337802ec729d7271a4e233" Feb 19 15:13:23 crc kubenswrapper[4810]: I0219 15:13:23.641746 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ab517ec22370626097c33f80a892ee93279a0d966337802ec729d7271a4e233"} err="failed to get container status \"9ab517ec22370626097c33f80a892ee93279a0d966337802ec729d7271a4e233\": rpc error: code = NotFound desc = could not find container \"9ab517ec22370626097c33f80a892ee93279a0d966337802ec729d7271a4e233\": container with ID starting with 9ab517ec22370626097c33f80a892ee93279a0d966337802ec729d7271a4e233 not found: ID does not exist" Feb 19 15:13:23 crc kubenswrapper[4810]: I0219 15:13:23.641761 4810 scope.go:117] "RemoveContainer" containerID="5c922635ea487f2a0968c5eace1227e4dcace0fce70a7ca4e66511c946968284" Feb 19 15:13:23 crc kubenswrapper[4810]: E0219 15:13:23.642104 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c922635ea487f2a0968c5eace1227e4dcace0fce70a7ca4e66511c946968284\": container with ID starting with 5c922635ea487f2a0968c5eace1227e4dcace0fce70a7ca4e66511c946968284 not found: ID does not exist" containerID="5c922635ea487f2a0968c5eace1227e4dcace0fce70a7ca4e66511c946968284" Feb 19 15:13:23 crc kubenswrapper[4810]: I0219 15:13:23.642124 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c922635ea487f2a0968c5eace1227e4dcace0fce70a7ca4e66511c946968284"} err="failed to get container status \"5c922635ea487f2a0968c5eace1227e4dcace0fce70a7ca4e66511c946968284\": rpc error: code = NotFound desc = could not find container \"5c922635ea487f2a0968c5eace1227e4dcace0fce70a7ca4e66511c946968284\": container with ID starting with 5c922635ea487f2a0968c5eace1227e4dcace0fce70a7ca4e66511c946968284 not found: ID does not exist" Feb 19 15:13:23 crc kubenswrapper[4810]: I0219 15:13:23.651835 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc8ce195-1fe1-4684-8172-e710b3552fb5-catalog-content\") pod \"cc8ce195-1fe1-4684-8172-e710b3552fb5\" (UID: \"cc8ce195-1fe1-4684-8172-e710b3552fb5\") " Feb 19 15:13:23 crc kubenswrapper[4810]: I0219 15:13:23.652090 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc8ce195-1fe1-4684-8172-e710b3552fb5-utilities\") pod \"cc8ce195-1fe1-4684-8172-e710b3552fb5\" (UID: \"cc8ce195-1fe1-4684-8172-e710b3552fb5\") " Feb 19 15:13:23 crc kubenswrapper[4810]: I0219 15:13:23.652161 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xzxn\" (UniqueName: \"kubernetes.io/projected/cc8ce195-1fe1-4684-8172-e710b3552fb5-kube-api-access-4xzxn\") pod \"cc8ce195-1fe1-4684-8172-e710b3552fb5\" (UID: \"cc8ce195-1fe1-4684-8172-e710b3552fb5\") " Feb 19 15:13:23 crc kubenswrapper[4810]: I0219 
15:13:23.653334 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc8ce195-1fe1-4684-8172-e710b3552fb5-utilities" (OuterVolumeSpecName: "utilities") pod "cc8ce195-1fe1-4684-8172-e710b3552fb5" (UID: "cc8ce195-1fe1-4684-8172-e710b3552fb5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:13:23 crc kubenswrapper[4810]: I0219 15:13:23.661292 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc8ce195-1fe1-4684-8172-e710b3552fb5-kube-api-access-4xzxn" (OuterVolumeSpecName: "kube-api-access-4xzxn") pod "cc8ce195-1fe1-4684-8172-e710b3552fb5" (UID: "cc8ce195-1fe1-4684-8172-e710b3552fb5"). InnerVolumeSpecName "kube-api-access-4xzxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:13:23 crc kubenswrapper[4810]: I0219 15:13:23.719553 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc8ce195-1fe1-4684-8172-e710b3552fb5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc8ce195-1fe1-4684-8172-e710b3552fb5" (UID: "cc8ce195-1fe1-4684-8172-e710b3552fb5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:13:23 crc kubenswrapper[4810]: I0219 15:13:23.754486 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xzxn\" (UniqueName: \"kubernetes.io/projected/cc8ce195-1fe1-4684-8172-e710b3552fb5-kube-api-access-4xzxn\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:23 crc kubenswrapper[4810]: I0219 15:13:23.754557 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc8ce195-1fe1-4684-8172-e710b3552fb5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:23 crc kubenswrapper[4810]: I0219 15:13:23.754590 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc8ce195-1fe1-4684-8172-e710b3552fb5-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:23 crc kubenswrapper[4810]: I0219 15:13:23.913037 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x8sn2"] Feb 19 15:13:23 crc kubenswrapper[4810]: I0219 15:13:23.916828 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x8sn2"] Feb 19 15:13:25 crc kubenswrapper[4810]: I0219 15:13:25.275612 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-blpmq"] Feb 19 15:13:25 crc kubenswrapper[4810]: I0219 15:13:25.276151 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-blpmq" podUID="4127fef2-ef2b-4cc4-967d-d52dac26f314" containerName="registry-server" containerID="cri-o://6cc469c93c48a9ab5d144e46acaa99a2b78d56e0e2fbdf938f36668cd8a62128" gracePeriod=2 Feb 19 15:13:25 crc kubenswrapper[4810]: I0219 15:13:25.450712 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc8ce195-1fe1-4684-8172-e710b3552fb5" path="/var/lib/kubelet/pods/cc8ce195-1fe1-4684-8172-e710b3552fb5/volumes" Feb 19 15:13:26 crc kubenswrapper[4810]: I0219 15:13:26.603808 4810 generic.go:334] "Generic (PLEG): container finished" podID="4127fef2-ef2b-4cc4-967d-d52dac26f314" containerID="6cc469c93c48a9ab5d144e46acaa99a2b78d56e0e2fbdf938f36668cd8a62128" exitCode=0 Feb 19 15:13:26 crc kubenswrapper[4810]: I0219 
15:13:26.603861 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-blpmq" event={"ID":"4127fef2-ef2b-4cc4-967d-d52dac26f314","Type":"ContainerDied","Data":"6cc469c93c48a9ab5d144e46acaa99a2b78d56e0e2fbdf938f36668cd8a62128"} Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.028225 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-blpmq" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.189098 4810 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 19 15:13:27 crc kubenswrapper[4810]: E0219 15:13:27.189559 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4127fef2-ef2b-4cc4-967d-d52dac26f314" containerName="extract-utilities" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.189595 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="4127fef2-ef2b-4cc4-967d-d52dac26f314" containerName="extract-utilities" Feb 19 15:13:27 crc kubenswrapper[4810]: E0219 15:13:27.189622 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc8ce195-1fe1-4684-8172-e710b3552fb5" containerName="registry-server" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.189637 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc8ce195-1fe1-4684-8172-e710b3552fb5" containerName="registry-server" Feb 19 15:13:27 crc kubenswrapper[4810]: E0219 15:13:27.189663 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc8ce195-1fe1-4684-8172-e710b3552fb5" containerName="extract-utilities" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.189677 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc8ce195-1fe1-4684-8172-e710b3552fb5" containerName="extract-utilities" Feb 19 15:13:27 crc kubenswrapper[4810]: E0219 15:13:27.189698 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4127fef2-ef2b-4cc4-967d-d52dac26f314" containerName="extract-content" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.189712 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="4127fef2-ef2b-4cc4-967d-d52dac26f314" containerName="extract-content" Feb 19 15:13:27 crc kubenswrapper[4810]: E0219 15:13:27.189735 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc8ce195-1fe1-4684-8172-e710b3552fb5" containerName="extract-content" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.189747 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc8ce195-1fe1-4684-8172-e710b3552fb5" containerName="extract-content" Feb 19 15:13:27 crc kubenswrapper[4810]: E0219 15:13:27.189763 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4127fef2-ef2b-4cc4-967d-d52dac26f314" containerName="registry-server" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.189775 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="4127fef2-ef2b-4cc4-967d-d52dac26f314" containerName="registry-server" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.190131 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="4127fef2-ef2b-4cc4-967d-d52dac26f314" containerName="registry-server" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.190161 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc8ce195-1fe1-4684-8172-e710b3552fb5" containerName="registry-server" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.190748 4810 kubelet.go:2431] "SyncLoop REMOVE" 
source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.190947 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.191223 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb" gracePeriod=15 Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.191257 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef" gracePeriod=15 Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.191297 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b" gracePeriod=15 Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.191770 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://ef740fa90b279363e303fb824188d0a26097b06aa8eaa488c0c800a8caa8f10e" gracePeriod=15 Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.192719 4810 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 15:13:27 crc kubenswrapper[4810]: E0219 15:13:27.192925 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.192939 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 19 15:13:27 crc kubenswrapper[4810]: E0219 15:13:27.192959 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.192969 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 15:13:27 crc kubenswrapper[4810]: E0219 15:13:27.192980 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.192989 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 19 15:13:27 crc kubenswrapper[4810]: E0219 15:13:27.193006 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.193015 4810 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 19 15:13:27 crc kubenswrapper[4810]: E0219 15:13:27.193026 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.193035 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 19 15:13:27 crc kubenswrapper[4810]: E0219 15:13:27.193049 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.193057 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 15:13:27 crc kubenswrapper[4810]: E0219 15:13:27.193070 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.193081 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.193233 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.193245 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.193255 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.193267 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.193283 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.193294 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.193306 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 15:13:27 crc kubenswrapper[4810]: E0219 15:13:27.193493 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.193506 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.191405 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e" 
gracePeriod=15 Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.202096 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4127fef2-ef2b-4cc4-967d-d52dac26f314-catalog-content\") pod \"4127fef2-ef2b-4cc4-967d-d52dac26f314\" (UID: \"4127fef2-ef2b-4cc4-967d-d52dac26f314\") " Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.202292 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4127fef2-ef2b-4cc4-967d-d52dac26f314-utilities\") pod \"4127fef2-ef2b-4cc4-967d-d52dac26f314\" (UID: \"4127fef2-ef2b-4cc4-967d-d52dac26f314\") " Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.202417 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thvr6\" (UniqueName: \"kubernetes.io/projected/4127fef2-ef2b-4cc4-967d-d52dac26f314-kube-api-access-thvr6\") pod \"4127fef2-ef2b-4cc4-967d-d52dac26f314\" (UID: \"4127fef2-ef2b-4cc4-967d-d52dac26f314\") " Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.209223 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4127fef2-ef2b-4cc4-967d-d52dac26f314-utilities" (OuterVolumeSpecName: "utilities") pod "4127fef2-ef2b-4cc4-967d-d52dac26f314" (UID: "4127fef2-ef2b-4cc4-967d-d52dac26f314"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.249556 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4127fef2-ef2b-4cc4-967d-d52dac26f314-kube-api-access-thvr6" (OuterVolumeSpecName: "kube-api-access-thvr6") pod "4127fef2-ef2b-4cc4-967d-d52dac26f314" (UID: "4127fef2-ef2b-4cc4-967d-d52dac26f314"). InnerVolumeSpecName "kube-api-access-thvr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:13:27 crc kubenswrapper[4810]: E0219 15:13:27.291831 4810 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.162:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.301285 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4127fef2-ef2b-4cc4-967d-d52dac26f314-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4127fef2-ef2b-4cc4-967d-d52dac26f314" (UID: "4127fef2-ef2b-4cc4-967d-d52dac26f314"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.305052 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.305132 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.305173 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.305215 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.305256 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.305284 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.305362 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.305429 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.305511 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thvr6\" (UniqueName: 
\"kubernetes.io/projected/4127fef2-ef2b-4cc4-967d-d52dac26f314-kube-api-access-thvr6\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.305530 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4127fef2-ef2b-4cc4-967d-d52dac26f314-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.305544 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4127fef2-ef2b-4cc4-967d-d52dac26f314-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.406443 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.406537 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.406570 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.406608 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.406574 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.406665 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.406699 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.406737 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.406771 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.406777 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.406734 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.406835 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.406845 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.406719 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.406820 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.406889 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.592685 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.614681 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.617420 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.618994 4810 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ef740fa90b279363e303fb824188d0a26097b06aa8eaa488c0c800a8caa8f10e" exitCode=0 Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.619040 4810 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b" exitCode=0 Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.619057 4810 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef" exitCode=0 Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.619074 4810 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e" exitCode=2 Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.619162 4810 scope.go:117] "RemoveContainer" containerID="640d02443d7d20048695866b4db1d1fdd6469c712a804010860ca4c730f59c65" Feb 19 15:13:27 crc kubenswrapper[4810]: W0219 15:13:27.619164 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-c202c6fe7155bb03a73466e079dfd207307a44f3613fdc0586849f7cd5739121 WatchSource:0}: Error finding container c202c6fe7155bb03a73466e079dfd207307a44f3613fdc0586849f7cd5739121: Status 404 returned error can't find the container with id c202c6fe7155bb03a73466e079dfd207307a44f3613fdc0586849f7cd5739121 Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.622521 4810 generic.go:334] "Generic (PLEG): container finished" podID="3788d870-2889-4190-9675-e4da44f69a71" containerID="df933f73e8a108c6e7ee4ac92776be1551ecd1fa292247fa01da1f7c1d90d486" exitCode=0 Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.622616 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3788d870-2889-4190-9675-e4da44f69a71","Type":"ContainerDied","Data":"df933f73e8a108c6e7ee4ac92776be1551ecd1fa292247fa01da1f7c1d90d486"} Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.623410 4810 status_manager.go:851] "Failed to get status for pod" podUID="3788d870-2889-4190-9675-e4da44f69a71" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.626919 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-blpmq" 
event={"ID":"4127fef2-ef2b-4cc4-967d-d52dac26f314","Type":"ContainerDied","Data":"4442c4b83b02a776b63ed28285fde96beca3df86a20a033d8feb27311a4298e1"} Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.627041 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-blpmq" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.628457 4810 status_manager.go:851] "Failed to get status for pod" podUID="4127fef2-ef2b-4cc4-967d-d52dac26f314" pod="openshift-marketplace/certified-operators-blpmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-blpmq\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.628958 4810 status_manager.go:851] "Failed to get status for pod" podUID="3788d870-2889-4190-9675-e4da44f69a71" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:27 crc kubenswrapper[4810]: E0219 15:13:27.630201 4810 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.162:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895ae9aa35d2a3a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 15:13:27.629146682 +0000 UTC m=+237.111176826,LastTimestamp:2026-02-19 15:13:27.629146682 +0000 UTC m=+237.111176826,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.631869 4810 status_manager.go:851] "Failed to get status for pod" podUID="4127fef2-ef2b-4cc4-967d-d52dac26f314" pod="openshift-marketplace/certified-operators-blpmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-blpmq\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.632265 4810 status_manager.go:851] "Failed to get status for pod" podUID="3788d870-2889-4190-9675-e4da44f69a71" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.651286 4810 scope.go:117] "RemoveContainer" containerID="6cc469c93c48a9ab5d144e46acaa99a2b78d56e0e2fbdf938f36668cd8a62128" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.674841 4810 scope.go:117] "RemoveContainer" containerID="b6b451b278f70ee6f85bbefa846a37e02c57266a0a01e80c05241319be84555e" Feb 19 15:13:27 crc kubenswrapper[4810]: I0219 15:13:27.691974 4810 
scope.go:117] "RemoveContainer" containerID="01e828c8531030a2de34cc2f43410bf4a69a3af4e19f1adb2dbd098f1f78eca6" Feb 19 15:13:28 crc kubenswrapper[4810]: I0219 15:13:28.640026 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"a2157c6845332b692ab9ff67fd6e2eb4fc93b35babe6d2954b26b7839c430dbe"} Feb 19 15:13:28 crc kubenswrapper[4810]: I0219 15:13:28.640741 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c202c6fe7155bb03a73466e079dfd207307a44f3613fdc0586849f7cd5739121"} Feb 19 15:13:28 crc kubenswrapper[4810]: I0219 15:13:28.641845 4810 status_manager.go:851] "Failed to get status for pod" podUID="4127fef2-ef2b-4cc4-967d-d52dac26f314" pod="openshift-marketplace/certified-operators-blpmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-blpmq\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:28 crc kubenswrapper[4810]: I0219 15:13:28.642499 4810 status_manager.go:851] "Failed to get status for pod" podUID="3788d870-2889-4190-9675-e4da44f69a71" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:28 crc kubenswrapper[4810]: E0219 15:13:28.642847 4810 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.162:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 15:13:28 crc kubenswrapper[4810]: I0219 15:13:28.644199 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.031399 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.032779 4810 status_manager.go:851] "Failed to get status for pod" podUID="4127fef2-ef2b-4cc4-967d-d52dac26f314" pod="openshift-marketplace/certified-operators-blpmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-blpmq\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.033650 4810 status_manager.go:851] "Failed to get status for pod" podUID="3788d870-2889-4190-9675-e4da44f69a71" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.233546 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3788d870-2889-4190-9675-e4da44f69a71-kubelet-dir\") pod \"3788d870-2889-4190-9675-e4da44f69a71\" (UID: \"3788d870-2889-4190-9675-e4da44f69a71\") " Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.233648 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3788d870-2889-4190-9675-e4da44f69a71-var-lock\") pod \"3788d870-2889-4190-9675-e4da44f69a71\" (UID: \"3788d870-2889-4190-9675-e4da44f69a71\") " Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.233657 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3788d870-2889-4190-9675-e4da44f69a71-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3788d870-2889-4190-9675-e4da44f69a71" (UID: "3788d870-2889-4190-9675-e4da44f69a71"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.233705 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3788d870-2889-4190-9675-e4da44f69a71-kube-api-access\") pod \"3788d870-2889-4190-9675-e4da44f69a71\" (UID: \"3788d870-2889-4190-9675-e4da44f69a71\") " Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.233739 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3788d870-2889-4190-9675-e4da44f69a71-var-lock" (OuterVolumeSpecName: "var-lock") pod "3788d870-2889-4190-9675-e4da44f69a71" (UID: "3788d870-2889-4190-9675-e4da44f69a71"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.233983 4810 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3788d870-2889-4190-9675-e4da44f69a71-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.233998 4810 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3788d870-2889-4190-9675-e4da44f69a71-var-lock\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.239097 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3788d870-2889-4190-9675-e4da44f69a71-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3788d870-2889-4190-9675-e4da44f69a71" (UID: "3788d870-2889-4190-9675-e4da44f69a71"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.344281 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3788d870-2889-4190-9675-e4da44f69a71-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.645577 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.646750 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.647240 4810 status_manager.go:851] "Failed to get status for pod" podUID="4127fef2-ef2b-4cc4-967d-d52dac26f314" pod="openshift-marketplace/certified-operators-blpmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-blpmq\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.647610 4810 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.648590 4810 status_manager.go:851] "Failed to get status for pod" podUID="3788d870-2889-4190-9675-e4da44f69a71" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.657388 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.658745 4810 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb" exitCode=0 Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.658847 4810 scope.go:117] "RemoveContainer" 
containerID="ef740fa90b279363e303fb824188d0a26097b06aa8eaa488c0c800a8caa8f10e" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.658919 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.661672 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3788d870-2889-4190-9675-e4da44f69a71","Type":"ContainerDied","Data":"edeb9d3a80185cfad5f2d137a73e03675b14c1a6f2b489fec04d11b503684967"} Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.661713 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edeb9d3a80185cfad5f2d137a73e03675b14c1a6f2b489fec04d11b503684967" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.661803 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.667087 4810 status_manager.go:851] "Failed to get status for pod" podUID="4127fef2-ef2b-4cc4-967d-d52dac26f314" pod="openshift-marketplace/certified-operators-blpmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-blpmq\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.667676 4810 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.668198 4810 status_manager.go:851] "Failed to get status for pod" podUID="3788d870-2889-4190-9675-e4da44f69a71" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.685620 4810 scope.go:117] "RemoveContainer" containerID="77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.701051 4810 scope.go:117] "RemoveContainer" containerID="42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.717836 4810 scope.go:117] "RemoveContainer" containerID="60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.732542 4810 scope.go:117] "RemoveContainer" containerID="16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.749194 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.749273 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). 
InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.749351 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.749384 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.749476 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.749603 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.749772 4810 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.749795 4810 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.749808 4810 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.755002 4810 scope.go:117] "RemoveContainer" containerID="875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.778456 4810 scope.go:117] "RemoveContainer" containerID="ef740fa90b279363e303fb824188d0a26097b06aa8eaa488c0c800a8caa8f10e" Feb 19 15:13:29 crc kubenswrapper[4810]: E0219 15:13:29.779107 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef740fa90b279363e303fb824188d0a26097b06aa8eaa488c0c800a8caa8f10e\": container with ID starting with ef740fa90b279363e303fb824188d0a26097b06aa8eaa488c0c800a8caa8f10e not found: ID does not exist" containerID="ef740fa90b279363e303fb824188d0a26097b06aa8eaa488c0c800a8caa8f10e" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.779194 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef740fa90b279363e303fb824188d0a26097b06aa8eaa488c0c800a8caa8f10e"} err="failed to get container status \"ef740fa90b279363e303fb824188d0a26097b06aa8eaa488c0c800a8caa8f10e\": rpc error: code = NotFound desc = could not 
find container \"ef740fa90b279363e303fb824188d0a26097b06aa8eaa488c0c800a8caa8f10e\": container with ID starting with ef740fa90b279363e303fb824188d0a26097b06aa8eaa488c0c800a8caa8f10e not found: ID does not exist" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.779243 4810 scope.go:117] "RemoveContainer" containerID="77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b" Feb 19 15:13:29 crc kubenswrapper[4810]: E0219 15:13:29.779742 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\": container with ID starting with 77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b not found: ID does not exist" containerID="77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.779780 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b"} err="failed to get container status \"77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\": rpc error: code = NotFound desc = could not find container \"77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b\": container with ID starting with 77137e37c7a8cdddbba21456e942c760fb2cc6c31533e3bc6c4561929b41393b not found: ID does not exist" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.779809 4810 scope.go:117] "RemoveContainer" containerID="42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef" Feb 19 15:13:29 crc kubenswrapper[4810]: E0219 15:13:29.780547 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\": container with ID starting with 42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef not found: ID does not exist" containerID="42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.780611 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef"} err="failed to get container status \"42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\": rpc error: code = NotFound desc = could not find container \"42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef\": container with ID starting with 42939a1750dd650a11f4f83812694248cc8767b040f5cab76ee64d30ad6004ef not found: ID does not exist" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.780650 4810 scope.go:117] "RemoveContainer" containerID="60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e" Feb 19 15:13:29 crc kubenswrapper[4810]: E0219 15:13:29.781692 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\": container with ID starting with 60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e not found: ID does not exist" containerID="60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.781731 4810 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e"} err="failed to get container status \"60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\": rpc error: code = NotFound desc = could not find container \"60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e\": container with ID starting with 60a3707767fcc9c9ee6a808f9c0598457f1710b6394ecf9910f7a1a0ddd8ec1e not found: ID does not exist" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.781753 4810 scope.go:117] "RemoveContainer" containerID="16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb" Feb 19 15:13:29 crc kubenswrapper[4810]: E0219 15:13:29.782200 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\": container with ID starting with 16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb not found: ID does not exist" containerID="16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.782246 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb"} err="failed to get container status \"16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\": rpc error: code = NotFound desc = could not find container \"16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb\": container with ID starting with 16713d77989f991d6b3136c41aac696aaa3233a0f3454bcd299cd968f46872eb not found: ID does not exist" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.782445 4810 scope.go:117] "RemoveContainer" containerID="875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f" Feb 19 15:13:29 crc kubenswrapper[4810]: E0219 15:13:29.782906 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\": container with ID starting with 875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f not found: ID does not exist" containerID="875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f" Feb 19 15:13:29 crc kubenswrapper[4810]: I0219 15:13:29.782936 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f"} err="failed to get container status \"875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\": rpc error: code = NotFound desc = could not find container \"875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f\": container with ID starting with 875ed8333c1f444f5418ec6900e7d62c687cf7441e1be23a7e8138c9c721fb7f not found: ID does not exist" Feb 19 15:13:30 crc kubenswrapper[4810]: I0219 15:13:30.014225 4810 status_manager.go:851] "Failed to get status for pod" podUID="4127fef2-ef2b-4cc4-967d-d52dac26f314" pod="openshift-marketplace/certified-operators-blpmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-blpmq\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:30 crc kubenswrapper[4810]: I0219 15:13:30.014854 4810 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:30 crc kubenswrapper[4810]: I0219 15:13:30.015455 4810 status_manager.go:851] "Failed to get status for pod" podUID="3788d870-2889-4190-9675-e4da44f69a71" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:31 crc kubenswrapper[4810]: E0219 15:13:31.357221 4810 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.162:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895ae9aa35d2a3a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 15:13:27.629146682 +0000 UTC m=+237.111176826,LastTimestamp:2026-02-19 15:13:27.629146682 +0000 UTC m=+237.111176826,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 19 15:13:31 crc kubenswrapper[4810]: I0219 15:13:31.444897 4810 status_manager.go:851] "Failed to get status for pod" podUID="4127fef2-ef2b-4cc4-967d-d52dac26f314" pod="openshift-marketplace/certified-operators-blpmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-blpmq\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:31 crc kubenswrapper[4810]: I0219 15:13:31.445495 4810 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:31 crc kubenswrapper[4810]: I0219 15:13:31.446233 4810 status_manager.go:851] "Failed to get status for pod" podUID="3788d870-2889-4190-9675-e4da44f69a71" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:31 crc kubenswrapper[4810]: I0219 15:13:31.447709 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 19 15:13:32 crc kubenswrapper[4810]: E0219 15:13:32.032936 4810 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:32 crc kubenswrapper[4810]: E0219 15:13:32.033625 4810 
controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:32 crc kubenswrapper[4810]: E0219 15:13:32.034476 4810 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:32 crc kubenswrapper[4810]: E0219 15:13:32.035079 4810 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:32 crc kubenswrapper[4810]: E0219 15:13:32.035529 4810 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:32 crc kubenswrapper[4810]: I0219 15:13:32.035582 4810 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 19 15:13:32 crc kubenswrapper[4810]: E0219 15:13:32.036109 4810 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="200ms" Feb 19 15:13:32 crc kubenswrapper[4810]: E0219 15:13:32.236893 4810 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="400ms" Feb 19 15:13:32 crc kubenswrapper[4810]: E0219 15:13:32.638529 4810 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="800ms" Feb 19 15:13:33 crc kubenswrapper[4810]: E0219 15:13:33.439675 4810 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="1.6s" Feb 19 15:13:35 crc kubenswrapper[4810]: E0219 15:13:35.041987 4810 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="3.2s" Feb 19 15:13:35 crc kubenswrapper[4810]: E0219 15:13:35.446171 4810 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.162:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" volumeName="registry-storage" Feb 19 15:13:37 crc kubenswrapper[4810]: I0219 
15:13:37.722380 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" podUID="c18fb461-ce5b-43ad-85ca-305c3f8a7d46" containerName="oauth-openshift" containerID="cri-o://1540e32985c892982c268c96e96e5e47cb3be178ce36f945dc12ccf2635f82d2" gracePeriod=15 Feb 19 15:13:38 crc kubenswrapper[4810]: E0219 15:13:38.243445 4810 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="6.4s" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.365056 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.365704 4810 status_manager.go:851] "Failed to get status for pod" podUID="4127fef2-ef2b-4cc4-967d-d52dac26f314" pod="openshift-marketplace/certified-operators-blpmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-blpmq\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.366093 4810 status_manager.go:851] "Failed to get status for pod" podUID="c18fb461-ce5b-43ad-85ca-305c3f8a7d46" pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-r74mv\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.366490 4810 status_manager.go:851] "Failed to get status for pod" podUID="3788d870-2889-4190-9675-e4da44f69a71" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.501575 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-user-template-error\") pod \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.501621 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-audit-policies\") pod \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.501646 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-service-ca\") pod \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.501667 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-user-template-login\") pod \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\" (UID: 
\"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.501693 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-user-idp-0-file-data\") pod \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.501712 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-audit-dir\") pod \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.501729 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-router-certs\") pod \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.501760 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-ocp-branding-template\") pod \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.501786 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-serving-cert\") pod \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.501806 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-cliconfig\") pod \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.501836 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-session\") pod \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.501859 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxw8d\" (UniqueName: \"kubernetes.io/projected/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-kube-api-access-zxw8d\") pod \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.501887 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-trusted-ca-bundle\") pod \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.501908 4810 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-user-template-provider-selection\") pod \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\" (UID: \"c18fb461-ce5b-43ad-85ca-305c3f8a7d46\") " Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.502809 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "c18fb461-ce5b-43ad-85ca-305c3f8a7d46" (UID: "c18fb461-ce5b-43ad-85ca-305c3f8a7d46"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.503124 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "c18fb461-ce5b-43ad-85ca-305c3f8a7d46" (UID: "c18fb461-ce5b-43ad-85ca-305c3f8a7d46"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.503414 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "c18fb461-ce5b-43ad-85ca-305c3f8a7d46" (UID: "c18fb461-ce5b-43ad-85ca-305c3f8a7d46"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.503959 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "c18fb461-ce5b-43ad-85ca-305c3f8a7d46" (UID: "c18fb461-ce5b-43ad-85ca-305c3f8a7d46"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.504135 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "c18fb461-ce5b-43ad-85ca-305c3f8a7d46" (UID: "c18fb461-ce5b-43ad-85ca-305c3f8a7d46"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.509294 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "c18fb461-ce5b-43ad-85ca-305c3f8a7d46" (UID: "c18fb461-ce5b-43ad-85ca-305c3f8a7d46"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.510266 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "c18fb461-ce5b-43ad-85ca-305c3f8a7d46" (UID: "c18fb461-ce5b-43ad-85ca-305c3f8a7d46"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.511841 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "c18fb461-ce5b-43ad-85ca-305c3f8a7d46" (UID: "c18fb461-ce5b-43ad-85ca-305c3f8a7d46"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.512033 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "c18fb461-ce5b-43ad-85ca-305c3f8a7d46" (UID: "c18fb461-ce5b-43ad-85ca-305c3f8a7d46"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.512154 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "c18fb461-ce5b-43ad-85ca-305c3f8a7d46" (UID: "c18fb461-ce5b-43ad-85ca-305c3f8a7d46"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.513127 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-kube-api-access-zxw8d" (OuterVolumeSpecName: "kube-api-access-zxw8d") pod "c18fb461-ce5b-43ad-85ca-305c3f8a7d46" (UID: "c18fb461-ce5b-43ad-85ca-305c3f8a7d46"). InnerVolumeSpecName "kube-api-access-zxw8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.517259 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "c18fb461-ce5b-43ad-85ca-305c3f8a7d46" (UID: "c18fb461-ce5b-43ad-85ca-305c3f8a7d46"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.518204 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "c18fb461-ce5b-43ad-85ca-305c3f8a7d46" (UID: "c18fb461-ce5b-43ad-85ca-305c3f8a7d46"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.518894 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "c18fb461-ce5b-43ad-85ca-305c3f8a7d46" (UID: "c18fb461-ce5b-43ad-85ca-305c3f8a7d46"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.603197 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.603288 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.603306 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.603345 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.603357 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxw8d\" (UniqueName: \"kubernetes.io/projected/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-kube-api-access-zxw8d\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.603366 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.603377 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.603389 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.603398 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.603444 4810 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:38 crc 
kubenswrapper[4810]: I0219 15:13:38.603454 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.603466 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.603474 4810 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.603483 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c18fb461-ce5b-43ad-85ca-305c3f8a7d46-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.729910 4810 generic.go:334] "Generic (PLEG): container finished" podID="c18fb461-ce5b-43ad-85ca-305c3f8a7d46" containerID="1540e32985c892982c268c96e96e5e47cb3be178ce36f945dc12ccf2635f82d2" exitCode=0 Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.729995 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" event={"ID":"c18fb461-ce5b-43ad-85ca-305c3f8a7d46","Type":"ContainerDied","Data":"1540e32985c892982c268c96e96e5e47cb3be178ce36f945dc12ccf2635f82d2"} Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.730049 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" event={"ID":"c18fb461-ce5b-43ad-85ca-305c3f8a7d46","Type":"ContainerDied","Data":"a5b8d6a2012cb01f6278524103e645fa596a71b16bda554c88859e183269d288"} Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.730077 4810 scope.go:117] "RemoveContainer" containerID="1540e32985c892982c268c96e96e5e47cb3be178ce36f945dc12ccf2635f82d2" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.730085 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.730702 4810 status_manager.go:851] "Failed to get status for pod" podUID="4127fef2-ef2b-4cc4-967d-d52dac26f314" pod="openshift-marketplace/certified-operators-blpmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-blpmq\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.731115 4810 status_manager.go:851] "Failed to get status for pod" podUID="c18fb461-ce5b-43ad-85ca-305c3f8a7d46" pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-r74mv\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.731841 4810 status_manager.go:851] "Failed to get status for pod" podUID="3788d870-2889-4190-9675-e4da44f69a71" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.746553 4810 status_manager.go:851] "Failed to get status for pod" podUID="4127fef2-ef2b-4cc4-967d-d52dac26f314" pod="openshift-marketplace/certified-operators-blpmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-blpmq\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.747725 4810 status_manager.go:851] "Failed to get status for pod" podUID="c18fb461-ce5b-43ad-85ca-305c3f8a7d46" pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-r74mv\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.747949 4810 status_manager.go:851] "Failed to get status for pod" podUID="3788d870-2889-4190-9675-e4da44f69a71" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.756776 4810 scope.go:117] "RemoveContainer" containerID="1540e32985c892982c268c96e96e5e47cb3be178ce36f945dc12ccf2635f82d2" Feb 19 15:13:38 crc kubenswrapper[4810]: E0219 15:13:38.757385 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1540e32985c892982c268c96e96e5e47cb3be178ce36f945dc12ccf2635f82d2\": container with ID starting with 1540e32985c892982c268c96e96e5e47cb3be178ce36f945dc12ccf2635f82d2 not found: ID does not exist" containerID="1540e32985c892982c268c96e96e5e47cb3be178ce36f945dc12ccf2635f82d2" Feb 19 15:13:38 crc kubenswrapper[4810]: I0219 15:13:38.757437 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1540e32985c892982c268c96e96e5e47cb3be178ce36f945dc12ccf2635f82d2"} err="failed to get container status \"1540e32985c892982c268c96e96e5e47cb3be178ce36f945dc12ccf2635f82d2\": rpc error: code = NotFound desc = could not find container 
\"1540e32985c892982c268c96e96e5e47cb3be178ce36f945dc12ccf2635f82d2\": container with ID starting with 1540e32985c892982c268c96e96e5e47cb3be178ce36f945dc12ccf2635f82d2 not found: ID does not exist" Feb 19 15:13:40 crc kubenswrapper[4810]: I0219 15:13:40.438473 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:13:40 crc kubenswrapper[4810]: I0219 15:13:40.439443 4810 status_manager.go:851] "Failed to get status for pod" podUID="4127fef2-ef2b-4cc4-967d-d52dac26f314" pod="openshift-marketplace/certified-operators-blpmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-blpmq\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:40 crc kubenswrapper[4810]: I0219 15:13:40.440181 4810 status_manager.go:851] "Failed to get status for pod" podUID="c18fb461-ce5b-43ad-85ca-305c3f8a7d46" pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-r74mv\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:40 crc kubenswrapper[4810]: I0219 15:13:40.441057 4810 status_manager.go:851] "Failed to get status for pod" podUID="3788d870-2889-4190-9675-e4da44f69a71" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:40 crc kubenswrapper[4810]: I0219 15:13:40.461605 4810 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b6694b0d-3264-43eb-bd52-5088c7d2bf15" Feb 19 15:13:40 crc kubenswrapper[4810]: I0219 15:13:40.461647 4810 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b6694b0d-3264-43eb-bd52-5088c7d2bf15" Feb 19 15:13:40 crc kubenswrapper[4810]: E0219 15:13:40.462312 4810 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:13:40 crc kubenswrapper[4810]: I0219 15:13:40.463074 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:13:40 crc kubenswrapper[4810]: W0219 15:13:40.487845 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-b95b27a53792cf48c31d3de79d5da1f11dd0122feff8ae7f33a11b7e86a1c396 WatchSource:0}: Error finding container b95b27a53792cf48c31d3de79d5da1f11dd0122feff8ae7f33a11b7e86a1c396: Status 404 returned error can't find the container with id b95b27a53792cf48c31d3de79d5da1f11dd0122feff8ae7f33a11b7e86a1c396 Feb 19 15:13:40 crc kubenswrapper[4810]: I0219 15:13:40.751292 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b95b27a53792cf48c31d3de79d5da1f11dd0122feff8ae7f33a11b7e86a1c396"} Feb 19 15:13:41 crc kubenswrapper[4810]: E0219 15:13:41.359835 4810 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.162:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895ae9aa35d2a3a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 15:13:27.629146682 +0000 UTC m=+237.111176826,LastTimestamp:2026-02-19 15:13:27.629146682 +0000 UTC m=+237.111176826,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 19 15:13:41 crc kubenswrapper[4810]: I0219 15:13:41.449347 4810 status_manager.go:851] "Failed to get status for pod" podUID="c18fb461-ce5b-43ad-85ca-305c3f8a7d46" pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-r74mv\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:41 crc kubenswrapper[4810]: I0219 15:13:41.450266 4810 status_manager.go:851] "Failed to get status for pod" podUID="3788d870-2889-4190-9675-e4da44f69a71" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:41 crc kubenswrapper[4810]: I0219 15:13:41.450973 4810 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:41 crc kubenswrapper[4810]: I0219 15:13:41.451516 4810 status_manager.go:851] "Failed to get status for pod" podUID="4127fef2-ef2b-4cc4-967d-d52dac26f314" pod="openshift-marketplace/certified-operators-blpmq" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-blpmq\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:41 crc kubenswrapper[4810]: I0219 15:13:41.759004 4810 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="2edba90b5013aedd5164f7df8b4ee66d180c2d853b109ba9560f61512a8fbd83" exitCode=0 Feb 19 15:13:41 crc kubenswrapper[4810]: I0219 15:13:41.759454 4810 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b6694b0d-3264-43eb-bd52-5088c7d2bf15" Feb 19 15:13:41 crc kubenswrapper[4810]: I0219 15:13:41.759496 4810 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b6694b0d-3264-43eb-bd52-5088c7d2bf15" Feb 19 15:13:41 crc kubenswrapper[4810]: I0219 15:13:41.759128 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"2edba90b5013aedd5164f7df8b4ee66d180c2d853b109ba9560f61512a8fbd83"} Feb 19 15:13:41 crc kubenswrapper[4810]: E0219 15:13:41.760186 4810 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:13:41 crc kubenswrapper[4810]: I0219 15:13:41.760202 4810 status_manager.go:851] "Failed to get status for pod" podUID="4127fef2-ef2b-4cc4-967d-d52dac26f314" pod="openshift-marketplace/certified-operators-blpmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-blpmq\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:41 crc kubenswrapper[4810]: I0219 15:13:41.760680 4810 status_manager.go:851] "Failed to get status for pod" podUID="c18fb461-ce5b-43ad-85ca-305c3f8a7d46" pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-r74mv\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:41 crc kubenswrapper[4810]: I0219 15:13:41.761062 4810 status_manager.go:851] "Failed to get status for pod" podUID="3788d870-2889-4190-9675-e4da44f69a71" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:41 crc kubenswrapper[4810]: I0219 15:13:41.761670 4810 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:41 crc kubenswrapper[4810]: I0219 15:13:41.763312 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 19 15:13:41 crc kubenswrapper[4810]: I0219 15:13:41.763395 4810 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" 
containerID="89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c" exitCode=1 Feb 19 15:13:41 crc kubenswrapper[4810]: I0219 15:13:41.763436 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c"} Feb 19 15:13:41 crc kubenswrapper[4810]: I0219 15:13:41.764022 4810 scope.go:117] "RemoveContainer" containerID="89e969496cd5adab974a5d8b8c3051bcdfb63c582b8f0d6d9af233d25e2c1d3c" Feb 19 15:13:41 crc kubenswrapper[4810]: I0219 15:13:41.764202 4810 status_manager.go:851] "Failed to get status for pod" podUID="4127fef2-ef2b-4cc4-967d-d52dac26f314" pod="openshift-marketplace/certified-operators-blpmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-blpmq\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:41 crc kubenswrapper[4810]: I0219 15:13:41.764756 4810 status_manager.go:851] "Failed to get status for pod" podUID="c18fb461-ce5b-43ad-85ca-305c3f8a7d46" pod="openshift-authentication/oauth-openshift-558db77b4-r74mv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-r74mv\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:41 crc kubenswrapper[4810]: I0219 15:13:41.765049 4810 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:41 crc kubenswrapper[4810]: I0219 15:13:41.765430 4810 status_manager.go:851] "Failed to get status for pod" podUID="3788d870-2889-4190-9675-e4da44f69a71" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:41 crc kubenswrapper[4810]: I0219 15:13:41.765800 4810 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Feb 19 15:13:42 crc kubenswrapper[4810]: I0219 15:13:42.775972 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ecb4dc2953fc0dd2b18030ccd9aa2495be814d5a7c10f3db03d0199658da5674"} Feb 19 15:13:42 crc kubenswrapper[4810]: I0219 15:13:42.776403 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"932073081c0f47ba7edb3b4806f86d49553956625ae2cf6e09e6c9c7e82f1408"} Feb 19 15:13:42 crc kubenswrapper[4810]: I0219 15:13:42.776416 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"82641ab373c5a3f584b9718a4e1076acabf3c6107eb17711cf62bcec7a3379c6"} 
Feb 19 15:13:42 crc kubenswrapper[4810]: I0219 15:13:42.776426 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"951401d9d5a5d6a99f842ad8a4cd8ad2ff73fd645878c2ccfed4f28e20399ff9"} Feb 19 15:13:42 crc kubenswrapper[4810]: I0219 15:13:42.788975 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 19 15:13:42 crc kubenswrapper[4810]: I0219 15:13:42.789039 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"04741ed0fdbbb3980f28a8f3763ed31736522fe98f071b4a91c2caf991f00ce4"} Feb 19 15:13:43 crc kubenswrapper[4810]: I0219 15:13:43.538438 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 15:13:43 crc kubenswrapper[4810]: I0219 15:13:43.538797 4810 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 19 15:13:43 crc kubenswrapper[4810]: I0219 15:13:43.539120 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 19 15:13:43 crc kubenswrapper[4810]: I0219 15:13:43.798196 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a675a26c30628492c2daff65ad50468a5356c241a3c7ad66e33b38ee05c2e5ca"} Feb 19 15:13:43 crc kubenswrapper[4810]: I0219 15:13:43.798368 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:13:43 crc kubenswrapper[4810]: I0219 15:13:43.798493 4810 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b6694b0d-3264-43eb-bd52-5088c7d2bf15" Feb 19 15:13:43 crc kubenswrapper[4810]: I0219 15:13:43.798520 4810 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b6694b0d-3264-43eb-bd52-5088c7d2bf15" Feb 19 15:13:45 crc kubenswrapper[4810]: I0219 15:13:45.463499 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:13:45 crc kubenswrapper[4810]: I0219 15:13:45.463814 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:13:45 crc kubenswrapper[4810]: I0219 15:13:45.479069 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:13:46 crc kubenswrapper[4810]: I0219 15:13:46.222581 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
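The probe lines above are the kubelet's HTTP checks against the controller manager's healthz endpoint: while the process is still starting, GET https://192.168.126.11:10257/healthz fails at the dial ("connection refused") and the startup probe reports failure; once the port is bound, the same GET succeeds and the probe flips to "started" and later "ready". A minimal sketch of such a check follows, assuming an HTTPS endpoint probed without certificate verification; probeHealthz is a hypothetical helper, not kubelet's prober.

package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

// probeHealthz returns nil when the endpoint answers 2xx and an error
// otherwise, mirroring the success/failure split in the probe output.
func probeHealthz(url string, timeout time.Duration) error {
	client := &http.Client{
		Timeout: timeout,
		// Assumption: skip certificate verification, since the healthz
		// endpoint in the log serves a self-managed certificate.
		Transport: &http.Transport{
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
		},
	}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. dial tcp 192.168.126.11:10257: connect: connection refused
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 300 {
		return fmt.Errorf("unhealthy: HTTP %d", resp.StatusCode)
	}
	return nil
}

func main() {
	if err := probeHealthz("https://192.168.126.11:10257/healthz", 2*time.Second); err != nil {
		fmt.Println("probe failed:", err) // matches the failure output in the log
	} else {
		fmt.Println("probe succeeded")
	}
}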
Feb 19 15:13:48 crc kubenswrapper[4810]: I0219 15:13:48.814846 4810 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:13:49 crc kubenswrapper[4810]: I0219 15:13:49.837576 4810 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b6694b0d-3264-43eb-bd52-5088c7d2bf15" Feb 19 15:13:49 crc kubenswrapper[4810]: I0219 15:13:49.837986 4810 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b6694b0d-3264-43eb-bd52-5088c7d2bf15" Feb 19 15:13:49 crc kubenswrapper[4810]: I0219 15:13:49.843484 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:13:49 crc kubenswrapper[4810]: I0219 15:13:49.847828 4810 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="10ac7c28-166d-4a4e-aaf9-c7adac6a9cc5" Feb 19 15:13:50 crc kubenswrapper[4810]: I0219 15:13:50.844062 4810 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b6694b0d-3264-43eb-bd52-5088c7d2bf15" Feb 19 15:13:50 crc kubenswrapper[4810]: I0219 15:13:50.844109 4810 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b6694b0d-3264-43eb-bd52-5088c7d2bf15" Feb 19 15:13:51 crc kubenswrapper[4810]: I0219 15:13:51.449641 4810 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="10ac7c28-166d-4a4e-aaf9-c7adac6a9cc5" Feb 19 15:13:53 crc kubenswrapper[4810]: I0219 15:13:53.539025 4810 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 19 15:13:53 crc kubenswrapper[4810]: I0219 15:13:53.539576 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 19 15:13:58 crc kubenswrapper[4810]: I0219 15:13:58.444015 4810 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 19 15:13:58 crc kubenswrapper[4810]: I0219 15:13:58.553271 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 19 15:13:58 crc kubenswrapper[4810]: I0219 15:13:58.692915 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 19 15:13:58 crc kubenswrapper[4810]: I0219 15:13:58.756499 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 19 15:13:59 crc kubenswrapper[4810]: I0219 15:13:59.355120 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 19 15:13:59 crc kubenswrapper[4810]: I0219 15:13:59.442096 4810 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 19 15:13:59 crc kubenswrapper[4810]: I0219 15:13:59.693567 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 19 15:13:59 crc kubenswrapper[4810]: I0219 15:13:59.820802 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 19 15:13:59 crc kubenswrapper[4810]: I0219 15:13:59.898358 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 19 15:14:00 crc kubenswrapper[4810]: I0219 15:14:00.136726 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 19 15:14:00 crc kubenswrapper[4810]: I0219 15:14:00.376365 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 19 15:14:00 crc kubenswrapper[4810]: I0219 15:14:00.402970 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 19 15:14:00 crc kubenswrapper[4810]: I0219 15:14:00.586596 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 19 15:14:00 crc kubenswrapper[4810]: I0219 15:14:00.655653 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 19 15:14:00 crc kubenswrapper[4810]: I0219 15:14:00.877224 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 19 15:14:00 crc kubenswrapper[4810]: I0219 15:14:00.917575 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 19 15:14:01 crc kubenswrapper[4810]: I0219 15:14:01.042026 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 19 15:14:01 crc kubenswrapper[4810]: I0219 15:14:01.190123 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 19 15:14:01 crc kubenswrapper[4810]: I0219 15:14:01.682186 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 19 15:14:01 crc kubenswrapper[4810]: I0219 15:14:01.805997 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 19 15:14:01 crc kubenswrapper[4810]: I0219 15:14:01.828125 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 15:14:02 crc kubenswrapper[4810]: I0219 15:14:02.050046 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 19 15:14:02 crc kubenswrapper[4810]: I0219 15:14:02.134495 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 19 15:14:02 crc kubenswrapper[4810]: I0219 15:14:02.168810 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 19 15:14:02 crc kubenswrapper[4810]: I0219 15:14:02.291735 4810 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 19 15:14:02 crc kubenswrapper[4810]: I0219 15:14:02.357350 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 19 15:14:02 crc kubenswrapper[4810]: I0219 15:14:02.366036 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 19 15:14:02 crc kubenswrapper[4810]: I0219 15:14:02.390538 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 19 15:14:02 crc kubenswrapper[4810]: I0219 15:14:02.409047 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 19 15:14:02 crc kubenswrapper[4810]: I0219 15:14:02.489261 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 19 15:14:02 crc kubenswrapper[4810]: I0219 15:14:02.568276 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 19 15:14:02 crc kubenswrapper[4810]: I0219 15:14:02.577648 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 19 15:14:02 crc kubenswrapper[4810]: I0219 15:14:02.594973 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 19 15:14:02 crc kubenswrapper[4810]: I0219 15:14:02.716170 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 19 15:14:02 crc kubenswrapper[4810]: I0219 15:14:02.831419 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 19 15:14:02 crc kubenswrapper[4810]: I0219 15:14:02.846047 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 19 15:14:02 crc kubenswrapper[4810]: I0219 15:14:02.884028 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 19 15:14:02 crc kubenswrapper[4810]: I0219 15:14:02.902747 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 19 15:14:02 crc kubenswrapper[4810]: I0219 15:14:02.971194 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 19 15:14:03 crc kubenswrapper[4810]: I0219 15:14:03.071741 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 19 15:14:03 crc kubenswrapper[4810]: I0219 15:14:03.107915 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 19 15:14:03 crc kubenswrapper[4810]: I0219 15:14:03.156476 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 19 15:14:03 crc kubenswrapper[4810]: I0219 15:14:03.225455 4810 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 19 15:14:03 crc kubenswrapper[4810]: I0219 15:14:03.260995 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 19 15:14:03 crc kubenswrapper[4810]: I0219 15:14:03.302449 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 19 15:14:03 crc kubenswrapper[4810]: I0219 15:14:03.317350 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 19 15:14:03 crc kubenswrapper[4810]: I0219 15:14:03.334137 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 19 15:14:03 crc kubenswrapper[4810]: I0219 15:14:03.337072 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 19 15:14:03 crc kubenswrapper[4810]: I0219 15:14:03.357563 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 19 15:14:03 crc kubenswrapper[4810]: I0219 15:14:03.362653 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 19 15:14:03 crc kubenswrapper[4810]: I0219 15:14:03.386888 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 19 15:14:03 crc kubenswrapper[4810]: I0219 15:14:03.453148 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 19 15:14:03 crc kubenswrapper[4810]: I0219 15:14:03.463187 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 19 15:14:03 crc kubenswrapper[4810]: I0219 15:14:03.545095 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 15:14:03 crc kubenswrapper[4810]: I0219 15:14:03.552418 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 15:14:03 crc kubenswrapper[4810]: I0219 15:14:03.630368 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 19 15:14:03 crc kubenswrapper[4810]: I0219 15:14:03.637082 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 19 15:14:03 crc kubenswrapper[4810]: I0219 15:14:03.730671 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 19 15:14:03 crc kubenswrapper[4810]: I0219 15:14:03.791053 4810 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 19 15:14:03 crc kubenswrapper[4810]: I0219 15:14:03.894190 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 19 15:14:03 crc kubenswrapper[4810]: I0219 15:14:03.931622 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 19 15:14:04 crc kubenswrapper[4810]: I0219 15:14:04.021089 4810 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 19 15:14:04 crc kubenswrapper[4810]: I0219 15:14:04.098408 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 19 15:14:04 crc kubenswrapper[4810]: I0219 15:14:04.112693 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 19 15:14:04 crc kubenswrapper[4810]: I0219 15:14:04.159355 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 19 15:14:04 crc kubenswrapper[4810]: I0219 15:14:04.245902 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 19 15:14:04 crc kubenswrapper[4810]: I0219 15:14:04.258013 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 19 15:14:04 crc kubenswrapper[4810]: I0219 15:14:04.426191 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 19 15:14:04 crc kubenswrapper[4810]: I0219 15:14:04.462022 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 19 15:14:04 crc kubenswrapper[4810]: I0219 15:14:04.472957 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 19 15:14:04 crc kubenswrapper[4810]: I0219 15:14:04.490729 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 19 15:14:04 crc kubenswrapper[4810]: I0219 15:14:04.543124 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 19 15:14:04 crc kubenswrapper[4810]: I0219 15:14:04.546193 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 19 15:14:04 crc kubenswrapper[4810]: I0219 15:14:04.575543 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 19 15:14:04 crc kubenswrapper[4810]: I0219 15:14:04.680828 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 19 15:14:04 crc kubenswrapper[4810]: I0219 15:14:04.692392 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 19 15:14:04 crc kubenswrapper[4810]: I0219 15:14:04.749366 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 19 15:14:04 crc kubenswrapper[4810]: I0219 15:14:04.749456 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 19 15:14:04 crc kubenswrapper[4810]: I0219 15:14:04.801277 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 19 15:14:04 crc kubenswrapper[4810]: I0219 15:14:04.856306 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 15:14:04 crc kubenswrapper[4810]: I0219 15:14:04.947978 4810 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 19 15:14:05 crc kubenswrapper[4810]: I0219 15:14:05.031809 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 19 15:14:05 crc kubenswrapper[4810]: I0219 15:14:05.103101 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 19 15:14:05 crc kubenswrapper[4810]: I0219 15:14:05.114080 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 19 15:14:05 crc kubenswrapper[4810]: I0219 15:14:05.177119 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 19 15:14:05 crc kubenswrapper[4810]: I0219 15:14:05.395723 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 19 15:14:05 crc kubenswrapper[4810]: I0219 15:14:05.399201 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 19 15:14:05 crc kubenswrapper[4810]: I0219 15:14:05.414473 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 19 15:14:05 crc kubenswrapper[4810]: I0219 15:14:05.460381 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 19 15:14:05 crc kubenswrapper[4810]: I0219 15:14:05.528745 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 19 15:14:05 crc kubenswrapper[4810]: I0219 15:14:05.624612 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 19 15:14:05 crc kubenswrapper[4810]: I0219 15:14:05.749738 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 19 15:14:05 crc kubenswrapper[4810]: I0219 15:14:05.881699 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 19 15:14:05 crc kubenswrapper[4810]: I0219 15:14:05.892644 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 19 15:14:05 crc kubenswrapper[4810]: I0219 15:14:05.983548 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 19 15:14:05 crc kubenswrapper[4810]: I0219 15:14:05.994603 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 19 15:14:06 crc kubenswrapper[4810]: I0219 15:14:06.006435 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 19 15:14:06 crc kubenswrapper[4810]: I0219 15:14:06.072808 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 19 15:14:06 crc kubenswrapper[4810]: I0219 15:14:06.120220 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 19 15:14:06 crc kubenswrapper[4810]: I0219 15:14:06.134931 4810 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"oauth-serving-cert" Feb 19 15:14:06 crc kubenswrapper[4810]: I0219 15:14:06.171029 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 19 15:14:06 crc kubenswrapper[4810]: I0219 15:14:06.240779 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 19 15:14:06 crc kubenswrapper[4810]: I0219 15:14:06.468220 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 19 15:14:06 crc kubenswrapper[4810]: I0219 15:14:06.521849 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 19 15:14:06 crc kubenswrapper[4810]: I0219 15:14:06.526440 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 19 15:14:06 crc kubenswrapper[4810]: I0219 15:14:06.527864 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 19 15:14:06 crc kubenswrapper[4810]: I0219 15:14:06.577488 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 19 15:14:06 crc kubenswrapper[4810]: I0219 15:14:06.598597 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 19 15:14:06 crc kubenswrapper[4810]: I0219 15:14:06.602903 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 19 15:14:06 crc kubenswrapper[4810]: I0219 15:14:06.633663 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 19 15:14:06 crc kubenswrapper[4810]: I0219 15:14:06.657854 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 19 15:14:06 crc kubenswrapper[4810]: I0219 15:14:06.691092 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 19 15:14:06 crc kubenswrapper[4810]: I0219 15:14:06.720577 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 19 15:14:06 crc kubenswrapper[4810]: I0219 15:14:06.794981 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 19 15:14:07 crc kubenswrapper[4810]: I0219 15:14:07.144962 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 19 15:14:07 crc kubenswrapper[4810]: I0219 15:14:07.228847 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 19 15:14:07 crc kubenswrapper[4810]: I0219 15:14:07.240004 4810 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 19 15:14:07 crc kubenswrapper[4810]: I0219 15:14:07.247731 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r74mv","openshift-marketplace/certified-operators-blpmq","openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 15:14:07 crc 
kubenswrapper[4810]: I0219 15:14:07.247839 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 15:14:07 crc kubenswrapper[4810]: I0219 15:14:07.256672 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 15:14:07 crc kubenswrapper[4810]: I0219 15:14:07.258715 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 19 15:14:07 crc kubenswrapper[4810]: I0219 15:14:07.280119 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=19.280083050000002 podStartE2EDuration="19.28008305s" podCreationTimestamp="2026-02-19 15:13:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:14:07.272947747 +0000 UTC m=+276.754977871" watchObservedRunningTime="2026-02-19 15:14:07.28008305 +0000 UTC m=+276.762113224" Feb 19 15:14:07 crc kubenswrapper[4810]: I0219 15:14:07.371854 4810 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 19 15:14:07 crc kubenswrapper[4810]: I0219 15:14:07.446945 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4127fef2-ef2b-4cc4-967d-d52dac26f314" path="/var/lib/kubelet/pods/4127fef2-ef2b-4cc4-967d-d52dac26f314/volumes" Feb 19 15:14:07 crc kubenswrapper[4810]: I0219 15:14:07.447639 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c18fb461-ce5b-43ad-85ca-305c3f8a7d46" path="/var/lib/kubelet/pods/c18fb461-ce5b-43ad-85ca-305c3f8a7d46/volumes" Feb 19 15:14:07 crc kubenswrapper[4810]: I0219 15:14:07.489557 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 19 15:14:07 crc kubenswrapper[4810]: I0219 15:14:07.509730 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 19 15:14:07 crc kubenswrapper[4810]: I0219 15:14:07.830527 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 19 15:14:07 crc kubenswrapper[4810]: I0219 15:14:07.838211 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 19 15:14:07 crc kubenswrapper[4810]: I0219 15:14:07.846808 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 19 15:14:07 crc kubenswrapper[4810]: I0219 15:14:07.855801 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 19 15:14:07 crc kubenswrapper[4810]: I0219 15:14:07.871230 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 19 15:14:07 crc kubenswrapper[4810]: I0219 15:14:07.921753 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 19 15:14:07 crc kubenswrapper[4810]: I0219 15:14:07.946841 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 19 15:14:07 crc kubenswrapper[4810]: I0219 15:14:07.947573 4810 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"openshift-service-ca.crt" Feb 19 15:14:08 crc kubenswrapper[4810]: I0219 15:14:08.104770 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 19 15:14:08 crc kubenswrapper[4810]: I0219 15:14:08.144406 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 19 15:14:08 crc kubenswrapper[4810]: I0219 15:14:08.167424 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 19 15:14:08 crc kubenswrapper[4810]: I0219 15:14:08.223795 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 19 15:14:08 crc kubenswrapper[4810]: I0219 15:14:08.344730 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 19 15:14:08 crc kubenswrapper[4810]: I0219 15:14:08.366425 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 19 15:14:08 crc kubenswrapper[4810]: I0219 15:14:08.469878 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 19 15:14:08 crc kubenswrapper[4810]: I0219 15:14:08.722673 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 19 15:14:08 crc kubenswrapper[4810]: I0219 15:14:08.732647 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 19 15:14:08 crc kubenswrapper[4810]: I0219 15:14:08.813858 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 19 15:14:08 crc kubenswrapper[4810]: I0219 15:14:08.842663 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 19 15:14:08 crc kubenswrapper[4810]: I0219 15:14:08.904722 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 19 15:14:08 crc kubenswrapper[4810]: I0219 15:14:08.930900 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 19 15:14:09 crc kubenswrapper[4810]: I0219 15:14:09.014174 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 19 15:14:09 crc kubenswrapper[4810]: I0219 15:14:09.019142 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 19 15:14:09 crc kubenswrapper[4810]: I0219 15:14:09.041314 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 19 15:14:09 crc kubenswrapper[4810]: I0219 15:14:09.151482 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 19 15:14:09 crc kubenswrapper[4810]: I0219 15:14:09.155166 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 19 15:14:09 crc kubenswrapper[4810]: I0219 15:14:09.205491 4810 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 19 15:14:09 crc kubenswrapper[4810]: I0219 15:14:09.276409 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 19 15:14:09 crc kubenswrapper[4810]: I0219 15:14:09.308561 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 19 15:14:09 crc kubenswrapper[4810]: I0219 15:14:09.350371 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 19 15:14:09 crc kubenswrapper[4810]: I0219 15:14:09.377553 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 19 15:14:09 crc kubenswrapper[4810]: I0219 15:14:09.399255 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 19 15:14:09 crc kubenswrapper[4810]: I0219 15:14:09.430408 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 19 15:14:09 crc kubenswrapper[4810]: I0219 15:14:09.466059 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 19 15:14:09 crc kubenswrapper[4810]: I0219 15:14:09.519623 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 19 15:14:09 crc kubenswrapper[4810]: I0219 15:14:09.526745 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 19 15:14:09 crc kubenswrapper[4810]: I0219 15:14:09.568287 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 19 15:14:09 crc kubenswrapper[4810]: I0219 15:14:09.579919 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 19 15:14:09 crc kubenswrapper[4810]: I0219 15:14:09.659183 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 19 15:14:09 crc kubenswrapper[4810]: I0219 15:14:09.803471 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 19 15:14:09 crc kubenswrapper[4810]: I0219 15:14:09.835711 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 19 15:14:09 crc kubenswrapper[4810]: I0219 15:14:09.840225 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 19 15:14:09 crc kubenswrapper[4810]: I0219 15:14:09.851375 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 19 15:14:09 crc kubenswrapper[4810]: I0219 15:14:09.852971 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 19 15:14:09 crc kubenswrapper[4810]: I0219 15:14:09.888896 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.036120 4810 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.143414 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.184431 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.185663 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.186447 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.244966 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.329887 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.389767 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.442647 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.579764 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.621770 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.642921 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.659362 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.682866 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.714372 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.774969 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.801742 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.869207 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.875106 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-85766c7959-6lnjr"] Feb 19 15:14:10 crc kubenswrapper[4810]: E0219 15:14:10.875420 4810 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="c18fb461-ce5b-43ad-85ca-305c3f8a7d46" containerName="oauth-openshift" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.875473 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c18fb461-ce5b-43ad-85ca-305c3f8a7d46" containerName="oauth-openshift" Feb 19 15:14:10 crc kubenswrapper[4810]: E0219 15:14:10.875494 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3788d870-2889-4190-9675-e4da44f69a71" containerName="installer" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.875505 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3788d870-2889-4190-9675-e4da44f69a71" containerName="installer" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.875669 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3788d870-2889-4190-9675-e4da44f69a71" containerName="installer" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.875714 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c18fb461-ce5b-43ad-85ca-305c3f8a7d46" containerName="oauth-openshift" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.876299 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.884882 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.885182 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.885204 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.885904 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.886234 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.886491 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.886692 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.887125 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.887835 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.888007 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.888159 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.891812 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 19 
15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.899716 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.906177 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-85766c7959-6lnjr"] Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.908065 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.916691 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.925978 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.932393 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.959847 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.960092 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-audit-policies\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.960189 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-audit-dir\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.960288 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.960430 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-user-template-login\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.960570 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-system-service-ca\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.960697 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-system-session\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.960826 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.960941 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q259\" (UniqueName: \"kubernetes.io/projected/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-kube-api-access-2q259\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.961043 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-system-serving-cert\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.961162 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-system-router-certs\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.961274 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-user-template-error\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.961431 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " 
pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:10 crc kubenswrapper[4810]: I0219 15:14:10.961565 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-system-cliconfig\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.026555 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.062823 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.063062 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-audit-policies\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.063173 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-audit-dir\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.063254 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-audit-dir\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.063389 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.063500 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-user-template-login\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.063605 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-system-service-ca\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.063751 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-system-session\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.063866 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.063975 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q259\" (UniqueName: \"kubernetes.io/projected/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-kube-api-access-2q259\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.064485 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-system-serving-cert\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.064593 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-system-service-ca\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.064603 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-system-router-certs\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.064680 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-user-template-error\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.063897 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-audit-policies\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.064724 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.064756 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-system-cliconfig\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.065469 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-system-cliconfig\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.067017 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.068853 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-system-serving-cert\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.068962 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.070147 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-system-router-certs\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.070275 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-user-template-error\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.070868 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-system-session\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.071461 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-user-template-login\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.073400 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.086106 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.089150 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q259\" (UniqueName: \"kubernetes.io/projected/ba91d0d8-a2ad-47b1-b161-56ed570a2e42-kube-api-access-2q259\") pod \"oauth-openshift-85766c7959-6lnjr\" (UID: \"ba91d0d8-a2ad-47b1-b161-56ed570a2e42\") " pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.095915 4810 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.096116 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://a2157c6845332b692ab9ff67fd6e2eb4fc93b35babe6d2954b26b7839c430dbe" gracePeriod=5 Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.117815 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.127490 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.185771 4810 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.209157 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.209263 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.320797 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.349852 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.362543 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.422386 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.453436 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.531300 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.544204 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.582964 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.602142 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-85766c7959-6lnjr"] Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.685608 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.740588 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.817011 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.837949 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.870514 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.876172 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.881036 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 19 15:14:11 crc 
kubenswrapper[4810]: I0219 15:14:11.968888 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.994605 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" event={"ID":"ba91d0d8-a2ad-47b1-b161-56ed570a2e42","Type":"ContainerStarted","Data":"f3b919ca6117bba187eaeeb48421608b19352c475498cf813f24b9f89e02a35d"} Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.994731 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" event={"ID":"ba91d0d8-a2ad-47b1-b161-56ed570a2e42","Type":"ContainerStarted","Data":"6bf5c788ebd52253ff52bd60a644388aa228fe2450514654184215f0a929b3df"} Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.995062 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.996827 4810 patch_prober.go:28] interesting pod/oauth-openshift-85766c7959-6lnjr container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.65:6443/healthz\": dial tcp 10.217.0.65:6443: connect: connection refused" start-of-body= Feb 19 15:14:11 crc kubenswrapper[4810]: I0219 15:14:11.996892 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" podUID="ba91d0d8-a2ad-47b1-b161-56ed570a2e42" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.65:6443/healthz\": dial tcp 10.217.0.65:6443: connect: connection refused" Feb 19 15:14:12 crc kubenswrapper[4810]: I0219 15:14:12.022412 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" podStartSLOduration=60.022391179 podStartE2EDuration="1m0.022391179s" podCreationTimestamp="2026-02-19 15:13:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:14:12.019233158 +0000 UTC m=+281.501263292" watchObservedRunningTime="2026-02-19 15:14:12.022391179 +0000 UTC m=+281.504421323" Feb 19 15:14:12 crc kubenswrapper[4810]: I0219 15:14:12.035354 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 19 15:14:12 crc kubenswrapper[4810]: I0219 15:14:12.120875 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 19 15:14:12 crc kubenswrapper[4810]: I0219 15:14:12.176725 4810 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 19 15:14:12 crc kubenswrapper[4810]: I0219 15:14:12.257500 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 19 15:14:12 crc kubenswrapper[4810]: I0219 15:14:12.532739 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 19 15:14:12 crc kubenswrapper[4810]: I0219 15:14:12.559862 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 19 15:14:12 crc kubenswrapper[4810]: I0219 15:14:12.683451 4810 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 19 15:14:12 crc kubenswrapper[4810]: I0219 15:14:12.769318 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 19 15:14:12 crc kubenswrapper[4810]: I0219 15:14:12.891688 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 19 15:14:12 crc kubenswrapper[4810]: I0219 15:14:12.924665 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 19 15:14:12 crc kubenswrapper[4810]: I0219 15:14:12.945856 4810 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 19 15:14:13 crc kubenswrapper[4810]: I0219 15:14:13.000794 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 19 15:14:13 crc kubenswrapper[4810]: I0219 15:14:13.001128 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 19 15:14:13 crc kubenswrapper[4810]: I0219 15:14:13.004427 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-85766c7959-6lnjr_ba91d0d8-a2ad-47b1-b161-56ed570a2e42/oauth-openshift/0.log" Feb 19 15:14:13 crc kubenswrapper[4810]: I0219 15:14:13.004488 4810 generic.go:334] "Generic (PLEG): container finished" podID="ba91d0d8-a2ad-47b1-b161-56ed570a2e42" containerID="f3b919ca6117bba187eaeeb48421608b19352c475498cf813f24b9f89e02a35d" exitCode=255 Feb 19 15:14:13 crc kubenswrapper[4810]: I0219 15:14:13.004533 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" event={"ID":"ba91d0d8-a2ad-47b1-b161-56ed570a2e42","Type":"ContainerDied","Data":"f3b919ca6117bba187eaeeb48421608b19352c475498cf813f24b9f89e02a35d"} Feb 19 15:14:13 crc kubenswrapper[4810]: I0219 15:14:13.005700 4810 scope.go:117] "RemoveContainer" containerID="f3b919ca6117bba187eaeeb48421608b19352c475498cf813f24b9f89e02a35d" Feb 19 15:14:13 crc kubenswrapper[4810]: I0219 15:14:13.075253 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 19 15:14:13 crc kubenswrapper[4810]: I0219 15:14:13.190583 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 19 15:14:13 crc kubenswrapper[4810]: I0219 15:14:13.206500 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 19 15:14:13 crc kubenswrapper[4810]: I0219 15:14:13.222558 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 19 15:14:13 crc kubenswrapper[4810]: I0219 15:14:13.254969 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 19 15:14:13 crc kubenswrapper[4810]: I0219 15:14:13.282691 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 19 15:14:13 crc kubenswrapper[4810]: I0219 15:14:13.638275 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 19 15:14:13 crc 
kubenswrapper[4810]: I0219 15:14:13.715081 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 19 15:14:13 crc kubenswrapper[4810]: I0219 15:14:13.964697 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 19 15:14:14 crc kubenswrapper[4810]: I0219 15:14:14.019979 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-85766c7959-6lnjr_ba91d0d8-a2ad-47b1-b161-56ed570a2e42/oauth-openshift/0.log" Feb 19 15:14:14 crc kubenswrapper[4810]: I0219 15:14:14.020038 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" event={"ID":"ba91d0d8-a2ad-47b1-b161-56ed570a2e42","Type":"ContainerStarted","Data":"b3371b30221678523b2083e3d3c9fc5858d7f2f0256abf4e521e9f1dc8d9aad2"} Feb 19 15:14:14 crc kubenswrapper[4810]: I0219 15:14:14.020423 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:14 crc kubenswrapper[4810]: I0219 15:14:14.023463 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 19 15:14:14 crc kubenswrapper[4810]: I0219 15:14:14.025626 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-85766c7959-6lnjr" Feb 19 15:14:14 crc kubenswrapper[4810]: I0219 15:14:14.182200 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 19 15:14:14 crc kubenswrapper[4810]: I0219 15:14:14.602475 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 19 15:14:14 crc kubenswrapper[4810]: I0219 15:14:14.617002 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 19 15:14:14 crc kubenswrapper[4810]: I0219 15:14:14.683906 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 19 15:14:14 crc kubenswrapper[4810]: I0219 15:14:14.732198 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 19 15:14:14 crc kubenswrapper[4810]: I0219 15:14:14.739641 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 19 15:14:15 crc kubenswrapper[4810]: I0219 15:14:15.310012 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 19 15:14:15 crc kubenswrapper[4810]: I0219 15:14:15.508637 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 19 15:14:16 crc kubenswrapper[4810]: I0219 15:14:16.694026 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 19 15:14:16 crc kubenswrapper[4810]: I0219 15:14:16.694106 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 15:14:16 crc kubenswrapper[4810]: I0219 15:14:16.836411 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 15:14:16 crc kubenswrapper[4810]: I0219 15:14:16.836461 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 15:14:16 crc kubenswrapper[4810]: I0219 15:14:16.836480 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 15:14:16 crc kubenswrapper[4810]: I0219 15:14:16.836560 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 15:14:16 crc kubenswrapper[4810]: I0219 15:14:16.836551 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:14:16 crc kubenswrapper[4810]: I0219 15:14:16.836606 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:14:16 crc kubenswrapper[4810]: I0219 15:14:16.836643 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:14:16 crc kubenswrapper[4810]: I0219 15:14:16.836629 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 15:14:16 crc kubenswrapper[4810]: I0219 15:14:16.836692 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:14:16 crc kubenswrapper[4810]: I0219 15:14:16.837233 4810 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 19 15:14:16 crc kubenswrapper[4810]: I0219 15:14:16.837271 4810 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 19 15:14:16 crc kubenswrapper[4810]: I0219 15:14:16.837294 4810 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 19 15:14:16 crc kubenswrapper[4810]: I0219 15:14:16.837312 4810 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 19 15:14:16 crc kubenswrapper[4810]: I0219 15:14:16.849572 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:14:16 crc kubenswrapper[4810]: I0219 15:14:16.938663 4810 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 19 15:14:17 crc kubenswrapper[4810]: I0219 15:14:17.039570 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 19 15:14:17 crc kubenswrapper[4810]: I0219 15:14:17.039641 4810 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="a2157c6845332b692ab9ff67fd6e2eb4fc93b35babe6d2954b26b7839c430dbe" exitCode=137 Feb 19 15:14:17 crc kubenswrapper[4810]: I0219 15:14:17.039692 4810 scope.go:117] "RemoveContainer" containerID="a2157c6845332b692ab9ff67fd6e2eb4fc93b35babe6d2954b26b7839c430dbe" Feb 19 15:14:17 crc kubenswrapper[4810]: I0219 15:14:17.039751 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 15:14:17 crc kubenswrapper[4810]: I0219 15:14:17.059996 4810 scope.go:117] "RemoveContainer" containerID="a2157c6845332b692ab9ff67fd6e2eb4fc93b35babe6d2954b26b7839c430dbe" Feb 19 15:14:17 crc kubenswrapper[4810]: E0219 15:14:17.060477 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2157c6845332b692ab9ff67fd6e2eb4fc93b35babe6d2954b26b7839c430dbe\": container with ID starting with a2157c6845332b692ab9ff67fd6e2eb4fc93b35babe6d2954b26b7839c430dbe not found: ID does not exist" containerID="a2157c6845332b692ab9ff67fd6e2eb4fc93b35babe6d2954b26b7839c430dbe" Feb 19 15:14:17 crc kubenswrapper[4810]: I0219 15:14:17.060517 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2157c6845332b692ab9ff67fd6e2eb4fc93b35babe6d2954b26b7839c430dbe"} err="failed to get container status \"a2157c6845332b692ab9ff67fd6e2eb4fc93b35babe6d2954b26b7839c430dbe\": rpc error: code = NotFound desc = could not find container \"a2157c6845332b692ab9ff67fd6e2eb4fc93b35babe6d2954b26b7839c430dbe\": container with ID starting with a2157c6845332b692ab9ff67fd6e2eb4fc93b35babe6d2954b26b7839c430dbe not found: ID does not exist" Feb 19 15:14:17 crc kubenswrapper[4810]: I0219 15:14:17.078223 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 19 15:14:17 crc kubenswrapper[4810]: I0219 15:14:17.451962 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 19 15:14:31 crc kubenswrapper[4810]: I0219 15:14:31.158988 4810 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 19 15:14:32 crc kubenswrapper[4810]: I0219 15:14:32.140126 4810 generic.go:334] "Generic (PLEG): container finished" podID="b8b1faba-e1b8-436c-aa84-ae4353c5f0a9" containerID="20e10c5e198ddee65262d0528102f59529d30ad7bead31c5ab5c764fe94b9de0" exitCode=0 Feb 19 15:14:32 crc kubenswrapper[4810]: I0219 15:14:32.140169 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r2tqm" event={"ID":"b8b1faba-e1b8-436c-aa84-ae4353c5f0a9","Type":"ContainerDied","Data":"20e10c5e198ddee65262d0528102f59529d30ad7bead31c5ab5c764fe94b9de0"} Feb 19 15:14:32 crc kubenswrapper[4810]: I0219 15:14:32.140622 4810 scope.go:117] "RemoveContainer" containerID="20e10c5e198ddee65262d0528102f59529d30ad7bead31c5ab5c764fe94b9de0" Feb 19 15:14:33 crc kubenswrapper[4810]: I0219 15:14:33.151432 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r2tqm" event={"ID":"b8b1faba-e1b8-436c-aa84-ae4353c5f0a9","Type":"ContainerStarted","Data":"18198b3c1f0a5d1d83b301a87f12dccc9fde533f7e0587694e992331246f2d37"} Feb 19 15:14:33 crc kubenswrapper[4810]: I0219 15:14:33.152971 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-r2tqm" Feb 19 15:14:33 crc kubenswrapper[4810]: I0219 15:14:33.155934 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-r2tqm" Feb 19 15:15:00 crc kubenswrapper[4810]: I0219 15:15:00.169458 4810 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29525235-vk2jg"] Feb 19 15:15:00 crc kubenswrapper[4810]: E0219 15:15:00.170228 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 19 15:15:00 crc kubenswrapper[4810]: I0219 15:15:00.170243 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 19 15:15:00 crc kubenswrapper[4810]: I0219 15:15:00.170377 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 19 15:15:00 crc kubenswrapper[4810]: I0219 15:15:00.170811 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525235-vk2jg" Feb 19 15:15:00 crc kubenswrapper[4810]: I0219 15:15:00.172887 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 15:15:00 crc kubenswrapper[4810]: I0219 15:15:00.173674 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 15:15:00 crc kubenswrapper[4810]: I0219 15:15:00.180668 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525235-vk2jg"] Feb 19 15:15:00 crc kubenswrapper[4810]: I0219 15:15:00.325840 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f3fa539-f490-4b25-b592-d199cc757b8a-config-volume\") pod \"collect-profiles-29525235-vk2jg\" (UID: \"2f3fa539-f490-4b25-b592-d199cc757b8a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525235-vk2jg" Feb 19 15:15:00 crc kubenswrapper[4810]: I0219 15:15:00.326065 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncgrd\" (UniqueName: \"kubernetes.io/projected/2f3fa539-f490-4b25-b592-d199cc757b8a-kube-api-access-ncgrd\") pod \"collect-profiles-29525235-vk2jg\" (UID: \"2f3fa539-f490-4b25-b592-d199cc757b8a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525235-vk2jg" Feb 19 15:15:00 crc kubenswrapper[4810]: I0219 15:15:00.326127 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2f3fa539-f490-4b25-b592-d199cc757b8a-secret-volume\") pod \"collect-profiles-29525235-vk2jg\" (UID: \"2f3fa539-f490-4b25-b592-d199cc757b8a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525235-vk2jg" Feb 19 15:15:00 crc kubenswrapper[4810]: I0219 15:15:00.428214 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f3fa539-f490-4b25-b592-d199cc757b8a-config-volume\") pod \"collect-profiles-29525235-vk2jg\" (UID: \"2f3fa539-f490-4b25-b592-d199cc757b8a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525235-vk2jg" Feb 19 15:15:00 crc kubenswrapper[4810]: I0219 15:15:00.428697 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncgrd\" (UniqueName: \"kubernetes.io/projected/2f3fa539-f490-4b25-b592-d199cc757b8a-kube-api-access-ncgrd\") pod \"collect-profiles-29525235-vk2jg\" (UID: 
\"2f3fa539-f490-4b25-b592-d199cc757b8a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525235-vk2jg" Feb 19 15:15:00 crc kubenswrapper[4810]: I0219 15:15:00.428772 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2f3fa539-f490-4b25-b592-d199cc757b8a-secret-volume\") pod \"collect-profiles-29525235-vk2jg\" (UID: \"2f3fa539-f490-4b25-b592-d199cc757b8a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525235-vk2jg" Feb 19 15:15:00 crc kubenswrapper[4810]: I0219 15:15:00.429864 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f3fa539-f490-4b25-b592-d199cc757b8a-config-volume\") pod \"collect-profiles-29525235-vk2jg\" (UID: \"2f3fa539-f490-4b25-b592-d199cc757b8a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525235-vk2jg" Feb 19 15:15:00 crc kubenswrapper[4810]: I0219 15:15:00.435078 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2f3fa539-f490-4b25-b592-d199cc757b8a-secret-volume\") pod \"collect-profiles-29525235-vk2jg\" (UID: \"2f3fa539-f490-4b25-b592-d199cc757b8a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525235-vk2jg" Feb 19 15:15:00 crc kubenswrapper[4810]: I0219 15:15:00.450089 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncgrd\" (UniqueName: \"kubernetes.io/projected/2f3fa539-f490-4b25-b592-d199cc757b8a-kube-api-access-ncgrd\") pod \"collect-profiles-29525235-vk2jg\" (UID: \"2f3fa539-f490-4b25-b592-d199cc757b8a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525235-vk2jg" Feb 19 15:15:00 crc kubenswrapper[4810]: I0219 15:15:00.486927 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525235-vk2jg" Feb 19 15:15:00 crc kubenswrapper[4810]: I0219 15:15:00.984175 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525235-vk2jg"] Feb 19 15:15:00 crc kubenswrapper[4810]: W0219 15:15:00.992569 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f3fa539_f490_4b25_b592_d199cc757b8a.slice/crio-7a4255db1170dcd73c8818f18b999b4830ae54d545bb0ede34e81f9454503447 WatchSource:0}: Error finding container 7a4255db1170dcd73c8818f18b999b4830ae54d545bb0ede34e81f9454503447: Status 404 returned error can't find the container with id 7a4255db1170dcd73c8818f18b999b4830ae54d545bb0ede34e81f9454503447 Feb 19 15:15:01 crc kubenswrapper[4810]: I0219 15:15:01.335430 4810 generic.go:334] "Generic (PLEG): container finished" podID="2f3fa539-f490-4b25-b592-d199cc757b8a" containerID="88bb50e9f73c36f14be470f12c634b9e24a853c43e328fa253f758c7a485a40a" exitCode=0 Feb 19 15:15:01 crc kubenswrapper[4810]: I0219 15:15:01.335519 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525235-vk2jg" event={"ID":"2f3fa539-f490-4b25-b592-d199cc757b8a","Type":"ContainerDied","Data":"88bb50e9f73c36f14be470f12c634b9e24a853c43e328fa253f758c7a485a40a"} Feb 19 15:15:01 crc kubenswrapper[4810]: I0219 15:15:01.335556 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525235-vk2jg" event={"ID":"2f3fa539-f490-4b25-b592-d199cc757b8a","Type":"ContainerStarted","Data":"7a4255db1170dcd73c8818f18b999b4830ae54d545bb0ede34e81f9454503447"} Feb 19 15:15:02 crc kubenswrapper[4810]: I0219 15:15:02.687709 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525235-vk2jg" Feb 19 15:15:02 crc kubenswrapper[4810]: I0219 15:15:02.872762 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncgrd\" (UniqueName: \"kubernetes.io/projected/2f3fa539-f490-4b25-b592-d199cc757b8a-kube-api-access-ncgrd\") pod \"2f3fa539-f490-4b25-b592-d199cc757b8a\" (UID: \"2f3fa539-f490-4b25-b592-d199cc757b8a\") " Feb 19 15:15:02 crc kubenswrapper[4810]: I0219 15:15:02.872979 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f3fa539-f490-4b25-b592-d199cc757b8a-config-volume\") pod \"2f3fa539-f490-4b25-b592-d199cc757b8a\" (UID: \"2f3fa539-f490-4b25-b592-d199cc757b8a\") " Feb 19 15:15:02 crc kubenswrapper[4810]: I0219 15:15:02.873030 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2f3fa539-f490-4b25-b592-d199cc757b8a-secret-volume\") pod \"2f3fa539-f490-4b25-b592-d199cc757b8a\" (UID: \"2f3fa539-f490-4b25-b592-d199cc757b8a\") " Feb 19 15:15:02 crc kubenswrapper[4810]: I0219 15:15:02.874218 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f3fa539-f490-4b25-b592-d199cc757b8a-config-volume" (OuterVolumeSpecName: "config-volume") pod "2f3fa539-f490-4b25-b592-d199cc757b8a" (UID: "2f3fa539-f490-4b25-b592-d199cc757b8a"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:15:02 crc kubenswrapper[4810]: I0219 15:15:02.880960 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f3fa539-f490-4b25-b592-d199cc757b8a-kube-api-access-ncgrd" (OuterVolumeSpecName: "kube-api-access-ncgrd") pod "2f3fa539-f490-4b25-b592-d199cc757b8a" (UID: "2f3fa539-f490-4b25-b592-d199cc757b8a"). InnerVolumeSpecName "kube-api-access-ncgrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:15:02 crc kubenswrapper[4810]: I0219 15:15:02.881899 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f3fa539-f490-4b25-b592-d199cc757b8a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2f3fa539-f490-4b25-b592-d199cc757b8a" (UID: "2f3fa539-f490-4b25-b592-d199cc757b8a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:15:02 crc kubenswrapper[4810]: I0219 15:15:02.974496 4810 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f3fa539-f490-4b25-b592-d199cc757b8a-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 15:15:02 crc kubenswrapper[4810]: I0219 15:15:02.974558 4810 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2f3fa539-f490-4b25-b592-d199cc757b8a-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 15:15:02 crc kubenswrapper[4810]: I0219 15:15:02.974584 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncgrd\" (UniqueName: \"kubernetes.io/projected/2f3fa539-f490-4b25-b592-d199cc757b8a-kube-api-access-ncgrd\") on node \"crc\" DevicePath \"\"" Feb 19 15:15:03 crc kubenswrapper[4810]: I0219 15:15:03.351077 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525235-vk2jg" event={"ID":"2f3fa539-f490-4b25-b592-d199cc757b8a","Type":"ContainerDied","Data":"7a4255db1170dcd73c8818f18b999b4830ae54d545bb0ede34e81f9454503447"} Feb 19 15:15:03 crc kubenswrapper[4810]: I0219 15:15:03.351129 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a4255db1170dcd73c8818f18b999b4830ae54d545bb0ede34e81f9454503447" Feb 19 15:15:03 crc kubenswrapper[4810]: I0219 15:15:03.351188 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525235-vk2jg" Feb 19 15:15:19 crc kubenswrapper[4810]: I0219 15:15:19.537656 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:15:19 crc kubenswrapper[4810]: I0219 15:15:19.538465 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:15:30 crc kubenswrapper[4810]: I0219 15:15:30.856448 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-59jrk"] Feb 19 15:15:30 crc kubenswrapper[4810]: E0219 15:15:30.857339 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f3fa539-f490-4b25-b592-d199cc757b8a" containerName="collect-profiles" Feb 19 15:15:30 crc kubenswrapper[4810]: I0219 15:15:30.857356 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f3fa539-f490-4b25-b592-d199cc757b8a" containerName="collect-profiles" Feb 19 15:15:30 crc kubenswrapper[4810]: I0219 15:15:30.857509 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f3fa539-f490-4b25-b592-d199cc757b8a" containerName="collect-profiles" Feb 19 15:15:30 crc kubenswrapper[4810]: I0219 15:15:30.858045 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" Feb 19 15:15:30 crc kubenswrapper[4810]: I0219 15:15:30.878410 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-59jrk"] Feb 19 15:15:30 crc kubenswrapper[4810]: I0219 15:15:30.977543 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/38318350-42a7-4d65-93b7-e11dc71e3cb6-registry-certificates\") pod \"image-registry-66df7c8f76-59jrk\" (UID: \"38318350-42a7-4d65-93b7-e11dc71e3cb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" Feb 19 15:15:30 crc kubenswrapper[4810]: I0219 15:15:30.978292 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-59jrk\" (UID: \"38318350-42a7-4d65-93b7-e11dc71e3cb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" Feb 19 15:15:30 crc kubenswrapper[4810]: I0219 15:15:30.978418 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/38318350-42a7-4d65-93b7-e11dc71e3cb6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-59jrk\" (UID: \"38318350-42a7-4d65-93b7-e11dc71e3cb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" Feb 19 15:15:30 crc kubenswrapper[4810]: I0219 15:15:30.978583 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/38318350-42a7-4d65-93b7-e11dc71e3cb6-registry-tls\") pod \"image-registry-66df7c8f76-59jrk\" (UID: \"38318350-42a7-4d65-93b7-e11dc71e3cb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" Feb 19 15:15:30 crc kubenswrapper[4810]: I0219 15:15:30.978697 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/38318350-42a7-4d65-93b7-e11dc71e3cb6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-59jrk\" (UID: \"38318350-42a7-4d65-93b7-e11dc71e3cb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" Feb 19 15:15:30 crc kubenswrapper[4810]: I0219 15:15:30.978843 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52jjq\" (UniqueName: \"kubernetes.io/projected/38318350-42a7-4d65-93b7-e11dc71e3cb6-kube-api-access-52jjq\") pod \"image-registry-66df7c8f76-59jrk\" (UID: \"38318350-42a7-4d65-93b7-e11dc71e3cb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" Feb 19 15:15:30 crc kubenswrapper[4810]: I0219 15:15:30.978918 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38318350-42a7-4d65-93b7-e11dc71e3cb6-bound-sa-token\") pod \"image-registry-66df7c8f76-59jrk\" (UID: \"38318350-42a7-4d65-93b7-e11dc71e3cb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" Feb 19 15:15:30 crc kubenswrapper[4810]: I0219 15:15:30.978943 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38318350-42a7-4d65-93b7-e11dc71e3cb6-trusted-ca\") pod \"image-registry-66df7c8f76-59jrk\" (UID: \"38318350-42a7-4d65-93b7-e11dc71e3cb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" Feb 19 15:15:31 crc kubenswrapper[4810]: I0219 15:15:31.005991 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-59jrk\" (UID: \"38318350-42a7-4d65-93b7-e11dc71e3cb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" Feb 19 15:15:31 crc kubenswrapper[4810]: I0219 15:15:31.080121 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/38318350-42a7-4d65-93b7-e11dc71e3cb6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-59jrk\" (UID: \"38318350-42a7-4d65-93b7-e11dc71e3cb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" Feb 19 15:15:31 crc kubenswrapper[4810]: I0219 15:15:31.080222 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/38318350-42a7-4d65-93b7-e11dc71e3cb6-registry-tls\") pod \"image-registry-66df7c8f76-59jrk\" (UID: \"38318350-42a7-4d65-93b7-e11dc71e3cb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" Feb 19 15:15:31 crc kubenswrapper[4810]: I0219 15:15:31.080259 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/38318350-42a7-4d65-93b7-e11dc71e3cb6-installation-pull-secrets\") pod 
\"image-registry-66df7c8f76-59jrk\" (UID: \"38318350-42a7-4d65-93b7-e11dc71e3cb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" Feb 19 15:15:31 crc kubenswrapper[4810]: I0219 15:15:31.080296 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52jjq\" (UniqueName: \"kubernetes.io/projected/38318350-42a7-4d65-93b7-e11dc71e3cb6-kube-api-access-52jjq\") pod \"image-registry-66df7c8f76-59jrk\" (UID: \"38318350-42a7-4d65-93b7-e11dc71e3cb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" Feb 19 15:15:31 crc kubenswrapper[4810]: I0219 15:15:31.080344 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38318350-42a7-4d65-93b7-e11dc71e3cb6-bound-sa-token\") pod \"image-registry-66df7c8f76-59jrk\" (UID: \"38318350-42a7-4d65-93b7-e11dc71e3cb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" Feb 19 15:15:31 crc kubenswrapper[4810]: I0219 15:15:31.080366 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38318350-42a7-4d65-93b7-e11dc71e3cb6-trusted-ca\") pod \"image-registry-66df7c8f76-59jrk\" (UID: \"38318350-42a7-4d65-93b7-e11dc71e3cb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" Feb 19 15:15:31 crc kubenswrapper[4810]: I0219 15:15:31.080396 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/38318350-42a7-4d65-93b7-e11dc71e3cb6-registry-certificates\") pod \"image-registry-66df7c8f76-59jrk\" (UID: \"38318350-42a7-4d65-93b7-e11dc71e3cb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" Feb 19 15:15:31 crc kubenswrapper[4810]: I0219 15:15:31.080849 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/38318350-42a7-4d65-93b7-e11dc71e3cb6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-59jrk\" (UID: \"38318350-42a7-4d65-93b7-e11dc71e3cb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" Feb 19 15:15:31 crc kubenswrapper[4810]: I0219 15:15:31.082133 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/38318350-42a7-4d65-93b7-e11dc71e3cb6-registry-certificates\") pod \"image-registry-66df7c8f76-59jrk\" (UID: \"38318350-42a7-4d65-93b7-e11dc71e3cb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" Feb 19 15:15:31 crc kubenswrapper[4810]: I0219 15:15:31.083525 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38318350-42a7-4d65-93b7-e11dc71e3cb6-trusted-ca\") pod \"image-registry-66df7c8f76-59jrk\" (UID: \"38318350-42a7-4d65-93b7-e11dc71e3cb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" Feb 19 15:15:31 crc kubenswrapper[4810]: I0219 15:15:31.088213 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/38318350-42a7-4d65-93b7-e11dc71e3cb6-registry-tls\") pod \"image-registry-66df7c8f76-59jrk\" (UID: \"38318350-42a7-4d65-93b7-e11dc71e3cb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" Feb 19 15:15:31 crc kubenswrapper[4810]: I0219 15:15:31.089030 4810 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/38318350-42a7-4d65-93b7-e11dc71e3cb6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-59jrk\" (UID: \"38318350-42a7-4d65-93b7-e11dc71e3cb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" Feb 19 15:15:31 crc kubenswrapper[4810]: I0219 15:15:31.096989 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52jjq\" (UniqueName: \"kubernetes.io/projected/38318350-42a7-4d65-93b7-e11dc71e3cb6-kube-api-access-52jjq\") pod \"image-registry-66df7c8f76-59jrk\" (UID: \"38318350-42a7-4d65-93b7-e11dc71e3cb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" Feb 19 15:15:31 crc kubenswrapper[4810]: I0219 15:15:31.120006 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38318350-42a7-4d65-93b7-e11dc71e3cb6-bound-sa-token\") pod \"image-registry-66df7c8f76-59jrk\" (UID: \"38318350-42a7-4d65-93b7-e11dc71e3cb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" Feb 19 15:15:31 crc kubenswrapper[4810]: I0219 15:15:31.184738 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" Feb 19 15:15:31 crc kubenswrapper[4810]: I0219 15:15:31.665390 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-59jrk"] Feb 19 15:15:32 crc kubenswrapper[4810]: I0219 15:15:32.550468 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" event={"ID":"38318350-42a7-4d65-93b7-e11dc71e3cb6","Type":"ContainerStarted","Data":"ccaaf0572405ab2cfd8dd4fd5bc4d293a41400947505fd08fbf0770c54aebc23"} Feb 19 15:15:32 crc kubenswrapper[4810]: I0219 15:15:32.550811 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" event={"ID":"38318350-42a7-4d65-93b7-e11dc71e3cb6","Type":"ContainerStarted","Data":"3f5d01333689d1471f7256a087f02b21acf274ca7d33bc2fad1d60b86b76b63c"} Feb 19 15:15:32 crc kubenswrapper[4810]: I0219 15:15:32.550834 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" Feb 19 15:15:32 crc kubenswrapper[4810]: I0219 15:15:32.569874 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" podStartSLOduration=2.5698534779999997 podStartE2EDuration="2.569853478s" podCreationTimestamp="2026-02-19 15:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:15:32.565848537 +0000 UTC m=+362.047878661" watchObservedRunningTime="2026-02-19 15:15:32.569853478 +0000 UTC m=+362.051883602" Feb 19 15:15:49 crc kubenswrapper[4810]: I0219 15:15:49.537813 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:15:49 crc kubenswrapper[4810]: I0219 15:15:49.538784 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:15:51 crc kubenswrapper[4810]: I0219 15:15:51.194317 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-59jrk" Feb 19 15:15:51 crc kubenswrapper[4810]: I0219 15:15:51.252461 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7bnq2"] Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.425368 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rk4vw"] Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.426677 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rk4vw" podUID="3bf3315d-3d2f-4aeb-b925-c3832e102e85" containerName="registry-server" containerID="cri-o://077475bcb0b82366e9846158038585fb73fc53528d8e624159315394b6489745" gracePeriod=30 Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.459864 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d5ks5"] Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.459929 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r2tqm"] Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.460233 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-r2tqm" podUID="b8b1faba-e1b8-436c-aa84-ae4353c5f0a9" containerName="marketplace-operator" containerID="cri-o://18198b3c1f0a5d1d83b301a87f12dccc9fde533f7e0587694e992331246f2d37" gracePeriod=30 Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.460992 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d5ks5" podUID="9a3d6b1f-2011-4f7f-bea0-1d303007fe41" containerName="registry-server" containerID="cri-o://6d399395ed59b272e5ce80489e65062cb3d89e0840588f5165efb67b6a223708" gracePeriod=30 Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.469184 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptbh9"] Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.469615 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ptbh9" podUID="7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53" containerName="registry-server" containerID="cri-o://3c4b22d47de57aea7d57602a01f5293afd50a4c1301de10fd9d0be527b9184bd" gracePeriod=30 Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.486258 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sm9wk"] Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.487376 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sm9wk" Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.500133 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gp8sg"] Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.500535 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gp8sg" podUID="3146bc9a-c4fc-4aa1-acae-032db4aa0582" containerName="registry-server" containerID="cri-o://f7e09ee181e03814c9bffa4e23b2c1f95c88075e138b0f5ea2eafb7e14efd0cc" gracePeriod=30 Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.512768 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sm9wk"] Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.646506 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/41d27e40-a89e-4fd6-8106-824c5a257f25-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sm9wk\" (UID: \"41d27e40-a89e-4fd6-8106-824c5a257f25\") " pod="openshift-marketplace/marketplace-operator-79b997595-sm9wk" Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.646953 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrjlm\" (UniqueName: \"kubernetes.io/projected/41d27e40-a89e-4fd6-8106-824c5a257f25-kube-api-access-wrjlm\") pod \"marketplace-operator-79b997595-sm9wk\" (UID: \"41d27e40-a89e-4fd6-8106-824c5a257f25\") " pod="openshift-marketplace/marketplace-operator-79b997595-sm9wk" Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.646988 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/41d27e40-a89e-4fd6-8106-824c5a257f25-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sm9wk\" (UID: \"41d27e40-a89e-4fd6-8106-824c5a257f25\") " pod="openshift-marketplace/marketplace-operator-79b997595-sm9wk" Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.700080 4810 generic.go:334] "Generic (PLEG): container finished" podID="9a3d6b1f-2011-4f7f-bea0-1d303007fe41" containerID="6d399395ed59b272e5ce80489e65062cb3d89e0840588f5165efb67b6a223708" exitCode=0 Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.700163 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5ks5" event={"ID":"9a3d6b1f-2011-4f7f-bea0-1d303007fe41","Type":"ContainerDied","Data":"6d399395ed59b272e5ce80489e65062cb3d89e0840588f5165efb67b6a223708"} Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.702251 4810 generic.go:334] "Generic (PLEG): container finished" podID="7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53" containerID="3c4b22d47de57aea7d57602a01f5293afd50a4c1301de10fd9d0be527b9184bd" exitCode=0 Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.702337 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptbh9" event={"ID":"7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53","Type":"ContainerDied","Data":"3c4b22d47de57aea7d57602a01f5293afd50a4c1301de10fd9d0be527b9184bd"} Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.712410 4810 generic.go:334] "Generic (PLEG): container finished" podID="3bf3315d-3d2f-4aeb-b925-c3832e102e85" 
containerID="077475bcb0b82366e9846158038585fb73fc53528d8e624159315394b6489745" exitCode=0 Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.712485 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rk4vw" event={"ID":"3bf3315d-3d2f-4aeb-b925-c3832e102e85","Type":"ContainerDied","Data":"077475bcb0b82366e9846158038585fb73fc53528d8e624159315394b6489745"} Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.715752 4810 generic.go:334] "Generic (PLEG): container finished" podID="3146bc9a-c4fc-4aa1-acae-032db4aa0582" containerID="f7e09ee181e03814c9bffa4e23b2c1f95c88075e138b0f5ea2eafb7e14efd0cc" exitCode=0 Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.715877 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp8sg" event={"ID":"3146bc9a-c4fc-4aa1-acae-032db4aa0582","Type":"ContainerDied","Data":"f7e09ee181e03814c9bffa4e23b2c1f95c88075e138b0f5ea2eafb7e14efd0cc"} Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.719390 4810 generic.go:334] "Generic (PLEG): container finished" podID="b8b1faba-e1b8-436c-aa84-ae4353c5f0a9" containerID="18198b3c1f0a5d1d83b301a87f12dccc9fde533f7e0587694e992331246f2d37" exitCode=0 Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.719475 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r2tqm" event={"ID":"b8b1faba-e1b8-436c-aa84-ae4353c5f0a9","Type":"ContainerDied","Data":"18198b3c1f0a5d1d83b301a87f12dccc9fde533f7e0587694e992331246f2d37"} Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.719550 4810 scope.go:117] "RemoveContainer" containerID="20e10c5e198ddee65262d0528102f59529d30ad7bead31c5ab5c764fe94b9de0" Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.749559 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/41d27e40-a89e-4fd6-8106-824c5a257f25-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sm9wk\" (UID: \"41d27e40-a89e-4fd6-8106-824c5a257f25\") " pod="openshift-marketplace/marketplace-operator-79b997595-sm9wk" Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.749614 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrjlm\" (UniqueName: \"kubernetes.io/projected/41d27e40-a89e-4fd6-8106-824c5a257f25-kube-api-access-wrjlm\") pod \"marketplace-operator-79b997595-sm9wk\" (UID: \"41d27e40-a89e-4fd6-8106-824c5a257f25\") " pod="openshift-marketplace/marketplace-operator-79b997595-sm9wk" Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.749644 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/41d27e40-a89e-4fd6-8106-824c5a257f25-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sm9wk\" (UID: \"41d27e40-a89e-4fd6-8106-824c5a257f25\") " pod="openshift-marketplace/marketplace-operator-79b997595-sm9wk" Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.751200 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/41d27e40-a89e-4fd6-8106-824c5a257f25-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sm9wk\" (UID: \"41d27e40-a89e-4fd6-8106-824c5a257f25\") " pod="openshift-marketplace/marketplace-operator-79b997595-sm9wk" Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.764240 4810 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/41d27e40-a89e-4fd6-8106-824c5a257f25-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sm9wk\" (UID: \"41d27e40-a89e-4fd6-8106-824c5a257f25\") " pod="openshift-marketplace/marketplace-operator-79b997595-sm9wk" Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.775657 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrjlm\" (UniqueName: \"kubernetes.io/projected/41d27e40-a89e-4fd6-8106-824c5a257f25-kube-api-access-wrjlm\") pod \"marketplace-operator-79b997595-sm9wk\" (UID: \"41d27e40-a89e-4fd6-8106-824c5a257f25\") " pod="openshift-marketplace/marketplace-operator-79b997595-sm9wk" Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.931390 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sm9wk" Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.935713 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rk4vw" Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.947907 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ptbh9" Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.962940 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d5ks5" Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.969808 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gp8sg" Feb 19 15:15:53 crc kubenswrapper[4810]: I0219 15:15:53.987421 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r2tqm" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.053752 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqz6g\" (UniqueName: \"kubernetes.io/projected/9a3d6b1f-2011-4f7f-bea0-1d303007fe41-kube-api-access-cqz6g\") pod \"9a3d6b1f-2011-4f7f-bea0-1d303007fe41\" (UID: \"9a3d6b1f-2011-4f7f-bea0-1d303007fe41\") " Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.053984 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53-utilities\") pod \"7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53\" (UID: \"7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53\") " Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.054011 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53-catalog-content\") pod \"7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53\" (UID: \"7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53\") " Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.054043 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a3d6b1f-2011-4f7f-bea0-1d303007fe41-catalog-content\") pod \"9a3d6b1f-2011-4f7f-bea0-1d303007fe41\" (UID: \"9a3d6b1f-2011-4f7f-bea0-1d303007fe41\") " Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.054073 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a3d6b1f-2011-4f7f-bea0-1d303007fe41-utilities\") pod \"9a3d6b1f-2011-4f7f-bea0-1d303007fe41\" (UID: \"9a3d6b1f-2011-4f7f-bea0-1d303007fe41\") " Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.054097 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bf3315d-3d2f-4aeb-b925-c3832e102e85-utilities\") pod \"3bf3315d-3d2f-4aeb-b925-c3832e102e85\" (UID: \"3bf3315d-3d2f-4aeb-b925-c3832e102e85\") " Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.054115 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqzns\" (UniqueName: \"kubernetes.io/projected/3bf3315d-3d2f-4aeb-b925-c3832e102e85-kube-api-access-fqzns\") pod \"3bf3315d-3d2f-4aeb-b925-c3832e102e85\" (UID: \"3bf3315d-3d2f-4aeb-b925-c3832e102e85\") " Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.054166 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbxf5\" (UniqueName: \"kubernetes.io/projected/7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53-kube-api-access-gbxf5\") pod \"7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53\" (UID: \"7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53\") " Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.054206 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bf3315d-3d2f-4aeb-b925-c3832e102e85-catalog-content\") pod \"3bf3315d-3d2f-4aeb-b925-c3832e102e85\" (UID: \"3bf3315d-3d2f-4aeb-b925-c3832e102e85\") " Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.054248 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3146bc9a-c4fc-4aa1-acae-032db4aa0582-utilities\") pod 
\"3146bc9a-c4fc-4aa1-acae-032db4aa0582\" (UID: \"3146bc9a-c4fc-4aa1-acae-032db4aa0582\") " Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.054280 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3146bc9a-c4fc-4aa1-acae-032db4aa0582-catalog-content\") pod \"3146bc9a-c4fc-4aa1-acae-032db4aa0582\" (UID: \"3146bc9a-c4fc-4aa1-acae-032db4aa0582\") " Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.054302 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b45t8\" (UniqueName: \"kubernetes.io/projected/3146bc9a-c4fc-4aa1-acae-032db4aa0582-kube-api-access-b45t8\") pod \"3146bc9a-c4fc-4aa1-acae-032db4aa0582\" (UID: \"3146bc9a-c4fc-4aa1-acae-032db4aa0582\") " Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.056513 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53-utilities" (OuterVolumeSpecName: "utilities") pod "7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53" (UID: "7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.056825 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3146bc9a-c4fc-4aa1-acae-032db4aa0582-utilities" (OuterVolumeSpecName: "utilities") pod "3146bc9a-c4fc-4aa1-acae-032db4aa0582" (UID: "3146bc9a-c4fc-4aa1-acae-032db4aa0582"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.057522 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bf3315d-3d2f-4aeb-b925-c3832e102e85-utilities" (OuterVolumeSpecName: "utilities") pod "3bf3315d-3d2f-4aeb-b925-c3832e102e85" (UID: "3bf3315d-3d2f-4aeb-b925-c3832e102e85"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.059262 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53-kube-api-access-gbxf5" (OuterVolumeSpecName: "kube-api-access-gbxf5") pod "7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53" (UID: "7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53"). InnerVolumeSpecName "kube-api-access-gbxf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.062132 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a3d6b1f-2011-4f7f-bea0-1d303007fe41-utilities" (OuterVolumeSpecName: "utilities") pod "9a3d6b1f-2011-4f7f-bea0-1d303007fe41" (UID: "9a3d6b1f-2011-4f7f-bea0-1d303007fe41"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.064108 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bf3315d-3d2f-4aeb-b925-c3832e102e85-kube-api-access-fqzns" (OuterVolumeSpecName: "kube-api-access-fqzns") pod "3bf3315d-3d2f-4aeb-b925-c3832e102e85" (UID: "3bf3315d-3d2f-4aeb-b925-c3832e102e85"). InnerVolumeSpecName "kube-api-access-fqzns". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.065549 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3146bc9a-c4fc-4aa1-acae-032db4aa0582-kube-api-access-b45t8" (OuterVolumeSpecName: "kube-api-access-b45t8") pod "3146bc9a-c4fc-4aa1-acae-032db4aa0582" (UID: "3146bc9a-c4fc-4aa1-acae-032db4aa0582"). InnerVolumeSpecName "kube-api-access-b45t8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.067593 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a3d6b1f-2011-4f7f-bea0-1d303007fe41-kube-api-access-cqz6g" (OuterVolumeSpecName: "kube-api-access-cqz6g") pod "9a3d6b1f-2011-4f7f-bea0-1d303007fe41" (UID: "9a3d6b1f-2011-4f7f-bea0-1d303007fe41"). InnerVolumeSpecName "kube-api-access-cqz6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.083705 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53" (UID: "7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.119598 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a3d6b1f-2011-4f7f-bea0-1d303007fe41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a3d6b1f-2011-4f7f-bea0-1d303007fe41" (UID: "9a3d6b1f-2011-4f7f-bea0-1d303007fe41"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.128978 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bf3315d-3d2f-4aeb-b925-c3832e102e85-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3bf3315d-3d2f-4aeb-b925-c3832e102e85" (UID: "3bf3315d-3d2f-4aeb-b925-c3832e102e85"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.160558 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8b1faba-e1b8-436c-aa84-ae4353c5f0a9-marketplace-trusted-ca\") pod \"b8b1faba-e1b8-436c-aa84-ae4353c5f0a9\" (UID: \"b8b1faba-e1b8-436c-aa84-ae4353c5f0a9\") " Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.160749 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b8b1faba-e1b8-436c-aa84-ae4353c5f0a9-marketplace-operator-metrics\") pod \"b8b1faba-e1b8-436c-aa84-ae4353c5f0a9\" (UID: \"b8b1faba-e1b8-436c-aa84-ae4353c5f0a9\") " Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.160785 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9vz7\" (UniqueName: \"kubernetes.io/projected/b8b1faba-e1b8-436c-aa84-ae4353c5f0a9-kube-api-access-q9vz7\") pod \"b8b1faba-e1b8-436c-aa84-ae4353c5f0a9\" (UID: \"b8b1faba-e1b8-436c-aa84-ae4353c5f0a9\") " Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.161093 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbxf5\" (UniqueName: \"kubernetes.io/projected/7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53-kube-api-access-gbxf5\") on node \"crc\" DevicePath \"\"" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.161124 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bf3315d-3d2f-4aeb-b925-c3832e102e85-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.161139 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3146bc9a-c4fc-4aa1-acae-032db4aa0582-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.161152 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b45t8\" (UniqueName: \"kubernetes.io/projected/3146bc9a-c4fc-4aa1-acae-032db4aa0582-kube-api-access-b45t8\") on node \"crc\" DevicePath \"\"" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.161165 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqz6g\" (UniqueName: \"kubernetes.io/projected/9a3d6b1f-2011-4f7f-bea0-1d303007fe41-kube-api-access-cqz6g\") on node \"crc\" DevicePath \"\"" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.161177 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.161188 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.161199 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a3d6b1f-2011-4f7f-bea0-1d303007fe41-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.161211 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9a3d6b1f-2011-4f7f-bea0-1d303007fe41-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.161223 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bf3315d-3d2f-4aeb-b925-c3832e102e85-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.161236 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqzns\" (UniqueName: \"kubernetes.io/projected/3bf3315d-3d2f-4aeb-b925-c3832e102e85-kube-api-access-fqzns\") on node \"crc\" DevicePath \"\"" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.162776 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8b1faba-e1b8-436c-aa84-ae4353c5f0a9-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b8b1faba-e1b8-436c-aa84-ae4353c5f0a9" (UID: "b8b1faba-e1b8-436c-aa84-ae4353c5f0a9"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.163860 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8b1faba-e1b8-436c-aa84-ae4353c5f0a9-kube-api-access-q9vz7" (OuterVolumeSpecName: "kube-api-access-q9vz7") pod "b8b1faba-e1b8-436c-aa84-ae4353c5f0a9" (UID: "b8b1faba-e1b8-436c-aa84-ae4353c5f0a9"). InnerVolumeSpecName "kube-api-access-q9vz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.164398 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8b1faba-e1b8-436c-aa84-ae4353c5f0a9-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b8b1faba-e1b8-436c-aa84-ae4353c5f0a9" (UID: "b8b1faba-e1b8-436c-aa84-ae4353c5f0a9"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.228180 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sm9wk"] Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.235087 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3146bc9a-c4fc-4aa1-acae-032db4aa0582-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3146bc9a-c4fc-4aa1-acae-032db4aa0582" (UID: "3146bc9a-c4fc-4aa1-acae-032db4aa0582"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.262208 4810 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8b1faba-e1b8-436c-aa84-ae4353c5f0a9-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.262240 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3146bc9a-c4fc-4aa1-acae-032db4aa0582-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.262249 4810 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b8b1faba-e1b8-436c-aa84-ae4353c5f0a9-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.262258 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9vz7\" (UniqueName: \"kubernetes.io/projected/b8b1faba-e1b8-436c-aa84-ae4353c5f0a9-kube-api-access-q9vz7\") on node \"crc\" DevicePath \"\"" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.727600 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sm9wk" event={"ID":"41d27e40-a89e-4fd6-8106-824c5a257f25","Type":"ContainerStarted","Data":"5ccd1283cec720696319ec135a75750499238b1f385e02afa75450eab89e47aa"} Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.728064 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sm9wk" event={"ID":"41d27e40-a89e-4fd6-8106-824c5a257f25","Type":"ContainerStarted","Data":"f3a211914284ae52c30528a1fb7055afc9b3b025664d0c15800024f0c3748ddd"} Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.728962 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-sm9wk" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.730882 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-sm9wk" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.731985 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rk4vw" event={"ID":"3bf3315d-3d2f-4aeb-b925-c3832e102e85","Type":"ContainerDied","Data":"2a93b7168fafbe84b16d4aeee817860063438d02590293e9edf6bad1699c168a"} Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.732052 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rk4vw" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.732099 4810 scope.go:117] "RemoveContainer" containerID="077475bcb0b82366e9846158038585fb73fc53528d8e624159315394b6489745" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.735182 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp8sg" event={"ID":"3146bc9a-c4fc-4aa1-acae-032db4aa0582","Type":"ContainerDied","Data":"8f98c34b7848371876e8226d32a5b9a72fad3efa01f7370b3de8b257667df91e"} Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.735296 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gp8sg" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.742520 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r2tqm" event={"ID":"b8b1faba-e1b8-436c-aa84-ae4353c5f0a9","Type":"ContainerDied","Data":"23f5d4bb5ee04c131b85331f5b6ae4b924ed6b6ac6634c148de3748ba4fdc4ad"} Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.742649 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r2tqm" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.750097 4810 scope.go:117] "RemoveContainer" containerID="2338631bd769d72dd75e6311ec37172a35cf219ae1ae08bee9e394a35d599110" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.762685 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-sm9wk" podStartSLOduration=1.76265062 podStartE2EDuration="1.76265062s" podCreationTimestamp="2026-02-19 15:15:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:15:54.760099783 +0000 UTC m=+384.242129907" watchObservedRunningTime="2026-02-19 15:15:54.76265062 +0000 UTC m=+384.244680774" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.768664 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5ks5" event={"ID":"9a3d6b1f-2011-4f7f-bea0-1d303007fe41","Type":"ContainerDied","Data":"c3d08bc3ddaa041e0392052ef7f026d6557f47271ad22913b60948a058b74b85"} Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.768867 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d5ks5" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.771825 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptbh9" event={"ID":"7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53","Type":"ContainerDied","Data":"585fb3dde24254534020203ea59c98681809ba91054fff093daf819845736af6"} Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.772062 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ptbh9" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.797482 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rk4vw"] Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.802892 4810 scope.go:117] "RemoveContainer" containerID="dca6d9a99a30ff4bef03f7e86c179f1a1309a876ad27e10ee78ace680fa82510" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.838986 4810 scope.go:117] "RemoveContainer" containerID="f7e09ee181e03814c9bffa4e23b2c1f95c88075e138b0f5ea2eafb7e14efd0cc" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.844680 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rk4vw"] Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.856265 4810 scope.go:117] "RemoveContainer" containerID="678210fe6fd1c3abf47690dcbbfba0fc503dc30b786177c25e50c7cce621be3d" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.863262 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d5ks5"] Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.870010 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d5ks5"] Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.897598 4810 scope.go:117] "RemoveContainer" containerID="db4a068bb20ce6903e18758cc6f38e8bff29bd023c7f06ec3db18c434439f0c7" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.900763 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptbh9"] Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.906497 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptbh9"] Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.918698 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r2tqm"] Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.936004 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r2tqm"] Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.936052 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gp8sg"] Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.942949 4810 scope.go:117] "RemoveContainer" containerID="18198b3c1f0a5d1d83b301a87f12dccc9fde533f7e0587694e992331246f2d37" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.943678 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gp8sg"] Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.955154 4810 scope.go:117] "RemoveContainer" containerID="6d399395ed59b272e5ce80489e65062cb3d89e0840588f5165efb67b6a223708" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.966575 4810 scope.go:117] "RemoveContainer" containerID="cf8a20a9712326d7f7917d432dc449c8f1126f425b48ed7c1328c76b1ca7b19c" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.978563 4810 scope.go:117] "RemoveContainer" containerID="1962f43ea8735830650ac9b311ce674cd5cebcb42c9922bc390ae19775d9f9f0" Feb 19 15:15:54 crc kubenswrapper[4810]: I0219 15:15:54.990585 4810 scope.go:117] "RemoveContainer" containerID="3c4b22d47de57aea7d57602a01f5293afd50a4c1301de10fd9d0be527b9184bd" Feb 19 15:15:55 crc kubenswrapper[4810]: I0219 15:15:55.004409 4810 scope.go:117] "RemoveContainer" 
containerID="2ba8cb7edb1b0d7ff70efe1c621d14ae3cd3fc1499b0b814053d764332922921" Feb 19 15:15:55 crc kubenswrapper[4810]: I0219 15:15:55.019567 4810 scope.go:117] "RemoveContainer" containerID="50a196ba034be9702770fc3245e22281a78913a08bd60f8b507db74c90490792" Feb 19 15:15:55 crc kubenswrapper[4810]: I0219 15:15:55.454728 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3146bc9a-c4fc-4aa1-acae-032db4aa0582" path="/var/lib/kubelet/pods/3146bc9a-c4fc-4aa1-acae-032db4aa0582/volumes" Feb 19 15:15:55 crc kubenswrapper[4810]: I0219 15:15:55.456818 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bf3315d-3d2f-4aeb-b925-c3832e102e85" path="/var/lib/kubelet/pods/3bf3315d-3d2f-4aeb-b925-c3832e102e85/volumes" Feb 19 15:15:55 crc kubenswrapper[4810]: I0219 15:15:55.458485 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53" path="/var/lib/kubelet/pods/7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53/volumes" Feb 19 15:15:55 crc kubenswrapper[4810]: I0219 15:15:55.461069 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a3d6b1f-2011-4f7f-bea0-1d303007fe41" path="/var/lib/kubelet/pods/9a3d6b1f-2011-4f7f-bea0-1d303007fe41/volumes" Feb 19 15:15:55 crc kubenswrapper[4810]: I0219 15:15:55.462773 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8b1faba-e1b8-436c-aa84-ae4353c5f0a9" path="/var/lib/kubelet/pods/b8b1faba-e1b8-436c-aa84-ae4353c5f0a9/volumes" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.042641 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gzkwp"] Feb 19 15:15:56 crc kubenswrapper[4810]: E0219 15:15:56.044705 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a3d6b1f-2011-4f7f-bea0-1d303007fe41" containerName="extract-utilities" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.045011 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a3d6b1f-2011-4f7f-bea0-1d303007fe41" containerName="extract-utilities" Feb 19 15:15:56 crc kubenswrapper[4810]: E0219 15:15:56.045137 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bf3315d-3d2f-4aeb-b925-c3832e102e85" containerName="extract-utilities" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.045250 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bf3315d-3d2f-4aeb-b925-c3832e102e85" containerName="extract-utilities" Feb 19 15:15:56 crc kubenswrapper[4810]: E0219 15:15:56.045348 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53" containerName="extract-utilities" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.045429 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53" containerName="extract-utilities" Feb 19 15:15:56 crc kubenswrapper[4810]: E0219 15:15:56.045510 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53" containerName="registry-server" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.045626 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53" containerName="registry-server" Feb 19 15:15:56 crc kubenswrapper[4810]: E0219 15:15:56.045719 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3146bc9a-c4fc-4aa1-acae-032db4aa0582" containerName="registry-server" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.045730 4810 
state_mem.go:107] "Deleted CPUSet assignment" podUID="3146bc9a-c4fc-4aa1-acae-032db4aa0582" containerName="registry-server" Feb 19 15:15:56 crc kubenswrapper[4810]: E0219 15:15:56.045749 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3146bc9a-c4fc-4aa1-acae-032db4aa0582" containerName="extract-utilities" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.045758 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3146bc9a-c4fc-4aa1-acae-032db4aa0582" containerName="extract-utilities" Feb 19 15:15:56 crc kubenswrapper[4810]: E0219 15:15:56.045775 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8b1faba-e1b8-436c-aa84-ae4353c5f0a9" containerName="marketplace-operator" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.045786 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8b1faba-e1b8-436c-aa84-ae4353c5f0a9" containerName="marketplace-operator" Feb 19 15:15:56 crc kubenswrapper[4810]: E0219 15:15:56.045798 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a3d6b1f-2011-4f7f-bea0-1d303007fe41" containerName="registry-server" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.045806 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a3d6b1f-2011-4f7f-bea0-1d303007fe41" containerName="registry-server" Feb 19 15:15:56 crc kubenswrapper[4810]: E0219 15:15:56.045824 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bf3315d-3d2f-4aeb-b925-c3832e102e85" containerName="extract-content" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.045832 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bf3315d-3d2f-4aeb-b925-c3832e102e85" containerName="extract-content" Feb 19 15:15:56 crc kubenswrapper[4810]: E0219 15:15:56.045848 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53" containerName="extract-content" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.045856 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53" containerName="extract-content" Feb 19 15:15:56 crc kubenswrapper[4810]: E0219 15:15:56.045870 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a3d6b1f-2011-4f7f-bea0-1d303007fe41" containerName="extract-content" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.045877 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a3d6b1f-2011-4f7f-bea0-1d303007fe41" containerName="extract-content" Feb 19 15:15:56 crc kubenswrapper[4810]: E0219 15:15:56.045887 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bf3315d-3d2f-4aeb-b925-c3832e102e85" containerName="registry-server" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.045895 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bf3315d-3d2f-4aeb-b925-c3832e102e85" containerName="registry-server" Feb 19 15:15:56 crc kubenswrapper[4810]: E0219 15:15:56.045908 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3146bc9a-c4fc-4aa1-acae-032db4aa0582" containerName="extract-content" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.045915 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3146bc9a-c4fc-4aa1-acae-032db4aa0582" containerName="extract-content" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.046214 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8b1faba-e1b8-436c-aa84-ae4353c5f0a9" containerName="marketplace-operator" Feb 19 15:15:56 crc 
kubenswrapper[4810]: I0219 15:15:56.046230 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3146bc9a-c4fc-4aa1-acae-032db4aa0582" containerName="registry-server" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.046240 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bf3315d-3d2f-4aeb-b925-c3832e102e85" containerName="registry-server" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.046248 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e18c7b6-8ee0-4dbd-a0b2-3830b3a7bc53" containerName="registry-server" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.046257 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a3d6b1f-2011-4f7f-bea0-1d303007fe41" containerName="registry-server" Feb 19 15:15:56 crc kubenswrapper[4810]: E0219 15:15:56.046423 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8b1faba-e1b8-436c-aa84-ae4353c5f0a9" containerName="marketplace-operator" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.046432 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8b1faba-e1b8-436c-aa84-ae4353c5f0a9" containerName="marketplace-operator" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.046533 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8b1faba-e1b8-436c-aa84-ae4353c5f0a9" containerName="marketplace-operator" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.047416 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gzkwp" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.050391 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.059524 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gzkwp"] Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.194770 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb41d90e-0896-4229-a19b-a8577292bbf6-catalog-content\") pod \"redhat-operators-gzkwp\" (UID: \"cb41d90e-0896-4229-a19b-a8577292bbf6\") " pod="openshift-marketplace/redhat-operators-gzkwp" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.194816 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s79c7\" (UniqueName: \"kubernetes.io/projected/cb41d90e-0896-4229-a19b-a8577292bbf6-kube-api-access-s79c7\") pod \"redhat-operators-gzkwp\" (UID: \"cb41d90e-0896-4229-a19b-a8577292bbf6\") " pod="openshift-marketplace/redhat-operators-gzkwp" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.194857 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb41d90e-0896-4229-a19b-a8577292bbf6-utilities\") pod \"redhat-operators-gzkwp\" (UID: \"cb41d90e-0896-4229-a19b-a8577292bbf6\") " pod="openshift-marketplace/redhat-operators-gzkwp" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.295675 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s79c7\" (UniqueName: \"kubernetes.io/projected/cb41d90e-0896-4229-a19b-a8577292bbf6-kube-api-access-s79c7\") pod \"redhat-operators-gzkwp\" (UID: \"cb41d90e-0896-4229-a19b-a8577292bbf6\") " 
pod="openshift-marketplace/redhat-operators-gzkwp" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.295739 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb41d90e-0896-4229-a19b-a8577292bbf6-utilities\") pod \"redhat-operators-gzkwp\" (UID: \"cb41d90e-0896-4229-a19b-a8577292bbf6\") " pod="openshift-marketplace/redhat-operators-gzkwp" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.295792 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb41d90e-0896-4229-a19b-a8577292bbf6-catalog-content\") pod \"redhat-operators-gzkwp\" (UID: \"cb41d90e-0896-4229-a19b-a8577292bbf6\") " pod="openshift-marketplace/redhat-operators-gzkwp" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.296229 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb41d90e-0896-4229-a19b-a8577292bbf6-catalog-content\") pod \"redhat-operators-gzkwp\" (UID: \"cb41d90e-0896-4229-a19b-a8577292bbf6\") " pod="openshift-marketplace/redhat-operators-gzkwp" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.296344 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb41d90e-0896-4229-a19b-a8577292bbf6-utilities\") pod \"redhat-operators-gzkwp\" (UID: \"cb41d90e-0896-4229-a19b-a8577292bbf6\") " pod="openshift-marketplace/redhat-operators-gzkwp" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.319286 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s79c7\" (UniqueName: \"kubernetes.io/projected/cb41d90e-0896-4229-a19b-a8577292bbf6-kube-api-access-s79c7\") pod \"redhat-operators-gzkwp\" (UID: \"cb41d90e-0896-4229-a19b-a8577292bbf6\") " pod="openshift-marketplace/redhat-operators-gzkwp" Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.378755 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gzkwp" Feb 19 15:15:56 crc kubenswrapper[4810]: W0219 15:15:56.806981 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb41d90e_0896_4229_a19b_a8577292bbf6.slice/crio-359d44fe31f278162e80c02cff4ce2bbbb87a6a8b85fc0045e405dd62ec98267 WatchSource:0}: Error finding container 359d44fe31f278162e80c02cff4ce2bbbb87a6a8b85fc0045e405dd62ec98267: Status 404 returned error can't find the container with id 359d44fe31f278162e80c02cff4ce2bbbb87a6a8b85fc0045e405dd62ec98267 Feb 19 15:15:56 crc kubenswrapper[4810]: I0219 15:15:56.808991 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gzkwp"] Feb 19 15:15:57 crc kubenswrapper[4810]: I0219 15:15:57.049503 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-96zmk"] Feb 19 15:15:57 crc kubenswrapper[4810]: I0219 15:15:57.050970 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-96zmk" Feb 19 15:15:57 crc kubenswrapper[4810]: I0219 15:15:57.053846 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 19 15:15:57 crc kubenswrapper[4810]: I0219 15:15:57.055586 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-96zmk"] Feb 19 15:15:57 crc kubenswrapper[4810]: I0219 15:15:57.217636 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7wtj\" (UniqueName: \"kubernetes.io/projected/78aaed3c-dfb4-4332-bc63-4fc5342870ae-kube-api-access-z7wtj\") pod \"certified-operators-96zmk\" (UID: \"78aaed3c-dfb4-4332-bc63-4fc5342870ae\") " pod="openshift-marketplace/certified-operators-96zmk" Feb 19 15:15:57 crc kubenswrapper[4810]: I0219 15:15:57.217730 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78aaed3c-dfb4-4332-bc63-4fc5342870ae-utilities\") pod \"certified-operators-96zmk\" (UID: \"78aaed3c-dfb4-4332-bc63-4fc5342870ae\") " pod="openshift-marketplace/certified-operators-96zmk" Feb 19 15:15:57 crc kubenswrapper[4810]: I0219 15:15:57.217810 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78aaed3c-dfb4-4332-bc63-4fc5342870ae-catalog-content\") pod \"certified-operators-96zmk\" (UID: \"78aaed3c-dfb4-4332-bc63-4fc5342870ae\") " pod="openshift-marketplace/certified-operators-96zmk" Feb 19 15:15:57 crc kubenswrapper[4810]: I0219 15:15:57.320228 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7wtj\" (UniqueName: \"kubernetes.io/projected/78aaed3c-dfb4-4332-bc63-4fc5342870ae-kube-api-access-z7wtj\") pod \"certified-operators-96zmk\" (UID: \"78aaed3c-dfb4-4332-bc63-4fc5342870ae\") " pod="openshift-marketplace/certified-operators-96zmk" Feb 19 15:15:57 crc kubenswrapper[4810]: I0219 15:15:57.320398 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78aaed3c-dfb4-4332-bc63-4fc5342870ae-utilities\") pod \"certified-operators-96zmk\" (UID: \"78aaed3c-dfb4-4332-bc63-4fc5342870ae\") " pod="openshift-marketplace/certified-operators-96zmk" Feb 19 15:15:57 crc kubenswrapper[4810]: I0219 15:15:57.320473 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78aaed3c-dfb4-4332-bc63-4fc5342870ae-catalog-content\") pod \"certified-operators-96zmk\" (UID: \"78aaed3c-dfb4-4332-bc63-4fc5342870ae\") " pod="openshift-marketplace/certified-operators-96zmk" Feb 19 15:15:57 crc kubenswrapper[4810]: I0219 15:15:57.321521 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78aaed3c-dfb4-4332-bc63-4fc5342870ae-utilities\") pod \"certified-operators-96zmk\" (UID: \"78aaed3c-dfb4-4332-bc63-4fc5342870ae\") " pod="openshift-marketplace/certified-operators-96zmk" Feb 19 15:15:57 crc kubenswrapper[4810]: I0219 15:15:57.323680 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78aaed3c-dfb4-4332-bc63-4fc5342870ae-catalog-content\") pod \"certified-operators-96zmk\" (UID: 
\"78aaed3c-dfb4-4332-bc63-4fc5342870ae\") " pod="openshift-marketplace/certified-operators-96zmk" Feb 19 15:15:57 crc kubenswrapper[4810]: I0219 15:15:57.344509 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7wtj\" (UniqueName: \"kubernetes.io/projected/78aaed3c-dfb4-4332-bc63-4fc5342870ae-kube-api-access-z7wtj\") pod \"certified-operators-96zmk\" (UID: \"78aaed3c-dfb4-4332-bc63-4fc5342870ae\") " pod="openshift-marketplace/certified-operators-96zmk" Feb 19 15:15:57 crc kubenswrapper[4810]: I0219 15:15:57.369832 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-96zmk" Feb 19 15:15:57 crc kubenswrapper[4810]: I0219 15:15:57.791951 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-96zmk"] Feb 19 15:15:57 crc kubenswrapper[4810]: W0219 15:15:57.802701 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78aaed3c_dfb4_4332_bc63_4fc5342870ae.slice/crio-dc3bb4ddeed72ae65273594c85ea1bf8c3ee8a4155cd199ebba0a1a239217684 WatchSource:0}: Error finding container dc3bb4ddeed72ae65273594c85ea1bf8c3ee8a4155cd199ebba0a1a239217684: Status 404 returned error can't find the container with id dc3bb4ddeed72ae65273594c85ea1bf8c3ee8a4155cd199ebba0a1a239217684 Feb 19 15:15:57 crc kubenswrapper[4810]: I0219 15:15:57.816748 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96zmk" event={"ID":"78aaed3c-dfb4-4332-bc63-4fc5342870ae","Type":"ContainerStarted","Data":"dc3bb4ddeed72ae65273594c85ea1bf8c3ee8a4155cd199ebba0a1a239217684"} Feb 19 15:15:57 crc kubenswrapper[4810]: I0219 15:15:57.818886 4810 generic.go:334] "Generic (PLEG): container finished" podID="cb41d90e-0896-4229-a19b-a8577292bbf6" containerID="446909cef9363d01f975cdf4a69b340a8cfb1e1ae9bddb9ba40e5d706fc6a797" exitCode=0 Feb 19 15:15:57 crc kubenswrapper[4810]: I0219 15:15:57.819143 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gzkwp" event={"ID":"cb41d90e-0896-4229-a19b-a8577292bbf6","Type":"ContainerDied","Data":"446909cef9363d01f975cdf4a69b340a8cfb1e1ae9bddb9ba40e5d706fc6a797"} Feb 19 15:15:57 crc kubenswrapper[4810]: I0219 15:15:57.819199 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gzkwp" event={"ID":"cb41d90e-0896-4229-a19b-a8577292bbf6","Type":"ContainerStarted","Data":"359d44fe31f278162e80c02cff4ce2bbbb87a6a8b85fc0045e405dd62ec98267"} Feb 19 15:15:58 crc kubenswrapper[4810]: I0219 15:15:58.448852 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v68v6"] Feb 19 15:15:58 crc kubenswrapper[4810]: I0219 15:15:58.450518 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v68v6" Feb 19 15:15:58 crc kubenswrapper[4810]: I0219 15:15:58.452371 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 19 15:15:58 crc kubenswrapper[4810]: I0219 15:15:58.455388 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v68v6"] Feb 19 15:15:58 crc kubenswrapper[4810]: I0219 15:15:58.538465 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/935efdc2-5596-4207-a27b-68a8a39b6529-catalog-content\") pod \"community-operators-v68v6\" (UID: \"935efdc2-5596-4207-a27b-68a8a39b6529\") " pod="openshift-marketplace/community-operators-v68v6" Feb 19 15:15:58 crc kubenswrapper[4810]: I0219 15:15:58.538570 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/935efdc2-5596-4207-a27b-68a8a39b6529-utilities\") pod \"community-operators-v68v6\" (UID: \"935efdc2-5596-4207-a27b-68a8a39b6529\") " pod="openshift-marketplace/community-operators-v68v6" Feb 19 15:15:58 crc kubenswrapper[4810]: I0219 15:15:58.538639 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlxt4\" (UniqueName: \"kubernetes.io/projected/935efdc2-5596-4207-a27b-68a8a39b6529-kube-api-access-dlxt4\") pod \"community-operators-v68v6\" (UID: \"935efdc2-5596-4207-a27b-68a8a39b6529\") " pod="openshift-marketplace/community-operators-v68v6" Feb 19 15:15:58 crc kubenswrapper[4810]: I0219 15:15:58.640060 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlxt4\" (UniqueName: \"kubernetes.io/projected/935efdc2-5596-4207-a27b-68a8a39b6529-kube-api-access-dlxt4\") pod \"community-operators-v68v6\" (UID: \"935efdc2-5596-4207-a27b-68a8a39b6529\") " pod="openshift-marketplace/community-operators-v68v6" Feb 19 15:15:58 crc kubenswrapper[4810]: I0219 15:15:58.640133 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/935efdc2-5596-4207-a27b-68a8a39b6529-catalog-content\") pod \"community-operators-v68v6\" (UID: \"935efdc2-5596-4207-a27b-68a8a39b6529\") " pod="openshift-marketplace/community-operators-v68v6" Feb 19 15:15:58 crc kubenswrapper[4810]: I0219 15:15:58.640180 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/935efdc2-5596-4207-a27b-68a8a39b6529-utilities\") pod \"community-operators-v68v6\" (UID: \"935efdc2-5596-4207-a27b-68a8a39b6529\") " pod="openshift-marketplace/community-operators-v68v6" Feb 19 15:15:58 crc kubenswrapper[4810]: I0219 15:15:58.640683 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/935efdc2-5596-4207-a27b-68a8a39b6529-utilities\") pod \"community-operators-v68v6\" (UID: \"935efdc2-5596-4207-a27b-68a8a39b6529\") " pod="openshift-marketplace/community-operators-v68v6" Feb 19 15:15:58 crc kubenswrapper[4810]: I0219 15:15:58.640713 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/935efdc2-5596-4207-a27b-68a8a39b6529-catalog-content\") pod \"community-operators-v68v6\" (UID: 
\"935efdc2-5596-4207-a27b-68a8a39b6529\") " pod="openshift-marketplace/community-operators-v68v6" Feb 19 15:15:58 crc kubenswrapper[4810]: I0219 15:15:58.661136 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlxt4\" (UniqueName: \"kubernetes.io/projected/935efdc2-5596-4207-a27b-68a8a39b6529-kube-api-access-dlxt4\") pod \"community-operators-v68v6\" (UID: \"935efdc2-5596-4207-a27b-68a8a39b6529\") " pod="openshift-marketplace/community-operators-v68v6" Feb 19 15:15:58 crc kubenswrapper[4810]: I0219 15:15:58.775635 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v68v6" Feb 19 15:15:58 crc kubenswrapper[4810]: I0219 15:15:58.829488 4810 generic.go:334] "Generic (PLEG): container finished" podID="78aaed3c-dfb4-4332-bc63-4fc5342870ae" containerID="389e0276f99c2489270efb421c00ea721017dbf02f8163a0b8723a2949f2384c" exitCode=0 Feb 19 15:15:58 crc kubenswrapper[4810]: I0219 15:15:58.829572 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96zmk" event={"ID":"78aaed3c-dfb4-4332-bc63-4fc5342870ae","Type":"ContainerDied","Data":"389e0276f99c2489270efb421c00ea721017dbf02f8163a0b8723a2949f2384c"} Feb 19 15:15:58 crc kubenswrapper[4810]: I0219 15:15:58.993217 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v68v6"] Feb 19 15:15:59 crc kubenswrapper[4810]: W0219 15:15:59.006026 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod935efdc2_5596_4207_a27b_68a8a39b6529.slice/crio-514f59846afe5a035371037e15fe72e417c83d08b819931a1c6ac5a2603476da WatchSource:0}: Error finding container 514f59846afe5a035371037e15fe72e417c83d08b819931a1c6ac5a2603476da: Status 404 returned error can't find the container with id 514f59846afe5a035371037e15fe72e417c83d08b819931a1c6ac5a2603476da Feb 19 15:15:59 crc kubenswrapper[4810]: I0219 15:15:59.448499 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tgtg8"] Feb 19 15:15:59 crc kubenswrapper[4810]: I0219 15:15:59.451078 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tgtg8" Feb 19 15:15:59 crc kubenswrapper[4810]: I0219 15:15:59.455725 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 19 15:15:59 crc kubenswrapper[4810]: I0219 15:15:59.457680 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tgtg8"] Feb 19 15:15:59 crc kubenswrapper[4810]: I0219 15:15:59.557880 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5szq\" (UniqueName: \"kubernetes.io/projected/b3c2bc60-712d-4ef6-b461-ad683f51f2e4-kube-api-access-h5szq\") pod \"redhat-marketplace-tgtg8\" (UID: \"b3c2bc60-712d-4ef6-b461-ad683f51f2e4\") " pod="openshift-marketplace/redhat-marketplace-tgtg8" Feb 19 15:15:59 crc kubenswrapper[4810]: I0219 15:15:59.558020 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3c2bc60-712d-4ef6-b461-ad683f51f2e4-utilities\") pod \"redhat-marketplace-tgtg8\" (UID: \"b3c2bc60-712d-4ef6-b461-ad683f51f2e4\") " pod="openshift-marketplace/redhat-marketplace-tgtg8" Feb 19 15:15:59 crc kubenswrapper[4810]: I0219 15:15:59.558136 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3c2bc60-712d-4ef6-b461-ad683f51f2e4-catalog-content\") pod \"redhat-marketplace-tgtg8\" (UID: \"b3c2bc60-712d-4ef6-b461-ad683f51f2e4\") " pod="openshift-marketplace/redhat-marketplace-tgtg8" Feb 19 15:15:59 crc kubenswrapper[4810]: I0219 15:15:59.659647 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3c2bc60-712d-4ef6-b461-ad683f51f2e4-catalog-content\") pod \"redhat-marketplace-tgtg8\" (UID: \"b3c2bc60-712d-4ef6-b461-ad683f51f2e4\") " pod="openshift-marketplace/redhat-marketplace-tgtg8" Feb 19 15:15:59 crc kubenswrapper[4810]: I0219 15:15:59.659786 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5szq\" (UniqueName: \"kubernetes.io/projected/b3c2bc60-712d-4ef6-b461-ad683f51f2e4-kube-api-access-h5szq\") pod \"redhat-marketplace-tgtg8\" (UID: \"b3c2bc60-712d-4ef6-b461-ad683f51f2e4\") " pod="openshift-marketplace/redhat-marketplace-tgtg8" Feb 19 15:15:59 crc kubenswrapper[4810]: I0219 15:15:59.659865 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3c2bc60-712d-4ef6-b461-ad683f51f2e4-utilities\") pod \"redhat-marketplace-tgtg8\" (UID: \"b3c2bc60-712d-4ef6-b461-ad683f51f2e4\") " pod="openshift-marketplace/redhat-marketplace-tgtg8" Feb 19 15:15:59 crc kubenswrapper[4810]: I0219 15:15:59.660337 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3c2bc60-712d-4ef6-b461-ad683f51f2e4-catalog-content\") pod \"redhat-marketplace-tgtg8\" (UID: \"b3c2bc60-712d-4ef6-b461-ad683f51f2e4\") " pod="openshift-marketplace/redhat-marketplace-tgtg8" Feb 19 15:15:59 crc kubenswrapper[4810]: I0219 15:15:59.660754 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3c2bc60-712d-4ef6-b461-ad683f51f2e4-utilities\") pod \"redhat-marketplace-tgtg8\" (UID: 
\"b3c2bc60-712d-4ef6-b461-ad683f51f2e4\") " pod="openshift-marketplace/redhat-marketplace-tgtg8" Feb 19 15:15:59 crc kubenswrapper[4810]: I0219 15:15:59.698758 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5szq\" (UniqueName: \"kubernetes.io/projected/b3c2bc60-712d-4ef6-b461-ad683f51f2e4-kube-api-access-h5szq\") pod \"redhat-marketplace-tgtg8\" (UID: \"b3c2bc60-712d-4ef6-b461-ad683f51f2e4\") " pod="openshift-marketplace/redhat-marketplace-tgtg8" Feb 19 15:15:59 crc kubenswrapper[4810]: I0219 15:15:59.774874 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tgtg8" Feb 19 15:15:59 crc kubenswrapper[4810]: I0219 15:15:59.846561 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gzkwp" event={"ID":"cb41d90e-0896-4229-a19b-a8577292bbf6","Type":"ContainerStarted","Data":"563ec001d1590fa9c16e3de9515c66b2a8327d12667295252c48a5230c733ba1"} Feb 19 15:15:59 crc kubenswrapper[4810]: I0219 15:15:59.856548 4810 generic.go:334] "Generic (PLEG): container finished" podID="935efdc2-5596-4207-a27b-68a8a39b6529" containerID="1bf8e18430d52e1bdffbf7e432c09ccbb227c1161c0321cdc60f7b17956e0999" exitCode=0 Feb 19 15:15:59 crc kubenswrapper[4810]: I0219 15:15:59.856729 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v68v6" event={"ID":"935efdc2-5596-4207-a27b-68a8a39b6529","Type":"ContainerDied","Data":"1bf8e18430d52e1bdffbf7e432c09ccbb227c1161c0321cdc60f7b17956e0999"} Feb 19 15:15:59 crc kubenswrapper[4810]: I0219 15:15:59.856923 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v68v6" event={"ID":"935efdc2-5596-4207-a27b-68a8a39b6529","Type":"ContainerStarted","Data":"514f59846afe5a035371037e15fe72e417c83d08b819931a1c6ac5a2603476da"} Feb 19 15:15:59 crc kubenswrapper[4810]: I0219 15:15:59.870669 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96zmk" event={"ID":"78aaed3c-dfb4-4332-bc63-4fc5342870ae","Type":"ContainerStarted","Data":"6b9165a9af4fc64c1f899ec8f99221cc29231316d69d504fd16ef6a4ce57525c"} Feb 19 15:16:00 crc kubenswrapper[4810]: I0219 15:16:00.048709 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tgtg8"] Feb 19 15:16:00 crc kubenswrapper[4810]: I0219 15:16:00.878296 4810 generic.go:334] "Generic (PLEG): container finished" podID="cb41d90e-0896-4229-a19b-a8577292bbf6" containerID="563ec001d1590fa9c16e3de9515c66b2a8327d12667295252c48a5230c733ba1" exitCode=0 Feb 19 15:16:00 crc kubenswrapper[4810]: I0219 15:16:00.878366 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gzkwp" event={"ID":"cb41d90e-0896-4229-a19b-a8577292bbf6","Type":"ContainerDied","Data":"563ec001d1590fa9c16e3de9515c66b2a8327d12667295252c48a5230c733ba1"} Feb 19 15:16:00 crc kubenswrapper[4810]: I0219 15:16:00.881286 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v68v6" event={"ID":"935efdc2-5596-4207-a27b-68a8a39b6529","Type":"ContainerStarted","Data":"454b485efdf0b632d0bf68a7c004625d4bec4676c25fb0c84c862bf5d9832936"} Feb 19 15:16:00 crc kubenswrapper[4810]: I0219 15:16:00.885442 4810 generic.go:334] "Generic (PLEG): container finished" podID="b3c2bc60-712d-4ef6-b461-ad683f51f2e4" containerID="87e5190564db63178273d5c86a9e7c06d079f45801be5670341f89e74e1884b5" 
exitCode=0 Feb 19 15:16:00 crc kubenswrapper[4810]: I0219 15:16:00.885648 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tgtg8" event={"ID":"b3c2bc60-712d-4ef6-b461-ad683f51f2e4","Type":"ContainerDied","Data":"87e5190564db63178273d5c86a9e7c06d079f45801be5670341f89e74e1884b5"} Feb 19 15:16:00 crc kubenswrapper[4810]: I0219 15:16:00.885696 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tgtg8" event={"ID":"b3c2bc60-712d-4ef6-b461-ad683f51f2e4","Type":"ContainerStarted","Data":"a80ffb892789a1e8904a911324da9b863602e3697bb3ca2836be7f70a34bcf90"} Feb 19 15:16:00 crc kubenswrapper[4810]: I0219 15:16:00.888676 4810 generic.go:334] "Generic (PLEG): container finished" podID="78aaed3c-dfb4-4332-bc63-4fc5342870ae" containerID="6b9165a9af4fc64c1f899ec8f99221cc29231316d69d504fd16ef6a4ce57525c" exitCode=0 Feb 19 15:16:00 crc kubenswrapper[4810]: I0219 15:16:00.889838 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96zmk" event={"ID":"78aaed3c-dfb4-4332-bc63-4fc5342870ae","Type":"ContainerDied","Data":"6b9165a9af4fc64c1f899ec8f99221cc29231316d69d504fd16ef6a4ce57525c"} Feb 19 15:16:01 crc kubenswrapper[4810]: I0219 15:16:01.899553 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gzkwp" event={"ID":"cb41d90e-0896-4229-a19b-a8577292bbf6","Type":"ContainerStarted","Data":"4dcb06a25fb7e17e0617e078d6ec18aa0f057a2f73d88aa7a8ec24cf486c4034"} Feb 19 15:16:01 crc kubenswrapper[4810]: I0219 15:16:01.900961 4810 generic.go:334] "Generic (PLEG): container finished" podID="935efdc2-5596-4207-a27b-68a8a39b6529" containerID="454b485efdf0b632d0bf68a7c004625d4bec4676c25fb0c84c862bf5d9832936" exitCode=0 Feb 19 15:16:01 crc kubenswrapper[4810]: I0219 15:16:01.901025 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v68v6" event={"ID":"935efdc2-5596-4207-a27b-68a8a39b6529","Type":"ContainerDied","Data":"454b485efdf0b632d0bf68a7c004625d4bec4676c25fb0c84c862bf5d9832936"} Feb 19 15:16:01 crc kubenswrapper[4810]: I0219 15:16:01.902823 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tgtg8" event={"ID":"b3c2bc60-712d-4ef6-b461-ad683f51f2e4","Type":"ContainerStarted","Data":"a7e5546cff139c5de86becf0e667166b7b8333859007a285f8ba2d1bb6ebca8d"} Feb 19 15:16:01 crc kubenswrapper[4810]: I0219 15:16:01.904909 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96zmk" event={"ID":"78aaed3c-dfb4-4332-bc63-4fc5342870ae","Type":"ContainerStarted","Data":"d3b7a9028d72c7eb4783b51f96495dc71d42e8907df523028cceebd689902934"} Feb 19 15:16:01 crc kubenswrapper[4810]: I0219 15:16:01.923904 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gzkwp" podStartSLOduration=2.414785366 podStartE2EDuration="5.923871138s" podCreationTimestamp="2026-02-19 15:15:56 +0000 UTC" firstStartedPulling="2026-02-19 15:15:57.826173397 +0000 UTC m=+387.308203521" lastFinishedPulling="2026-02-19 15:16:01.335259169 +0000 UTC m=+390.817289293" observedRunningTime="2026-02-19 15:16:01.918639914 +0000 UTC m=+391.400670048" watchObservedRunningTime="2026-02-19 15:16:01.923871138 +0000 UTC m=+391.405901252" Feb 19 15:16:01 crc kubenswrapper[4810]: I0219 15:16:01.963747 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
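Each "Observed pod startup duration" entry carries enough data to reproduce the tracker's arithmetic: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling) from that. For redhat-operators-gzkwp above: 5.923871138s − (15:16:01.335259169 − 15:15:57.826173397) = 2.414785366s, matching the logged value. A small sketch checking this (timestamps copied from the entry; the relationship is inferred from the logged fields rather than taken from kubelet source):

    // slo_check.go — verify the startup-latency arithmetic for one entry.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        parse := func(s string) time.Time {
            t, err := time.Parse(layout, s)
            if err != nil {
                panic(err)
            }
            return t
        }
        created := parse("2026-02-19 15:15:56 +0000 UTC")
        firstPull := parse("2026-02-19 15:15:57.826173397 +0000 UTC")
        lastPull := parse("2026-02-19 15:16:01.335259169 +0000 UTC")
        observed := parse("2026-02-19 15:16:01.923871138 +0000 UTC") // watchObservedRunningTime
        e2e := observed.Sub(created)         // 5.923871138s = podStartE2EDuration
        slo := e2e - lastPull.Sub(firstPull) // 2.414785366s = podStartSLOduration
        fmt.Println("e2e:", e2e, "slo:", slo)
    }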
pod="openshift-marketplace/certified-operators-96zmk" podStartSLOduration=2.4759718680000002 podStartE2EDuration="4.963723508s" podCreationTimestamp="2026-02-19 15:15:57 +0000 UTC" firstStartedPulling="2026-02-19 15:15:58.831215742 +0000 UTC m=+388.313245866" lastFinishedPulling="2026-02-19 15:16:01.318967382 +0000 UTC m=+390.800997506" observedRunningTime="2026-02-19 15:16:01.959864266 +0000 UTC m=+391.441894390" watchObservedRunningTime="2026-02-19 15:16:01.963723508 +0000 UTC m=+391.445753632" Feb 19 15:16:02 crc kubenswrapper[4810]: I0219 15:16:02.912821 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v68v6" event={"ID":"935efdc2-5596-4207-a27b-68a8a39b6529","Type":"ContainerStarted","Data":"fdb2805a108d53dbae6af5f6571743f61b51164e75890b28023d46aa9db3310a"} Feb 19 15:16:02 crc kubenswrapper[4810]: I0219 15:16:02.916389 4810 generic.go:334] "Generic (PLEG): container finished" podID="b3c2bc60-712d-4ef6-b461-ad683f51f2e4" containerID="a7e5546cff139c5de86becf0e667166b7b8333859007a285f8ba2d1bb6ebca8d" exitCode=0 Feb 19 15:16:02 crc kubenswrapper[4810]: I0219 15:16:02.917178 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tgtg8" event={"ID":"b3c2bc60-712d-4ef6-b461-ad683f51f2e4","Type":"ContainerDied","Data":"a7e5546cff139c5de86becf0e667166b7b8333859007a285f8ba2d1bb6ebca8d"} Feb 19 15:16:02 crc kubenswrapper[4810]: I0219 15:16:02.933738 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v68v6" podStartSLOduration=2.450233944 podStartE2EDuration="4.933719433s" podCreationTimestamp="2026-02-19 15:15:58 +0000 UTC" firstStartedPulling="2026-02-19 15:15:59.86258381 +0000 UTC m=+389.344613934" lastFinishedPulling="2026-02-19 15:16:02.346069299 +0000 UTC m=+391.828099423" observedRunningTime="2026-02-19 15:16:02.928216852 +0000 UTC m=+392.410246976" watchObservedRunningTime="2026-02-19 15:16:02.933719433 +0000 UTC m=+392.415749557" Feb 19 15:16:03 crc kubenswrapper[4810]: I0219 15:16:03.924711 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tgtg8" event={"ID":"b3c2bc60-712d-4ef6-b461-ad683f51f2e4","Type":"ContainerStarted","Data":"8739789f355fa198e1e69c2a52485d5504ce568f33b25ddb44e09767e21bad1c"} Feb 19 15:16:03 crc kubenswrapper[4810]: I0219 15:16:03.951444 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tgtg8" podStartSLOduration=2.284627957 podStartE2EDuration="4.951427035s" podCreationTimestamp="2026-02-19 15:15:59 +0000 UTC" firstStartedPulling="2026-02-19 15:16:00.886812407 +0000 UTC m=+390.368842561" lastFinishedPulling="2026-02-19 15:16:03.553611495 +0000 UTC m=+393.035641639" observedRunningTime="2026-02-19 15:16:03.950846861 +0000 UTC m=+393.432876985" watchObservedRunningTime="2026-02-19 15:16:03.951427035 +0000 UTC m=+393.433457159" Feb 19 15:16:06 crc kubenswrapper[4810]: I0219 15:16:06.379067 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gzkwp" Feb 19 15:16:06 crc kubenswrapper[4810]: I0219 15:16:06.380901 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gzkwp" Feb 19 15:16:07 crc kubenswrapper[4810]: I0219 15:16:07.370648 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-96zmk" Feb 
19 15:16:07 crc kubenswrapper[4810]: I0219 15:16:07.370979 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-96zmk" Feb 19 15:16:07 crc kubenswrapper[4810]: I0219 15:16:07.412197 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-96zmk" Feb 19 15:16:07 crc kubenswrapper[4810]: I0219 15:16:07.444705 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gzkwp" podUID="cb41d90e-0896-4229-a19b-a8577292bbf6" containerName="registry-server" probeResult="failure" output=< Feb 19 15:16:07 crc kubenswrapper[4810]: timeout: failed to connect service ":50051" within 1s Feb 19 15:16:07 crc kubenswrapper[4810]: > Feb 19 15:16:08 crc kubenswrapper[4810]: I0219 15:16:08.007787 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-96zmk" Feb 19 15:16:08 crc kubenswrapper[4810]: I0219 15:16:08.776483 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v68v6" Feb 19 15:16:08 crc kubenswrapper[4810]: I0219 15:16:08.776852 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v68v6" Feb 19 15:16:08 crc kubenswrapper[4810]: I0219 15:16:08.823534 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v68v6" Feb 19 15:16:09 crc kubenswrapper[4810]: I0219 15:16:09.026435 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v68v6" Feb 19 15:16:09 crc kubenswrapper[4810]: I0219 15:16:09.776474 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tgtg8" Feb 19 15:16:09 crc kubenswrapper[4810]: I0219 15:16:09.776524 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tgtg8" Feb 19 15:16:09 crc kubenswrapper[4810]: I0219 15:16:09.812182 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tgtg8" Feb 19 15:16:10 crc kubenswrapper[4810]: I0219 15:16:10.051440 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tgtg8" Feb 19 15:16:16 crc kubenswrapper[4810]: I0219 15:16:16.293483 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" podUID="a6cb5092-5f01-4dd9-a940-804d88907744" containerName="registry" containerID="cri-o://5251301af1540af7d5ebbbedf95175c9c2ed8c2deb1460dfe0ea2da0c4971799" gracePeriod=30 Feb 19 15:16:16 crc kubenswrapper[4810]: I0219 15:16:16.430293 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gzkwp" Feb 19 15:16:16 crc kubenswrapper[4810]: I0219 15:16:16.490001 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gzkwp" Feb 19 15:16:16 crc kubenswrapper[4810]: I0219 15:16:16.678344 4810 util.go:48] "No ready sandbox for pod can be found. 
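The startup-probe failure above ("timeout: failed to connect service \":50051\" within 1s") is the marketplace registry-server's health check on gRPC port 50051; the pod flips to startup=started and readiness=ready once the catalog finishes loading, roughly nine seconds later. A rough stdlib stand-in for that check (the real probe is a gRPC health RPC; a plain TCP dial with the same one-second budget reproduces the failure mode when nothing is listening yet):

    // probe_standin.go — a sketch, not the actual probe implementation.
    package main

    import (
        "fmt"
        "net"
        "time"
    )

    func main() {
        // Same 1s budget as the logged probe; port taken from the log output.
        conn, err := net.DialTimeout("tcp", "127.0.0.1:50051", 1*time.Second)
        if err != nil {
            fmt.Println("probe failure:", err) // analogous to the logged timeout
            return
        }
        conn.Close()
        fmt.Println("probe success")
    }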
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:16:16 crc kubenswrapper[4810]: I0219 15:16:16.842166 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a6cb5092-5f01-4dd9-a940-804d88907744-registry-certificates\") pod \"a6cb5092-5f01-4dd9-a940-804d88907744\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " Feb 19 15:16:16 crc kubenswrapper[4810]: I0219 15:16:16.842252 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a6cb5092-5f01-4dd9-a940-804d88907744-bound-sa-token\") pod \"a6cb5092-5f01-4dd9-a940-804d88907744\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " Feb 19 15:16:16 crc kubenswrapper[4810]: I0219 15:16:16.842595 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"a6cb5092-5f01-4dd9-a940-804d88907744\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " Feb 19 15:16:16 crc kubenswrapper[4810]: I0219 15:16:16.842672 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g96x2\" (UniqueName: \"kubernetes.io/projected/a6cb5092-5f01-4dd9-a940-804d88907744-kube-api-access-g96x2\") pod \"a6cb5092-5f01-4dd9-a940-804d88907744\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " Feb 19 15:16:16 crc kubenswrapper[4810]: I0219 15:16:16.842738 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a6cb5092-5f01-4dd9-a940-804d88907744-registry-tls\") pod \"a6cb5092-5f01-4dd9-a940-804d88907744\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " Feb 19 15:16:16 crc kubenswrapper[4810]: I0219 15:16:16.842818 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a6cb5092-5f01-4dd9-a940-804d88907744-ca-trust-extracted\") pod \"a6cb5092-5f01-4dd9-a940-804d88907744\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " Feb 19 15:16:16 crc kubenswrapper[4810]: I0219 15:16:16.842876 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a6cb5092-5f01-4dd9-a940-804d88907744-installation-pull-secrets\") pod \"a6cb5092-5f01-4dd9-a940-804d88907744\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " Feb 19 15:16:16 crc kubenswrapper[4810]: I0219 15:16:16.842943 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a6cb5092-5f01-4dd9-a940-804d88907744-trusted-ca\") pod \"a6cb5092-5f01-4dd9-a940-804d88907744\" (UID: \"a6cb5092-5f01-4dd9-a940-804d88907744\") " Feb 19 15:16:16 crc kubenswrapper[4810]: I0219 15:16:16.843249 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6cb5092-5f01-4dd9-a940-804d88907744-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a6cb5092-5f01-4dd9-a940-804d88907744" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:16:16 crc kubenswrapper[4810]: I0219 15:16:16.843529 4810 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a6cb5092-5f01-4dd9-a940-804d88907744-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 19 15:16:16 crc kubenswrapper[4810]: I0219 15:16:16.845168 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6cb5092-5f01-4dd9-a940-804d88907744-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a6cb5092-5f01-4dd9-a940-804d88907744" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:16:16 crc kubenswrapper[4810]: I0219 15:16:16.848717 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6cb5092-5f01-4dd9-a940-804d88907744-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a6cb5092-5f01-4dd9-a940-804d88907744" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:16:16 crc kubenswrapper[4810]: I0219 15:16:16.850551 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6cb5092-5f01-4dd9-a940-804d88907744-kube-api-access-g96x2" (OuterVolumeSpecName: "kube-api-access-g96x2") pod "a6cb5092-5f01-4dd9-a940-804d88907744" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744"). InnerVolumeSpecName "kube-api-access-g96x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:16:16 crc kubenswrapper[4810]: I0219 15:16:16.850762 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6cb5092-5f01-4dd9-a940-804d88907744-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a6cb5092-5f01-4dd9-a940-804d88907744" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:16:16 crc kubenswrapper[4810]: I0219 15:16:16.852689 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6cb5092-5f01-4dd9-a940-804d88907744-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a6cb5092-5f01-4dd9-a940-804d88907744" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:16:16 crc kubenswrapper[4810]: I0219 15:16:16.859519 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6cb5092-5f01-4dd9-a940-804d88907744-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a6cb5092-5f01-4dd9-a940-804d88907744" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:16:16 crc kubenswrapper[4810]: I0219 15:16:16.859898 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "a6cb5092-5f01-4dd9-a940-804d88907744" (UID: "a6cb5092-5f01-4dd9-a940-804d88907744"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 15:16:16 crc kubenswrapper[4810]: I0219 15:16:16.944559 4810 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a6cb5092-5f01-4dd9-a940-804d88907744-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 19 15:16:16 crc kubenswrapper[4810]: I0219 15:16:16.944606 4810 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a6cb5092-5f01-4dd9-a940-804d88907744-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 19 15:16:16 crc kubenswrapper[4810]: I0219 15:16:16.944620 4810 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a6cb5092-5f01-4dd9-a940-804d88907744-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 15:16:16 crc kubenswrapper[4810]: I0219 15:16:16.944628 4810 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a6cb5092-5f01-4dd9-a940-804d88907744-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 15:16:16 crc kubenswrapper[4810]: I0219 15:16:16.944636 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g96x2\" (UniqueName: \"kubernetes.io/projected/a6cb5092-5f01-4dd9-a940-804d88907744-kube-api-access-g96x2\") on node \"crc\" DevicePath \"\"" Feb 19 15:16:16 crc kubenswrapper[4810]: I0219 15:16:16.944646 4810 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a6cb5092-5f01-4dd9-a940-804d88907744-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 19 15:16:17 crc kubenswrapper[4810]: I0219 15:16:17.006756 4810 generic.go:334] "Generic (PLEG): container finished" podID="a6cb5092-5f01-4dd9-a940-804d88907744" containerID="5251301af1540af7d5ebbbedf95175c9c2ed8c2deb1460dfe0ea2da0c4971799" exitCode=0 Feb 19 15:16:17 crc kubenswrapper[4810]: I0219 15:16:17.006807 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" event={"ID":"a6cb5092-5f01-4dd9-a940-804d88907744","Type":"ContainerDied","Data":"5251301af1540af7d5ebbbedf95175c9c2ed8c2deb1460dfe0ea2da0c4971799"} Feb 19 15:16:17 crc kubenswrapper[4810]: I0219 15:16:17.006863 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" event={"ID":"a6cb5092-5f01-4dd9-a940-804d88907744","Type":"ContainerDied","Data":"958cccf7fb34847e1b79d6753f3c0b3ac89a306a0da6aa21a70c7c233870987b"} Feb 19 15:16:17 crc kubenswrapper[4810]: I0219 15:16:17.006890 4810 scope.go:117] "RemoveContainer" containerID="5251301af1540af7d5ebbbedf95175c9c2ed8c2deb1460dfe0ea2da0c4971799" Feb 19 15:16:17 crc kubenswrapper[4810]: I0219 15:16:17.007069 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7bnq2" Feb 19 15:16:17 crc kubenswrapper[4810]: I0219 15:16:17.029034 4810 scope.go:117] "RemoveContainer" containerID="5251301af1540af7d5ebbbedf95175c9c2ed8c2deb1460dfe0ea2da0c4971799" Feb 19 15:16:17 crc kubenswrapper[4810]: E0219 15:16:17.029510 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5251301af1540af7d5ebbbedf95175c9c2ed8c2deb1460dfe0ea2da0c4971799\": container with ID starting with 5251301af1540af7d5ebbbedf95175c9c2ed8c2deb1460dfe0ea2da0c4971799 not found: ID does not exist" containerID="5251301af1540af7d5ebbbedf95175c9c2ed8c2deb1460dfe0ea2da0c4971799" Feb 19 15:16:17 crc kubenswrapper[4810]: I0219 15:16:17.029556 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5251301af1540af7d5ebbbedf95175c9c2ed8c2deb1460dfe0ea2da0c4971799"} err="failed to get container status \"5251301af1540af7d5ebbbedf95175c9c2ed8c2deb1460dfe0ea2da0c4971799\": rpc error: code = NotFound desc = could not find container \"5251301af1540af7d5ebbbedf95175c9c2ed8c2deb1460dfe0ea2da0c4971799\": container with ID starting with 5251301af1540af7d5ebbbedf95175c9c2ed8c2deb1460dfe0ea2da0c4971799 not found: ID does not exist" Feb 19 15:16:17 crc kubenswrapper[4810]: I0219 15:16:17.055938 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7bnq2"] Feb 19 15:16:17 crc kubenswrapper[4810]: I0219 15:16:17.062494 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7bnq2"] Feb 19 15:16:17 crc kubenswrapper[4810]: I0219 15:16:17.450845 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6cb5092-5f01-4dd9-a940-804d88907744" path="/var/lib/kubelet/pods/a6cb5092-5f01-4dd9-a940-804d88907744/volumes" Feb 19 15:16:19 crc kubenswrapper[4810]: I0219 15:16:19.538141 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:16:19 crc kubenswrapper[4810]: I0219 15:16:19.538552 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:16:19 crc kubenswrapper[4810]: I0219 15:16:19.538605 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 15:16:19 crc kubenswrapper[4810]: I0219 15:16:19.539255 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4a50af848183862ac97eb1221e89b46a5711d55d54c9bd96026946aef1766658"} pod="openshift-machine-config-operator/machine-config-daemon-t499d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 15:16:19 crc kubenswrapper[4810]: I0219 15:16:19.539335 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t499d" 
podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" containerID="cri-o://4a50af848183862ac97eb1221e89b46a5711d55d54c9bd96026946aef1766658" gracePeriod=600 Feb 19 15:16:20 crc kubenswrapper[4810]: I0219 15:16:20.026442 4810 generic.go:334] "Generic (PLEG): container finished" podID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerID="4a50af848183862ac97eb1221e89b46a5711d55d54c9bd96026946aef1766658" exitCode=0 Feb 19 15:16:20 crc kubenswrapper[4810]: I0219 15:16:20.026502 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerDied","Data":"4a50af848183862ac97eb1221e89b46a5711d55d54c9bd96026946aef1766658"} Feb 19 15:16:20 crc kubenswrapper[4810]: I0219 15:16:20.026867 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerStarted","Data":"b445a122b966a7403b8df4638cf97036239996a24e8ace1fab9b55e591849bf5"} Feb 19 15:16:20 crc kubenswrapper[4810]: I0219 15:16:20.026899 4810 scope.go:117] "RemoveContainer" containerID="ba65a8abc0841494ba4596019b005e3cbb3f3c0897bef615b51a2ab10aa9e651" Feb 19 15:18:19 crc kubenswrapper[4810]: I0219 15:18:19.538428 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:18:19 crc kubenswrapper[4810]: I0219 15:18:19.539063 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:18:31 crc kubenswrapper[4810]: I0219 15:18:31.729302 4810 scope.go:117] "RemoveContainer" containerID="ae8bc998d14d3fe5b46f631e4cd6f287af277e6334648d3823ae6a448b5c6c06" Feb 19 15:18:49 crc kubenswrapper[4810]: I0219 15:18:49.538012 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:18:49 crc kubenswrapper[4810]: I0219 15:18:49.539190 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:19:19 crc kubenswrapper[4810]: I0219 15:19:19.537487 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:19:19 crc kubenswrapper[4810]: I0219 15:19:19.539520 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:19:19 crc kubenswrapper[4810]: I0219 15:19:19.539601 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 15:19:19 crc kubenswrapper[4810]: I0219 15:19:19.540528 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b445a122b966a7403b8df4638cf97036239996a24e8ace1fab9b55e591849bf5"} pod="openshift-machine-config-operator/machine-config-daemon-t499d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 15:19:19 crc kubenswrapper[4810]: I0219 15:19:19.540632 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" containerID="cri-o://b445a122b966a7403b8df4638cf97036239996a24e8ace1fab9b55e591849bf5" gracePeriod=600 Feb 19 15:19:20 crc kubenswrapper[4810]: I0219 15:19:20.452477 4810 generic.go:334] "Generic (PLEG): container finished" podID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerID="b445a122b966a7403b8df4638cf97036239996a24e8ace1fab9b55e591849bf5" exitCode=0 Feb 19 15:19:20 crc kubenswrapper[4810]: I0219 15:19:20.452568 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerDied","Data":"b445a122b966a7403b8df4638cf97036239996a24e8ace1fab9b55e591849bf5"} Feb 19 15:19:20 crc kubenswrapper[4810]: I0219 15:19:20.452652 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerStarted","Data":"946b41284ed03248aebd830c7fb80426be59078e4ea2a93cd09930514fedec98"} Feb 19 15:19:20 crc kubenswrapper[4810]: I0219 15:19:20.452672 4810 scope.go:117] "RemoveContainer" containerID="4a50af848183862ac97eb1221e89b46a5711d55d54c9bd96026946aef1766658" Feb 19 15:19:31 crc kubenswrapper[4810]: I0219 15:19:31.770900 4810 scope.go:117] "RemoveContainer" containerID="a2b82759b901660a79c47e74ce7b041d5164f2bc3f90bc7fb84fbbfb09e3a3e7" Feb 19 15:19:31 crc kubenswrapper[4810]: I0219 15:19:31.799806 4810 scope.go:117] "RemoveContainer" containerID="32a771c22d7bf5ec3610bad3dd5fe81ac2ad5a407f7d5f0293d46b10d47c380d" Feb 19 15:19:31 crc kubenswrapper[4810]: I0219 15:19:31.817518 4810 scope.go:117] "RemoveContainer" containerID="471901a772079c16e7d5c328b745070f868fa529208773cf4e67ab79b2945769" Feb 19 15:20:43 crc kubenswrapper[4810]: I0219 15:20:43.935451 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-x4csq"] Feb 19 15:20:43 crc kubenswrapper[4810]: E0219 15:20:43.936440 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6cb5092-5f01-4dd9-a940-804d88907744" containerName="registry" Feb 19 15:20:43 crc kubenswrapper[4810]: I0219 15:20:43.936464 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6cb5092-5f01-4dd9-a940-804d88907744" containerName="registry" Feb 19 15:20:43 crc kubenswrapper[4810]: I0219 15:20:43.936645 4810 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a6cb5092-5f01-4dd9-a940-804d88907744" containerName="registry" Feb 19 15:20:43 crc kubenswrapper[4810]: I0219 15:20:43.937202 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-x4csq" Feb 19 15:20:43 crc kubenswrapper[4810]: I0219 15:20:43.946083 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 19 15:20:43 crc kubenswrapper[4810]: I0219 15:20:43.946175 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 19 15:20:43 crc kubenswrapper[4810]: I0219 15:20:43.946205 4810 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-cnlds" Feb 19 15:20:43 crc kubenswrapper[4810]: I0219 15:20:43.960057 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-fwh4x"] Feb 19 15:20:43 crc kubenswrapper[4810]: I0219 15:20:43.961731 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-fwh4x" Feb 19 15:20:43 crc kubenswrapper[4810]: I0219 15:20:43.963479 4810 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-pghqb" Feb 19 15:20:43 crc kubenswrapper[4810]: I0219 15:20:43.964677 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-x4csq"] Feb 19 15:20:43 crc kubenswrapper[4810]: I0219 15:20:43.969499 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-7lspv"] Feb 19 15:20:43 crc kubenswrapper[4810]: I0219 15:20:43.970353 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-7lspv" Feb 19 15:20:43 crc kubenswrapper[4810]: I0219 15:20:43.972022 4810 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-7dgbh" Feb 19 15:20:43 crc kubenswrapper[4810]: I0219 15:20:43.979547 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-fwh4x"] Feb 19 15:20:43 crc kubenswrapper[4810]: I0219 15:20:43.991995 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-7lspv"] Feb 19 15:20:43 crc kubenswrapper[4810]: I0219 15:20:43.997973 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpcbb\" (UniqueName: \"kubernetes.io/projected/54e755f0-9c2f-4d47-9979-b7b92996bab6-kube-api-access-mpcbb\") pod \"cert-manager-cainjector-cf98fcc89-x4csq\" (UID: \"54e755f0-9c2f-4d47-9979-b7b92996bab6\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-x4csq" Feb 19 15:20:43 crc kubenswrapper[4810]: I0219 15:20:43.998020 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gh9f\" (UniqueName: \"kubernetes.io/projected/02206e32-6f49-407e-a02b-ce61e3daabf6-kube-api-access-5gh9f\") pod \"cert-manager-858654f9db-fwh4x\" (UID: \"02206e32-6f49-407e-a02b-ce61e3daabf6\") " pod="cert-manager/cert-manager-858654f9db-fwh4x" Feb 19 15:20:43 crc kubenswrapper[4810]: I0219 15:20:43.998046 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tf7b\" (UniqueName: \"kubernetes.io/projected/1224bbe4-6d8e-410e-8990-3813efdd2003-kube-api-access-4tf7b\") pod \"cert-manager-webhook-687f57d79b-7lspv\" (UID: \"1224bbe4-6d8e-410e-8990-3813efdd2003\") " pod="cert-manager/cert-manager-webhook-687f57d79b-7lspv" Feb 19 15:20:44 crc kubenswrapper[4810]: I0219 15:20:44.098536 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpcbb\" (UniqueName: \"kubernetes.io/projected/54e755f0-9c2f-4d47-9979-b7b92996bab6-kube-api-access-mpcbb\") pod \"cert-manager-cainjector-cf98fcc89-x4csq\" (UID: \"54e755f0-9c2f-4d47-9979-b7b92996bab6\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-x4csq" Feb 19 15:20:44 crc kubenswrapper[4810]: I0219 15:20:44.098586 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gh9f\" (UniqueName: \"kubernetes.io/projected/02206e32-6f49-407e-a02b-ce61e3daabf6-kube-api-access-5gh9f\") pod \"cert-manager-858654f9db-fwh4x\" (UID: \"02206e32-6f49-407e-a02b-ce61e3daabf6\") " pod="cert-manager/cert-manager-858654f9db-fwh4x" Feb 19 15:20:44 crc kubenswrapper[4810]: I0219 15:20:44.098603 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tf7b\" (UniqueName: \"kubernetes.io/projected/1224bbe4-6d8e-410e-8990-3813efdd2003-kube-api-access-4tf7b\") pod \"cert-manager-webhook-687f57d79b-7lspv\" (UID: \"1224bbe4-6d8e-410e-8990-3813efdd2003\") " pod="cert-manager/cert-manager-webhook-687f57d79b-7lspv" Feb 19 15:20:44 crc kubenswrapper[4810]: I0219 15:20:44.117389 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tf7b\" (UniqueName: \"kubernetes.io/projected/1224bbe4-6d8e-410e-8990-3813efdd2003-kube-api-access-4tf7b\") pod \"cert-manager-webhook-687f57d79b-7lspv\" (UID: \"1224bbe4-6d8e-410e-8990-3813efdd2003\") " 
pod="cert-manager/cert-manager-webhook-687f57d79b-7lspv" Feb 19 15:20:44 crc kubenswrapper[4810]: I0219 15:20:44.121966 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpcbb\" (UniqueName: \"kubernetes.io/projected/54e755f0-9c2f-4d47-9979-b7b92996bab6-kube-api-access-mpcbb\") pod \"cert-manager-cainjector-cf98fcc89-x4csq\" (UID: \"54e755f0-9c2f-4d47-9979-b7b92996bab6\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-x4csq" Feb 19 15:20:44 crc kubenswrapper[4810]: I0219 15:20:44.122534 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gh9f\" (UniqueName: \"kubernetes.io/projected/02206e32-6f49-407e-a02b-ce61e3daabf6-kube-api-access-5gh9f\") pod \"cert-manager-858654f9db-fwh4x\" (UID: \"02206e32-6f49-407e-a02b-ce61e3daabf6\") " pod="cert-manager/cert-manager-858654f9db-fwh4x" Feb 19 15:20:44 crc kubenswrapper[4810]: I0219 15:20:44.268418 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-x4csq" Feb 19 15:20:44 crc kubenswrapper[4810]: I0219 15:20:44.291914 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-fwh4x" Feb 19 15:20:44 crc kubenswrapper[4810]: I0219 15:20:44.303145 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-7lspv" Feb 19 15:20:44 crc kubenswrapper[4810]: I0219 15:20:44.488654 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-x4csq"] Feb 19 15:20:44 crc kubenswrapper[4810]: I0219 15:20:44.492848 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 15:20:44 crc kubenswrapper[4810]: I0219 15:20:44.525539 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-fwh4x"] Feb 19 15:20:44 crc kubenswrapper[4810]: I0219 15:20:44.554152 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-7lspv"] Feb 19 15:20:44 crc kubenswrapper[4810]: W0219 15:20:44.556187 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1224bbe4_6d8e_410e_8990_3813efdd2003.slice/crio-4492040e017c7e7b949146ade8671463d53259f0ae04d8eb151c438342128a85 WatchSource:0}: Error finding container 4492040e017c7e7b949146ade8671463d53259f0ae04d8eb151c438342128a85: Status 404 returned error can't find the container with id 4492040e017c7e7b949146ade8671463d53259f0ae04d8eb151c438342128a85 Feb 19 15:20:45 crc kubenswrapper[4810]: I0219 15:20:45.005613 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-x4csq" event={"ID":"54e755f0-9c2f-4d47-9979-b7b92996bab6","Type":"ContainerStarted","Data":"34966eab10f57cd468bea1e6ace89e02275506b5d5987b7abee5a409a9579582"} Feb 19 15:20:45 crc kubenswrapper[4810]: I0219 15:20:45.007571 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-7lspv" event={"ID":"1224bbe4-6d8e-410e-8990-3813efdd2003","Type":"ContainerStarted","Data":"4492040e017c7e7b949146ade8671463d53259f0ae04d8eb151c438342128a85"} Feb 19 15:20:45 crc kubenswrapper[4810]: I0219 15:20:45.010319 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-fwh4x" 
event={"ID":"02206e32-6f49-407e-a02b-ce61e3daabf6","Type":"ContainerStarted","Data":"f81ea8ac77dddbe786b7c05794b0649b0364255186fe2d40d0ad0b0100e5ad04"} Feb 19 15:20:49 crc kubenswrapper[4810]: I0219 15:20:49.049052 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-x4csq" event={"ID":"54e755f0-9c2f-4d47-9979-b7b92996bab6","Type":"ContainerStarted","Data":"32a1a5f24be62818a31943591cd66d0df2271978a9ec37db4bca7012dc876bb1"} Feb 19 15:20:49 crc kubenswrapper[4810]: I0219 15:20:49.051493 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-7lspv" event={"ID":"1224bbe4-6d8e-410e-8990-3813efdd2003","Type":"ContainerStarted","Data":"f7f7d6675897c68de79ddd0d947a436e8753b1a7d6278661a00a72ce0eb1817b"} Feb 19 15:20:49 crc kubenswrapper[4810]: I0219 15:20:49.051821 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-7lspv" Feb 19 15:20:49 crc kubenswrapper[4810]: I0219 15:20:49.052645 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-fwh4x" event={"ID":"02206e32-6f49-407e-a02b-ce61e3daabf6","Type":"ContainerStarted","Data":"249caeb282aa20b7841be3cdf691e55edd7b79f0db132244419e3bfc63afab84"} Feb 19 15:20:49 crc kubenswrapper[4810]: I0219 15:20:49.066289 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-x4csq" podStartSLOduration=1.916233095 podStartE2EDuration="6.066268055s" podCreationTimestamp="2026-02-19 15:20:43 +0000 UTC" firstStartedPulling="2026-02-19 15:20:44.492585031 +0000 UTC m=+673.974615155" lastFinishedPulling="2026-02-19 15:20:48.642619981 +0000 UTC m=+678.124650115" observedRunningTime="2026-02-19 15:20:49.065967878 +0000 UTC m=+678.547998022" watchObservedRunningTime="2026-02-19 15:20:49.066268055 +0000 UTC m=+678.548298179" Feb 19 15:20:49 crc kubenswrapper[4810]: I0219 15:20:49.096225 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-7lspv" podStartSLOduration=2.013537558 podStartE2EDuration="6.096202415s" podCreationTimestamp="2026-02-19 15:20:43 +0000 UTC" firstStartedPulling="2026-02-19 15:20:44.558805056 +0000 UTC m=+674.040835180" lastFinishedPulling="2026-02-19 15:20:48.641469913 +0000 UTC m=+678.123500037" observedRunningTime="2026-02-19 15:20:49.091796207 +0000 UTC m=+678.573826331" watchObservedRunningTime="2026-02-19 15:20:49.096202415 +0000 UTC m=+678.578232539" Feb 19 15:20:49 crc kubenswrapper[4810]: I0219 15:20:49.107521 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-fwh4x" podStartSLOduration=1.893134662 podStartE2EDuration="6.10750808s" podCreationTimestamp="2026-02-19 15:20:43 +0000 UTC" firstStartedPulling="2026-02-19 15:20:44.528406524 +0000 UTC m=+674.010436638" lastFinishedPulling="2026-02-19 15:20:48.742779932 +0000 UTC m=+678.224810056" observedRunningTime="2026-02-19 15:20:49.105804929 +0000 UTC m=+678.587835053" watchObservedRunningTime="2026-02-19 15:20:49.10750808 +0000 UTC m=+678.589538204" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.308391 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-7lspv" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.329929 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-node-8k7p5"] Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.331123 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="ovn-controller" containerID="cri-o://5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459" gracePeriod=30 Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.331393 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d" gracePeriod=30 Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.331527 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="kube-rbac-proxy-node" containerID="cri-o://6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1" gracePeriod=30 Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.331187 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="northd" containerID="cri-o://0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b" gracePeriod=30 Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.331531 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="sbdb" containerID="cri-o://e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6" gracePeriod=30 Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.331790 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="ovn-acl-logging" containerID="cri-o://702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663" gracePeriod=30 Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.331854 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="nbdb" containerID="cri-o://5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92" gracePeriod=30 Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.374398 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="ovnkube-controller" containerID="cri-o://07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b" gracePeriod=30 Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.633549 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8k7p5_c5a8a15c-53e8-4868-8feb-dcd4e83939a4/ovnkube-controller/3.log" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.636716 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8k7p5_c5a8a15c-53e8-4868-8feb-dcd4e83939a4/ovn-acl-logging/0.log" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.637575 4810 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8k7p5_c5a8a15c-53e8-4868-8feb-dcd4e83939a4/ovn-controller/0.log" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.638037 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.702884 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8zf4d"] Feb 19 15:20:54 crc kubenswrapper[4810]: E0219 15:20:54.703186 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="ovn-controller" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.703215 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="ovn-controller" Feb 19 15:20:54 crc kubenswrapper[4810]: E0219 15:20:54.703234 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="ovnkube-controller" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.703246 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="ovnkube-controller" Feb 19 15:20:54 crc kubenswrapper[4810]: E0219 15:20:54.703261 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="kube-rbac-proxy-ovn-metrics" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.703274 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="kube-rbac-proxy-ovn-metrics" Feb 19 15:20:54 crc kubenswrapper[4810]: E0219 15:20:54.703299 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="northd" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.703311 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="northd" Feb 19 15:20:54 crc kubenswrapper[4810]: E0219 15:20:54.703324 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="ovnkube-controller" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.703361 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="ovnkube-controller" Feb 19 15:20:54 crc kubenswrapper[4810]: E0219 15:20:54.703375 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="kubecfg-setup" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.703387 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="kubecfg-setup" Feb 19 15:20:54 crc kubenswrapper[4810]: E0219 15:20:54.703413 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="kube-rbac-proxy-node" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.703425 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="kube-rbac-proxy-node" Feb 19 15:20:54 crc kubenswrapper[4810]: E0219 15:20:54.703441 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="sbdb" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.703453 4810 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="sbdb" Feb 19 15:20:54 crc kubenswrapper[4810]: E0219 15:20:54.703472 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="nbdb" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.703483 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="nbdb" Feb 19 15:20:54 crc kubenswrapper[4810]: E0219 15:20:54.703498 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="ovnkube-controller" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.703509 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="ovnkube-controller" Feb 19 15:20:54 crc kubenswrapper[4810]: E0219 15:20:54.703525 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="ovn-acl-logging" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.703537 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="ovn-acl-logging" Feb 19 15:20:54 crc kubenswrapper[4810]: E0219 15:20:54.703551 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="ovnkube-controller" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.703563 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="ovnkube-controller" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.703716 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="kube-rbac-proxy-node" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.703736 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="ovnkube-controller" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.703749 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="ovn-controller" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.703767 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="nbdb" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.703783 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="kube-rbac-proxy-ovn-metrics" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.703801 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="ovnkube-controller" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.703814 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="sbdb" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.703833 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="ovnkube-controller" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.703848 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="northd" Feb 19 15:20:54 crc kubenswrapper[4810]: 
I0219 15:20:54.703862 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="ovnkube-controller" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.703880 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="ovn-acl-logging" Feb 19 15:20:54 crc kubenswrapper[4810]: E0219 15:20:54.704077 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="ovnkube-controller" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.704103 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="ovnkube-controller" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.704264 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerName="ovnkube-controller" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.707145 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.762396 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-run-ovn\") pod \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.762449 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-slash\") pod \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.762471 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-log-socket\") pod \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.762500 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.762546 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-env-overrides\") pod \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.762570 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-ovnkube-config\") pod \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.762595 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
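[Editorial aside] The burst of cpu_manager, state_mem, and memory_manager entries above runs before the replacement pod is admitted: RemoveStaleState drops the per-container CPU and memory assignments left behind by the deleted pod, and each "Deleted CPUSet assignment" line confirms one removal. ovnkube-controller shows up repeatedly because each of its earlier restarts left its own record; kubecfg-setup is the old pod's init container. A hedged tally of the RemoveStaleState messages, same stdin convention as the sketch above:

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
        "sort"
    )

    // Covers both message variants seen above:
    //   "RemoveStaleState: removing container"  (cpu_manager)
    //   "RemoveStaleState removing state"       (memory_manager)
    var staleRe = regexp.MustCompile(
        `"RemoveStaleState(?:: removing container| removing state)".*containerName="([^"]+)"`)

    func main() {
        counts := map[string]int{}
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 1024*1024), 1024*1024)
        for sc.Scan() {
            if m := staleRe.FindStringSubmatch(sc.Text()); m != nil {
                counts[m[1]]++
            }
        }
        names := make([]string, 0, len(counts))
        for n := range counts {
            names = append(names, n)
        }
        sort.Strings(names)
        for _, n := range names {
            fmt.Printf("%-30s %d stale record(s)\n", n, counts[n])
        }
    }
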
\"kubernetes.io/secret/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-ovn-node-metrics-cert\") pod \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.762555 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "c5a8a15c-53e8-4868-8feb-dcd4e83939a4" (UID: "c5a8a15c-53e8-4868-8feb-dcd4e83939a4"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.762625 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-etc-openvswitch\") pod \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.762649 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-var-lib-openvswitch\") pod \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.762666 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-run-netns\") pod \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.762664 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "c5a8a15c-53e8-4868-8feb-dcd4e83939a4" (UID: "c5a8a15c-53e8-4868-8feb-dcd4e83939a4"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.762691 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-systemd-units\") pod \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.762704 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-slash" (OuterVolumeSpecName: "host-slash") pod "c5a8a15c-53e8-4868-8feb-dcd4e83939a4" (UID: "c5a8a15c-53e8-4868-8feb-dcd4e83939a4"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.762714 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-kubelet\") pod \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.762769 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "c5a8a15c-53e8-4868-8feb-dcd4e83939a4" (UID: "c5a8a15c-53e8-4868-8feb-dcd4e83939a4"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.762789 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-ovnkube-script-lib\") pod \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.762843 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-run-ovn-kubernetes\") pod \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.762877 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-run-systemd\") pod \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.762915 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-run-openvswitch\") pod \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.762948 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-cni-netd\") pod \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.763001 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7xqv\" (UniqueName: \"kubernetes.io/projected/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-kube-api-access-v7xqv\") pod \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.763031 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-node-log\") pod \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.763078 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-cni-bin\") pod \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\" (UID: \"c5a8a15c-53e8-4868-8feb-dcd4e83939a4\") " Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.763192 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "c5a8a15c-53e8-4868-8feb-dcd4e83939a4" (UID: "c5a8a15c-53e8-4868-8feb-dcd4e83939a4"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.763288 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "c5a8a15c-53e8-4868-8feb-dcd4e83939a4" (UID: "c5a8a15c-53e8-4868-8feb-dcd4e83939a4"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.763293 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "c5a8a15c-53e8-4868-8feb-dcd4e83939a4" (UID: "c5a8a15c-53e8-4868-8feb-dcd4e83939a4"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.763372 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "c5a8a15c-53e8-4868-8feb-dcd4e83939a4" (UID: "c5a8a15c-53e8-4868-8feb-dcd4e83939a4"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.763413 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-node-log" (OuterVolumeSpecName: "node-log") pod "c5a8a15c-53e8-4868-8feb-dcd4e83939a4" (UID: "c5a8a15c-53e8-4868-8feb-dcd4e83939a4"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.763687 4810 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.763729 4810 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.763748 4810 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.763766 4810 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.763783 4810 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-node-log\") on node \"crc\" DevicePath \"\"" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.763802 4810 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.763696 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "c5a8a15c-53e8-4868-8feb-dcd4e83939a4" (UID: "c5a8a15c-53e8-4868-8feb-dcd4e83939a4"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.763720 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "c5a8a15c-53e8-4868-8feb-dcd4e83939a4" (UID: "c5a8a15c-53e8-4868-8feb-dcd4e83939a4"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.763743 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-log-socket" (OuterVolumeSpecName: "log-socket") pod "c5a8a15c-53e8-4868-8feb-dcd4e83939a4" (UID: "c5a8a15c-53e8-4868-8feb-dcd4e83939a4"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.763762 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "c5a8a15c-53e8-4868-8feb-dcd4e83939a4" (UID: "c5a8a15c-53e8-4868-8feb-dcd4e83939a4"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.763775 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "c5a8a15c-53e8-4868-8feb-dcd4e83939a4" (UID: "c5a8a15c-53e8-4868-8feb-dcd4e83939a4"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.763793 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "c5a8a15c-53e8-4868-8feb-dcd4e83939a4" (UID: "c5a8a15c-53e8-4868-8feb-dcd4e83939a4"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.763876 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "c5a8a15c-53e8-4868-8feb-dcd4e83939a4" (UID: "c5a8a15c-53e8-4868-8feb-dcd4e83939a4"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.764132 4810 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.764157 4810 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-slash\") on node \"crc\" DevicePath \"\"" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.764172 4810 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.764341 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "c5a8a15c-53e8-4868-8feb-dcd4e83939a4" (UID: "c5a8a15c-53e8-4868-8feb-dcd4e83939a4"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.768693 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "c5a8a15c-53e8-4868-8feb-dcd4e83939a4" (UID: "c5a8a15c-53e8-4868-8feb-dcd4e83939a4"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.770674 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-kube-api-access-v7xqv" (OuterVolumeSpecName: "kube-api-access-v7xqv") pod "c5a8a15c-53e8-4868-8feb-dcd4e83939a4" (UID: "c5a8a15c-53e8-4868-8feb-dcd4e83939a4"). InnerVolumeSpecName "kube-api-access-v7xqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.776980 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "c5a8a15c-53e8-4868-8feb-dcd4e83939a4" (UID: "c5a8a15c-53e8-4868-8feb-dcd4e83939a4"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.869163 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-host-kubelet\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.869247 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-run-systemd\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.869283 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvfdt\" (UniqueName: \"kubernetes.io/projected/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-kube-api-access-xvfdt\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.869395 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-run-openvswitch\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.869431 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-ovn-node-metrics-cert\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.869464 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-host-run-netns\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.869504 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-run-ovn\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.869534 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-host-cni-netd\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.869574 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-host-slash\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.869604 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-ovnkube-config\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.869652 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-log-socket\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.869682 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-env-overrides\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.869723 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-etc-openvswitch\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.869753 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-systemd-units\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.869792 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-var-lib-openvswitch\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.869829 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-host-run-ovn-kubernetes\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.869869 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-node-log\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.869904 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-host-cni-bin\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.869952 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.869995 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-ovnkube-script-lib\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.870166 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7xqv\" (UniqueName: \"kubernetes.io/projected/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-kube-api-access-v7xqv\") on node \"crc\" DevicePath \"\"" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.870191 4810 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-log-socket\") on node \"crc\" DevicePath \"\"" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.870211 4810 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.870231 4810 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.870249 4810 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.870268 4810 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.870286 4810 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.870305 4810 
reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.870345 4810 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.870364 4810 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.870381 4810 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5a8a15c-53e8-4868-8feb-dcd4e83939a4-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.971318 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-host-cni-bin\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.971433 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.971439 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-host-cni-bin\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.971487 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-ovnkube-script-lib\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.971528 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.971539 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-host-kubelet\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.971580 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xvfdt\" (UniqueName: \"kubernetes.io/projected/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-kube-api-access-xvfdt\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.971610 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-run-systemd\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.971644 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-run-openvswitch\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.971671 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-ovn-node-metrics-cert\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.971684 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-host-kubelet\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.971701 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-host-run-netns\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.971751 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-host-run-netns\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.971755 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-run-systemd\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.971794 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-run-ovn\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.971845 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-host-cni-netd\") pod 
\"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.971846 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-run-openvswitch\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.971894 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-host-slash\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.971940 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-ovnkube-config\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.971948 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-run-ovn\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.971998 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-log-socket\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.972016 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-host-slash\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.972043 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-env-overrides\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.972084 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-log-socket\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.972117 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-etc-openvswitch\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 
15:20:54.972164 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-systemd-units\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.972192 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-etc-openvswitch\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.972222 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-var-lib-openvswitch\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.972243 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-systemd-units\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.972371 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-var-lib-openvswitch\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.972316 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-host-run-ovn-kubernetes\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.972425 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-host-run-ovn-kubernetes\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.972446 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-node-log\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.972475 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-node-log\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.971965 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" 
(UniqueName: \"kubernetes.io/host-path/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-host-cni-netd\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.972935 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-ovnkube-script-lib\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.972997 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-env-overrides\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.973097 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-ovnkube-config\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.976229 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-ovn-node-metrics-cert\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:54 crc kubenswrapper[4810]: I0219 15:20:54.990303 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvfdt\" (UniqueName: \"kubernetes.io/projected/6ac1d960-0c17-4a48-8944-3eb4bb640ddf-kube-api-access-xvfdt\") pod \"ovnkube-node-8zf4d\" (UID: \"6ac1d960-0c17-4a48-8944-3eb4bb640ddf\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.023161 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.090247 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8k7p5_c5a8a15c-53e8-4868-8feb-dcd4e83939a4/ovnkube-controller/3.log" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.093833 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8k7p5_c5a8a15c-53e8-4868-8feb-dcd4e83939a4/ovn-acl-logging/0.log" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.094775 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8k7p5_c5a8a15c-53e8-4868-8feb-dcd4e83939a4/ovn-controller/0.log" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.095259 4810 generic.go:334] "Generic (PLEG): container finished" podID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerID="07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b" exitCode=0 Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.095298 4810 generic.go:334] "Generic (PLEG): container finished" podID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerID="e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6" exitCode=0 Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.095316 4810 generic.go:334] "Generic (PLEG): container finished" podID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerID="5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92" exitCode=0 Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.095357 4810 generic.go:334] "Generic (PLEG): container finished" podID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerID="0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b" exitCode=0 Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.095376 4810 generic.go:334] "Generic (PLEG): container finished" podID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerID="879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d" exitCode=0 Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.095394 4810 generic.go:334] "Generic (PLEG): container finished" podID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerID="6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1" exitCode=0 Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.095411 4810 generic.go:334] "Generic (PLEG): container finished" podID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerID="702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663" exitCode=143 Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.095430 4810 generic.go:334] "Generic (PLEG): container finished" podID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" containerID="5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459" exitCode=143 Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.095500 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.095486 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" event={"ID":"c5a8a15c-53e8-4868-8feb-dcd4e83939a4","Type":"ContainerDied","Data":"07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.095688 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" event={"ID":"c5a8a15c-53e8-4868-8feb-dcd4e83939a4","Type":"ContainerDied","Data":"e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.095736 4810 scope.go:117] "RemoveContainer" containerID="07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.095767 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" event={"ID":"c5a8a15c-53e8-4868-8feb-dcd4e83939a4","Type":"ContainerDied","Data":"5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.095794 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" event={"ID":"c5a8a15c-53e8-4868-8feb-dcd4e83939a4","Type":"ContainerDied","Data":"0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.095847 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" event={"ID":"c5a8a15c-53e8-4868-8feb-dcd4e83939a4","Type":"ContainerDied","Data":"879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.095867 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" event={"ID":"c5a8a15c-53e8-4868-8feb-dcd4e83939a4","Type":"ContainerDied","Data":"6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.095889 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.095936 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.095948 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.095960 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.095972 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.095982 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096021 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096035 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096045 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096060 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" event={"ID":"c5a8a15c-53e8-4868-8feb-dcd4e83939a4","Type":"ContainerDied","Data":"702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096077 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096120 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096130 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096141 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096153 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096163 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096203 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096213 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096224 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096235 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096251 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" event={"ID":"c5a8a15c-53e8-4868-8feb-dcd4e83939a4","Type":"ContainerDied","Data":"5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096291 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096304 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096315 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096364 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096377 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096388 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096400 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096414 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096463 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096475 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096490 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8k7p5" event={"ID":"c5a8a15c-53e8-4868-8feb-dcd4e83939a4","Type":"ContainerDied","Data":"18f9a8e62a518ac5ed0415309aa8b6acc10ac0aef8f801aae58cbe85d9127027"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096508 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096551 4810 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096562 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096572 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096583 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096594 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096633 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096644 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096656 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.096666 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.098573 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" event={"ID":"6ac1d960-0c17-4a48-8944-3eb4bb640ddf","Type":"ContainerStarted","Data":"935c0ff3f82d9b767de715fd41fc9226da2a899a73c7af7bdb5c3326cb019546"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.100636 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bsztz_2a45a199-beeb-4972-b796-15c958fe99d3/kube-multus/2.log" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.101213 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bsztz_2a45a199-beeb-4972-b796-15c958fe99d3/kube-multus/1.log" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.101265 4810 generic.go:334] "Generic (PLEG): container finished" podID="2a45a199-beeb-4972-b796-15c958fe99d3" containerID="92e30886e49380d7d876397116f7db4e85388275b36d2f8ee0ab84b9167f3dde" exitCode=2 Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.101297 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bsztz" event={"ID":"2a45a199-beeb-4972-b796-15c958fe99d3","Type":"ContainerDied","Data":"92e30886e49380d7d876397116f7db4e85388275b36d2f8ee0ab84b9167f3dde"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.101322 4810 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c1f575901c6e6174387489f4f3a7d8cfa46089db51556b6acc9245faaf36a9ca"} Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.102042 4810 scope.go:117] "RemoveContainer" containerID="92e30886e49380d7d876397116f7db4e85388275b36d2f8ee0ab84b9167f3dde" Feb 19 15:20:55 crc kubenswrapper[4810]: E0219 15:20:55.102538 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-bsztz_openshift-multus(2a45a199-beeb-4972-b796-15c958fe99d3)\"" pod="openshift-multus/multus-bsztz" podUID="2a45a199-beeb-4972-b796-15c958fe99d3" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.126597 4810 scope.go:117] "RemoveContainer" containerID="824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.157084 4810 scope.go:117] "RemoveContainer" containerID="e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.179730 4810 scope.go:117] "RemoveContainer" containerID="5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.185284 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8k7p5"] Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.193069 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8k7p5"] Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.195425 4810 scope.go:117] "RemoveContainer" containerID="0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.252484 4810 scope.go:117] "RemoveContainer" containerID="879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.281286 4810 scope.go:117] "RemoveContainer" containerID="6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.297229 4810 scope.go:117] "RemoveContainer" containerID="702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.310291 4810 scope.go:117] "RemoveContainer" containerID="5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.324077 4810 scope.go:117] "RemoveContainer" containerID="3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.337072 4810 scope.go:117] "RemoveContainer" containerID="07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b" Feb 19 15:20:55 crc kubenswrapper[4810]: E0219 15:20:55.337615 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b\": container with ID starting with 07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b not found: ID does not exist" containerID="07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.337645 4810 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b"} err="failed to get container status \"07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b\": rpc error: code = NotFound desc = could not find container \"07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b\": container with ID starting with 07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.337665 4810 scope.go:117] "RemoveContainer" containerID="824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4" Feb 19 15:20:55 crc kubenswrapper[4810]: E0219 15:20:55.337866 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4\": container with ID starting with 824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4 not found: ID does not exist" containerID="824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.337889 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4"} err="failed to get container status \"824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4\": rpc error: code = NotFound desc = could not find container \"824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4\": container with ID starting with 824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4 not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.337903 4810 scope.go:117] "RemoveContainer" containerID="e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6" Feb 19 15:20:55 crc kubenswrapper[4810]: E0219 15:20:55.338343 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\": container with ID starting with e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6 not found: ID does not exist" containerID="e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.338429 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6"} err="failed to get container status \"e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\": rpc error: code = NotFound desc = could not find container \"e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\": container with ID starting with e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6 not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.338478 4810 scope.go:117] "RemoveContainer" containerID="5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92" Feb 19 15:20:55 crc kubenswrapper[4810]: E0219 15:20:55.338820 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\": container with ID starting with 5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92 not found: ID does not exist" 
containerID="5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.338848 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92"} err="failed to get container status \"5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\": rpc error: code = NotFound desc = could not find container \"5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\": container with ID starting with 5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92 not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.338859 4810 scope.go:117] "RemoveContainer" containerID="0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b" Feb 19 15:20:55 crc kubenswrapper[4810]: E0219 15:20:55.339147 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\": container with ID starting with 0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b not found: ID does not exist" containerID="0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.339186 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b"} err="failed to get container status \"0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\": rpc error: code = NotFound desc = could not find container \"0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\": container with ID starting with 0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.339207 4810 scope.go:117] "RemoveContainer" containerID="879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d" Feb 19 15:20:55 crc kubenswrapper[4810]: E0219 15:20:55.339532 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\": container with ID starting with 879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d not found: ID does not exist" containerID="879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.339553 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d"} err="failed to get container status \"879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\": rpc error: code = NotFound desc = could not find container \"879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\": container with ID starting with 879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.339565 4810 scope.go:117] "RemoveContainer" containerID="6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1" Feb 19 15:20:55 crc kubenswrapper[4810]: E0219 15:20:55.339760 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\": container with ID starting with 6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1 not found: ID does not exist" containerID="6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.339780 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1"} err="failed to get container status \"6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\": rpc error: code = NotFound desc = could not find container \"6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\": container with ID starting with 6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1 not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.339792 4810 scope.go:117] "RemoveContainer" containerID="702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663" Feb 19 15:20:55 crc kubenswrapper[4810]: E0219 15:20:55.340026 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\": container with ID starting with 702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663 not found: ID does not exist" containerID="702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.340056 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663"} err="failed to get container status \"702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\": rpc error: code = NotFound desc = could not find container \"702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\": container with ID starting with 702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663 not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.340076 4810 scope.go:117] "RemoveContainer" containerID="5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459" Feb 19 15:20:55 crc kubenswrapper[4810]: E0219 15:20:55.340342 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\": container with ID starting with 5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459 not found: ID does not exist" containerID="5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.340365 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459"} err="failed to get container status \"5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\": rpc error: code = NotFound desc = could not find container \"5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\": container with ID starting with 5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459 not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.340379 4810 scope.go:117] "RemoveContainer" containerID="3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c" Feb 19 15:20:55 crc 
kubenswrapper[4810]: E0219 15:20:55.340567 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\": container with ID starting with 3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c not found: ID does not exist" containerID="3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.340589 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c"} err="failed to get container status \"3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\": rpc error: code = NotFound desc = could not find container \"3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\": container with ID starting with 3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.340601 4810 scope.go:117] "RemoveContainer" containerID="07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.340871 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b"} err="failed to get container status \"07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b\": rpc error: code = NotFound desc = could not find container \"07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b\": container with ID starting with 07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.340900 4810 scope.go:117] "RemoveContainer" containerID="824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.341152 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4"} err="failed to get container status \"824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4\": rpc error: code = NotFound desc = could not find container \"824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4\": container with ID starting with 824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4 not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.341173 4810 scope.go:117] "RemoveContainer" containerID="e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.341469 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6"} err="failed to get container status \"e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\": rpc error: code = NotFound desc = could not find container \"e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\": container with ID starting with e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6 not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.341505 4810 scope.go:117] "RemoveContainer" containerID="5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92" Feb 19 15:20:55 crc 
kubenswrapper[4810]: I0219 15:20:55.341865 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92"} err="failed to get container status \"5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\": rpc error: code = NotFound desc = could not find container \"5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\": container with ID starting with 5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92 not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.341888 4810 scope.go:117] "RemoveContainer" containerID="0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.342052 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b"} err="failed to get container status \"0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\": rpc error: code = NotFound desc = could not find container \"0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\": container with ID starting with 0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.342070 4810 scope.go:117] "RemoveContainer" containerID="879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.342297 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d"} err="failed to get container status \"879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\": rpc error: code = NotFound desc = could not find container \"879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\": container with ID starting with 879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.342362 4810 scope.go:117] "RemoveContainer" containerID="6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.342674 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1"} err="failed to get container status \"6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\": rpc error: code = NotFound desc = could not find container \"6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\": container with ID starting with 6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1 not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.342712 4810 scope.go:117] "RemoveContainer" containerID="702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.342983 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663"} err="failed to get container status \"702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\": rpc error: code = NotFound desc = could not find container \"702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\": container with ID 
starting with 702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663 not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.343002 4810 scope.go:117] "RemoveContainer" containerID="5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.343197 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459"} err="failed to get container status \"5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\": rpc error: code = NotFound desc = could not find container \"5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\": container with ID starting with 5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459 not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.343215 4810 scope.go:117] "RemoveContainer" containerID="3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.343537 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c"} err="failed to get container status \"3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\": rpc error: code = NotFound desc = could not find container \"3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\": container with ID starting with 3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.343599 4810 scope.go:117] "RemoveContainer" containerID="07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.343900 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b"} err="failed to get container status \"07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b\": rpc error: code = NotFound desc = could not find container \"07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b\": container with ID starting with 07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.343918 4810 scope.go:117] "RemoveContainer" containerID="824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.344215 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4"} err="failed to get container status \"824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4\": rpc error: code = NotFound desc = could not find container \"824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4\": container with ID starting with 824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4 not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.344274 4810 scope.go:117] "RemoveContainer" containerID="e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.344555 4810 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6"} err="failed to get container status \"e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\": rpc error: code = NotFound desc = could not find container \"e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\": container with ID starting with e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6 not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.344574 4810 scope.go:117] "RemoveContainer" containerID="5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.344730 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92"} err="failed to get container status \"5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\": rpc error: code = NotFound desc = could not find container \"5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\": container with ID starting with 5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92 not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.344749 4810 scope.go:117] "RemoveContainer" containerID="0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.344924 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b"} err="failed to get container status \"0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\": rpc error: code = NotFound desc = could not find container \"0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\": container with ID starting with 0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.344954 4810 scope.go:117] "RemoveContainer" containerID="879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.345167 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d"} err="failed to get container status \"879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\": rpc error: code = NotFound desc = could not find container \"879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\": container with ID starting with 879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.345186 4810 scope.go:117] "RemoveContainer" containerID="6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.345481 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1"} err="failed to get container status \"6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\": rpc error: code = NotFound desc = could not find container \"6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\": container with ID starting with 6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1 not found: ID does not exist" Feb 
19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.345545 4810 scope.go:117] "RemoveContainer" containerID="702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.345866 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663"} err="failed to get container status \"702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\": rpc error: code = NotFound desc = could not find container \"702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\": container with ID starting with 702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663 not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.345896 4810 scope.go:117] "RemoveContainer" containerID="5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.346109 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459"} err="failed to get container status \"5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\": rpc error: code = NotFound desc = could not find container \"5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\": container with ID starting with 5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459 not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.346139 4810 scope.go:117] "RemoveContainer" containerID="3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.346399 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c"} err="failed to get container status \"3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\": rpc error: code = NotFound desc = could not find container \"3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\": container with ID starting with 3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.346426 4810 scope.go:117] "RemoveContainer" containerID="07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.346705 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b"} err="failed to get container status \"07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b\": rpc error: code = NotFound desc = could not find container \"07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b\": container with ID starting with 07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.346740 4810 scope.go:117] "RemoveContainer" containerID="824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.347021 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4"} err="failed to get container status 
\"824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4\": rpc error: code = NotFound desc = could not find container \"824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4\": container with ID starting with 824c8634103fa360f2c3a6cb4c22c0185f87e2de6475beadb412027255de2cd4 not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.347043 4810 scope.go:117] "RemoveContainer" containerID="e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.347270 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6"} err="failed to get container status \"e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\": rpc error: code = NotFound desc = could not find container \"e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6\": container with ID starting with e21fe2d6964872913c8e5a2f983519806b72d0e2880016372ac90fc2b9f61fc6 not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.347307 4810 scope.go:117] "RemoveContainer" containerID="5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.347549 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92"} err="failed to get container status \"5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\": rpc error: code = NotFound desc = could not find container \"5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92\": container with ID starting with 5de90aae0b44ad98711b4e4c2764cf25c491a329a7acbe9a6ec49f88fb8e7c92 not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.347566 4810 scope.go:117] "RemoveContainer" containerID="0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.347757 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b"} err="failed to get container status \"0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\": rpc error: code = NotFound desc = could not find container \"0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b\": container with ID starting with 0c0568b462e1168949dcfd0131ef4e94d6c3f68dd31906ee5c6a8c87f7fef25b not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.347811 4810 scope.go:117] "RemoveContainer" containerID="879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.348051 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d"} err="failed to get container status \"879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\": rpc error: code = NotFound desc = could not find container \"879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d\": container with ID starting with 879052e78b8d383d0a7f4fbddb38b98fa931bec3d0d0c7ff7eb830ccb62be46d not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.348080 4810 scope.go:117] "RemoveContainer" 
containerID="6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.348395 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1"} err="failed to get container status \"6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\": rpc error: code = NotFound desc = could not find container \"6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1\": container with ID starting with 6dcca2640a7625737b151a65d63dac889139b41a8dff1cd5aae2ead1a37138f1 not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.348455 4810 scope.go:117] "RemoveContainer" containerID="702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.348699 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663"} err="failed to get container status \"702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\": rpc error: code = NotFound desc = could not find container \"702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663\": container with ID starting with 702a868ef507f31cb27f41ccd862f38ba8d5256e5d4af19a5d8b168939932663 not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.348746 4810 scope.go:117] "RemoveContainer" containerID="5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.349004 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459"} err="failed to get container status \"5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\": rpc error: code = NotFound desc = could not find container \"5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459\": container with ID starting with 5089067acda7084123f239150916500b11a0547068b0a39ee60a467c372f3459 not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.349055 4810 scope.go:117] "RemoveContainer" containerID="3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.349456 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c"} err="failed to get container status \"3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\": rpc error: code = NotFound desc = could not find container \"3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c\": container with ID starting with 3924ff75e9c9ac70988594877cb4d8bf52d6491e6335ed7cf78f43fc79728f7c not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.349492 4810 scope.go:117] "RemoveContainer" containerID="07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.349825 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b"} err="failed to get container status \"07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b\": rpc error: code = NotFound desc = could not find 
container \"07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b\": container with ID starting with 07a16a020eab7c7e2c4939aceb3c9448061dd4b2196bce8720223a0590498c5b not found: ID does not exist" Feb 19 15:20:55 crc kubenswrapper[4810]: I0219 15:20:55.449735 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5a8a15c-53e8-4868-8feb-dcd4e83939a4" path="/var/lib/kubelet/pods/c5a8a15c-53e8-4868-8feb-dcd4e83939a4/volumes" Feb 19 15:20:56 crc kubenswrapper[4810]: I0219 15:20:56.112460 4810 generic.go:334] "Generic (PLEG): container finished" podID="6ac1d960-0c17-4a48-8944-3eb4bb640ddf" containerID="d1b7ed0186665eaf4b51010ff23f296cb8672043c98e5557b6cad2f3d8297654" exitCode=0 Feb 19 15:20:56 crc kubenswrapper[4810]: I0219 15:20:56.112514 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" event={"ID":"6ac1d960-0c17-4a48-8944-3eb4bb640ddf","Type":"ContainerDied","Data":"d1b7ed0186665eaf4b51010ff23f296cb8672043c98e5557b6cad2f3d8297654"} Feb 19 15:20:57 crc kubenswrapper[4810]: I0219 15:20:57.123108 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" event={"ID":"6ac1d960-0c17-4a48-8944-3eb4bb640ddf","Type":"ContainerStarted","Data":"acc4fe8a88eec0620e636c35cf7351ef642a53d545ccb338729ff194598def20"} Feb 19 15:20:57 crc kubenswrapper[4810]: I0219 15:20:57.123720 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" event={"ID":"6ac1d960-0c17-4a48-8944-3eb4bb640ddf","Type":"ContainerStarted","Data":"cd15a8f088bd89fc80b17bdfbb326dfc6c27d72a7465ea9239114a0c687f8406"} Feb 19 15:20:57 crc kubenswrapper[4810]: I0219 15:20:57.123735 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" event={"ID":"6ac1d960-0c17-4a48-8944-3eb4bb640ddf","Type":"ContainerStarted","Data":"2006d3c679333961875a1a4fee973d25f7e9e912685b913d369d8756d9b2db6b"} Feb 19 15:20:57 crc kubenswrapper[4810]: I0219 15:20:57.123750 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" event={"ID":"6ac1d960-0c17-4a48-8944-3eb4bb640ddf","Type":"ContainerStarted","Data":"0c3f881dadd35095fbcb646578d8e44a30f4321edc52312452e8b65980ddc196"} Feb 19 15:20:57 crc kubenswrapper[4810]: I0219 15:20:57.123762 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" event={"ID":"6ac1d960-0c17-4a48-8944-3eb4bb640ddf","Type":"ContainerStarted","Data":"32493140e07e5c14b6b8da2f78426c828bdcab6767d59d249ad4c7f8100eac7e"} Feb 19 15:20:57 crc kubenswrapper[4810]: I0219 15:20:57.123773 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" event={"ID":"6ac1d960-0c17-4a48-8944-3eb4bb640ddf","Type":"ContainerStarted","Data":"1aa22282c45b428229c7c21075eb2eb4d7eb514f85c55fbf2f0e7ba9b365f047"} Feb 19 15:20:59 crc kubenswrapper[4810]: I0219 15:20:59.139695 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" event={"ID":"6ac1d960-0c17-4a48-8944-3eb4bb640ddf","Type":"ContainerStarted","Data":"1bc3bee891abfe3c8b65203eac01828dd1fd03d0c253692df3b35cc9b3afcd93"} Feb 19 15:21:02 crc kubenswrapper[4810]: I0219 15:21:02.163796 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" 
event={"ID":"6ac1d960-0c17-4a48-8944-3eb4bb640ddf","Type":"ContainerStarted","Data":"a6c36c4db41254418cf783c07128f03be371060e1445541ae072454a7b283562"} Feb 19 15:21:02 crc kubenswrapper[4810]: I0219 15:21:02.164385 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:21:02 crc kubenswrapper[4810]: I0219 15:21:02.164404 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:21:02 crc kubenswrapper[4810]: I0219 15:21:02.190876 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" podStartSLOduration=8.190846448 podStartE2EDuration="8.190846448s" podCreationTimestamp="2026-02-19 15:20:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:21:02.189243769 +0000 UTC m=+691.671273883" watchObservedRunningTime="2026-02-19 15:21:02.190846448 +0000 UTC m=+691.672876582" Feb 19 15:21:02 crc kubenswrapper[4810]: I0219 15:21:02.192354 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:21:03 crc kubenswrapper[4810]: I0219 15:21:03.168878 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:21:03 crc kubenswrapper[4810]: I0219 15:21:03.228078 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:21:06 crc kubenswrapper[4810]: I0219 15:21:06.439470 4810 scope.go:117] "RemoveContainer" containerID="92e30886e49380d7d876397116f7db4e85388275b36d2f8ee0ab84b9167f3dde" Feb 19 15:21:06 crc kubenswrapper[4810]: E0219 15:21:06.440252 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-bsztz_openshift-multus(2a45a199-beeb-4972-b796-15c958fe99d3)\"" pod="openshift-multus/multus-bsztz" podUID="2a45a199-beeb-4972-b796-15c958fe99d3" Feb 19 15:21:17 crc kubenswrapper[4810]: I0219 15:21:17.439267 4810 scope.go:117] "RemoveContainer" containerID="92e30886e49380d7d876397116f7db4e85388275b36d2f8ee0ab84b9167f3dde" Feb 19 15:21:18 crc kubenswrapper[4810]: I0219 15:21:18.288901 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bsztz_2a45a199-beeb-4972-b796-15c958fe99d3/kube-multus/2.log" Feb 19 15:21:18 crc kubenswrapper[4810]: I0219 15:21:18.290220 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bsztz_2a45a199-beeb-4972-b796-15c958fe99d3/kube-multus/1.log" Feb 19 15:21:18 crc kubenswrapper[4810]: I0219 15:21:18.290379 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bsztz" event={"ID":"2a45a199-beeb-4972-b796-15c958fe99d3","Type":"ContainerStarted","Data":"db0895596172526edd308f9ec4aeaa3e98f20758dd6d805914c4a9b2c94a0568"} Feb 19 15:21:19 crc kubenswrapper[4810]: I0219 15:21:19.538184 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:21:19 crc kubenswrapper[4810]: I0219 
15:21:19.538725 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:21:22 crc kubenswrapper[4810]: I0219 15:21:22.204826 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z"] Feb 19 15:21:22 crc kubenswrapper[4810]: I0219 15:21:22.208838 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z" Feb 19 15:21:22 crc kubenswrapper[4810]: I0219 15:21:22.211837 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 19 15:21:22 crc kubenswrapper[4810]: I0219 15:21:22.223297 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z"] Feb 19 15:21:22 crc kubenswrapper[4810]: I0219 15:21:22.342259 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e54510d9-7d24-47bb-a55e-b50e7cff9fba-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z\" (UID: \"e54510d9-7d24-47bb-a55e-b50e7cff9fba\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z" Feb 19 15:21:22 crc kubenswrapper[4810]: I0219 15:21:22.342971 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e54510d9-7d24-47bb-a55e-b50e7cff9fba-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z\" (UID: \"e54510d9-7d24-47bb-a55e-b50e7cff9fba\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z" Feb 19 15:21:22 crc kubenswrapper[4810]: I0219 15:21:22.343147 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxsss\" (UniqueName: \"kubernetes.io/projected/e54510d9-7d24-47bb-a55e-b50e7cff9fba-kube-api-access-cxsss\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z\" (UID: \"e54510d9-7d24-47bb-a55e-b50e7cff9fba\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z" Feb 19 15:21:22 crc kubenswrapper[4810]: I0219 15:21:22.444825 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e54510d9-7d24-47bb-a55e-b50e7cff9fba-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z\" (UID: \"e54510d9-7d24-47bb-a55e-b50e7cff9fba\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z" Feb 19 15:21:22 crc kubenswrapper[4810]: I0219 15:21:22.444914 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e54510d9-7d24-47bb-a55e-b50e7cff9fba-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z\" (UID: \"e54510d9-7d24-47bb-a55e-b50e7cff9fba\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z" Feb 19 15:21:22 crc 
kubenswrapper[4810]: I0219 15:21:22.445139 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxsss\" (UniqueName: \"kubernetes.io/projected/e54510d9-7d24-47bb-a55e-b50e7cff9fba-kube-api-access-cxsss\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z\" (UID: \"e54510d9-7d24-47bb-a55e-b50e7cff9fba\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z" Feb 19 15:21:22 crc kubenswrapper[4810]: I0219 15:21:22.445661 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e54510d9-7d24-47bb-a55e-b50e7cff9fba-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z\" (UID: \"e54510d9-7d24-47bb-a55e-b50e7cff9fba\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z" Feb 19 15:21:22 crc kubenswrapper[4810]: I0219 15:21:22.445692 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e54510d9-7d24-47bb-a55e-b50e7cff9fba-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z\" (UID: \"e54510d9-7d24-47bb-a55e-b50e7cff9fba\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z" Feb 19 15:21:22 crc kubenswrapper[4810]: I0219 15:21:22.472185 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxsss\" (UniqueName: \"kubernetes.io/projected/e54510d9-7d24-47bb-a55e-b50e7cff9fba-kube-api-access-cxsss\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z\" (UID: \"e54510d9-7d24-47bb-a55e-b50e7cff9fba\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z" Feb 19 15:21:22 crc kubenswrapper[4810]: I0219 15:21:22.534309 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z" Feb 19 15:21:22 crc kubenswrapper[4810]: I0219 15:21:22.832963 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z"] Feb 19 15:21:22 crc kubenswrapper[4810]: W0219 15:21:22.839196 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode54510d9_7d24_47bb_a55e_b50e7cff9fba.slice/crio-4c62b8022bb32b78bafdeb6d825b4518711a45a07ca451f3468bb51c95de1ff1 WatchSource:0}: Error finding container 4c62b8022bb32b78bafdeb6d825b4518711a45a07ca451f3468bb51c95de1ff1: Status 404 returned error can't find the container with id 4c62b8022bb32b78bafdeb6d825b4518711a45a07ca451f3468bb51c95de1ff1 Feb 19 15:21:23 crc kubenswrapper[4810]: I0219 15:21:23.324970 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z" event={"ID":"e54510d9-7d24-47bb-a55e-b50e7cff9fba","Type":"ContainerStarted","Data":"f2754ed0bd782f0cfef1eee1c1af5d1b4775d6b8194b486514baacb9316e5241"} Feb 19 15:21:23 crc kubenswrapper[4810]: I0219 15:21:23.325004 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z" event={"ID":"e54510d9-7d24-47bb-a55e-b50e7cff9fba","Type":"ContainerStarted","Data":"4c62b8022bb32b78bafdeb6d825b4518711a45a07ca451f3468bb51c95de1ff1"} Feb 19 15:21:25 crc kubenswrapper[4810]: I0219 15:21:25.052131 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8zf4d" Feb 19 15:21:25 crc kubenswrapper[4810]: I0219 15:21:25.340574 4810 generic.go:334] "Generic (PLEG): container finished" podID="e54510d9-7d24-47bb-a55e-b50e7cff9fba" containerID="f2754ed0bd782f0cfef1eee1c1af5d1b4775d6b8194b486514baacb9316e5241" exitCode=0 Feb 19 15:21:25 crc kubenswrapper[4810]: I0219 15:21:25.340623 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z" event={"ID":"e54510d9-7d24-47bb-a55e-b50e7cff9fba","Type":"ContainerDied","Data":"f2754ed0bd782f0cfef1eee1c1af5d1b4775d6b8194b486514baacb9316e5241"} Feb 19 15:21:27 crc kubenswrapper[4810]: I0219 15:21:27.356446 4810 generic.go:334] "Generic (PLEG): container finished" podID="e54510d9-7d24-47bb-a55e-b50e7cff9fba" containerID="602e608b65f405d3564f6623182522db988bda7619faf3bb81281274f4fee755" exitCode=0 Feb 19 15:21:27 crc kubenswrapper[4810]: I0219 15:21:27.356559 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z" event={"ID":"e54510d9-7d24-47bb-a55e-b50e7cff9fba","Type":"ContainerDied","Data":"602e608b65f405d3564f6623182522db988bda7619faf3bb81281274f4fee755"} Feb 19 15:21:28 crc kubenswrapper[4810]: I0219 15:21:28.370098 4810 generic.go:334] "Generic (PLEG): container finished" podID="e54510d9-7d24-47bb-a55e-b50e7cff9fba" containerID="62f6b9037cd3effdb72857382a8084eec734578a33ebfe0a5303ab318003d68c" exitCode=0 Feb 19 15:21:28 crc kubenswrapper[4810]: I0219 15:21:28.370405 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z" 
event={"ID":"e54510d9-7d24-47bb-a55e-b50e7cff9fba","Type":"ContainerDied","Data":"62f6b9037cd3effdb72857382a8084eec734578a33ebfe0a5303ab318003d68c"} Feb 19 15:21:29 crc kubenswrapper[4810]: I0219 15:21:29.728908 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z" Feb 19 15:21:29 crc kubenswrapper[4810]: I0219 15:21:29.866119 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxsss\" (UniqueName: \"kubernetes.io/projected/e54510d9-7d24-47bb-a55e-b50e7cff9fba-kube-api-access-cxsss\") pod \"e54510d9-7d24-47bb-a55e-b50e7cff9fba\" (UID: \"e54510d9-7d24-47bb-a55e-b50e7cff9fba\") " Feb 19 15:21:29 crc kubenswrapper[4810]: I0219 15:21:29.866262 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e54510d9-7d24-47bb-a55e-b50e7cff9fba-util\") pod \"e54510d9-7d24-47bb-a55e-b50e7cff9fba\" (UID: \"e54510d9-7d24-47bb-a55e-b50e7cff9fba\") " Feb 19 15:21:29 crc kubenswrapper[4810]: I0219 15:21:29.866306 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e54510d9-7d24-47bb-a55e-b50e7cff9fba-bundle\") pod \"e54510d9-7d24-47bb-a55e-b50e7cff9fba\" (UID: \"e54510d9-7d24-47bb-a55e-b50e7cff9fba\") " Feb 19 15:21:29 crc kubenswrapper[4810]: I0219 15:21:29.870500 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e54510d9-7d24-47bb-a55e-b50e7cff9fba-bundle" (OuterVolumeSpecName: "bundle") pod "e54510d9-7d24-47bb-a55e-b50e7cff9fba" (UID: "e54510d9-7d24-47bb-a55e-b50e7cff9fba"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:21:29 crc kubenswrapper[4810]: I0219 15:21:29.872891 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e54510d9-7d24-47bb-a55e-b50e7cff9fba-kube-api-access-cxsss" (OuterVolumeSpecName: "kube-api-access-cxsss") pod "e54510d9-7d24-47bb-a55e-b50e7cff9fba" (UID: "e54510d9-7d24-47bb-a55e-b50e7cff9fba"). InnerVolumeSpecName "kube-api-access-cxsss". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:21:29 crc kubenswrapper[4810]: I0219 15:21:29.893128 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e54510d9-7d24-47bb-a55e-b50e7cff9fba-util" (OuterVolumeSpecName: "util") pod "e54510d9-7d24-47bb-a55e-b50e7cff9fba" (UID: "e54510d9-7d24-47bb-a55e-b50e7cff9fba"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:21:29 crc kubenswrapper[4810]: I0219 15:21:29.967870 4810 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e54510d9-7d24-47bb-a55e-b50e7cff9fba-util\") on node \"crc\" DevicePath \"\"" Feb 19 15:21:29 crc kubenswrapper[4810]: I0219 15:21:29.967939 4810 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e54510d9-7d24-47bb-a55e-b50e7cff9fba-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:21:29 crc kubenswrapper[4810]: I0219 15:21:29.967966 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxsss\" (UniqueName: \"kubernetes.io/projected/e54510d9-7d24-47bb-a55e-b50e7cff9fba-kube-api-access-cxsss\") on node \"crc\" DevicePath \"\"" Feb 19 15:21:30 crc kubenswrapper[4810]: I0219 15:21:30.388899 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z" event={"ID":"e54510d9-7d24-47bb-a55e-b50e7cff9fba","Type":"ContainerDied","Data":"4c62b8022bb32b78bafdeb6d825b4518711a45a07ca451f3468bb51c95de1ff1"} Feb 19 15:21:30 crc kubenswrapper[4810]: I0219 15:21:30.388971 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c62b8022bb32b78bafdeb6d825b4518711a45a07ca451f3468bb51c95de1ff1" Feb 19 15:21:30 crc kubenswrapper[4810]: I0219 15:21:30.389012 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z" Feb 19 15:21:31 crc kubenswrapper[4810]: I0219 15:21:31.908634 4810 scope.go:117] "RemoveContainer" containerID="c1f575901c6e6174387489f4f3a7d8cfa46089db51556b6acc9245faaf36a9ca" Feb 19 15:21:32 crc kubenswrapper[4810]: I0219 15:21:32.403188 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bsztz_2a45a199-beeb-4972-b796-15c958fe99d3/kube-multus/2.log" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.120046 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-6jkkz"] Feb 19 15:21:39 crc kubenswrapper[4810]: E0219 15:21:39.120386 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e54510d9-7d24-47bb-a55e-b50e7cff9fba" containerName="pull" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.120409 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e54510d9-7d24-47bb-a55e-b50e7cff9fba" containerName="pull" Feb 19 15:21:39 crc kubenswrapper[4810]: E0219 15:21:39.120429 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e54510d9-7d24-47bb-a55e-b50e7cff9fba" containerName="extract" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.120439 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e54510d9-7d24-47bb-a55e-b50e7cff9fba" containerName="extract" Feb 19 15:21:39 crc kubenswrapper[4810]: E0219 15:21:39.120458 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e54510d9-7d24-47bb-a55e-b50e7cff9fba" containerName="util" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.120469 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e54510d9-7d24-47bb-a55e-b50e7cff9fba" containerName="util" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.120609 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e54510d9-7d24-47bb-a55e-b50e7cff9fba" containerName="extract" 
Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.121047 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-6jkkz" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.124374 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.124919 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-hf4j7" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.125418 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.175720 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-6jkkz"] Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.185990 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch6r5\" (UniqueName: \"kubernetes.io/projected/1656f52d-7771-4bbb-9642-b296d16b791e-kube-api-access-ch6r5\") pod \"obo-prometheus-operator-68bc856cb9-6jkkz\" (UID: \"1656f52d-7771-4bbb-9642-b296d16b791e\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-6jkkz" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.257510 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-vrfvr"] Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.258128 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-vrfvr" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.259947 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-chvj7" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.260076 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.273157 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-vrfvr"] Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.284624 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-qpvdt"] Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.285484 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-qpvdt" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.291092 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/408628c0-0b2c-48f9-b849-ee1b124499e1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d6bf9b6-vrfvr\" (UID: \"408628c0-0b2c-48f9-b849-ee1b124499e1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-vrfvr" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.291243 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch6r5\" (UniqueName: \"kubernetes.io/projected/1656f52d-7771-4bbb-9642-b296d16b791e-kube-api-access-ch6r5\") pod \"obo-prometheus-operator-68bc856cb9-6jkkz\" (UID: \"1656f52d-7771-4bbb-9642-b296d16b791e\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-6jkkz" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.291293 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/408628c0-0b2c-48f9-b849-ee1b124499e1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d6bf9b6-vrfvr\" (UID: \"408628c0-0b2c-48f9-b849-ee1b124499e1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-vrfvr" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.305672 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-qpvdt"] Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.340258 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch6r5\" (UniqueName: \"kubernetes.io/projected/1656f52d-7771-4bbb-9642-b296d16b791e-kube-api-access-ch6r5\") pod \"obo-prometheus-operator-68bc856cb9-6jkkz\" (UID: \"1656f52d-7771-4bbb-9642-b296d16b791e\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-6jkkz" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.391952 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/408628c0-0b2c-48f9-b849-ee1b124499e1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d6bf9b6-vrfvr\" (UID: \"408628c0-0b2c-48f9-b849-ee1b124499e1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-vrfvr" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.392007 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d5debcf2-9629-4bb2-9133-f4b81748ff7d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d6bf9b6-qpvdt\" (UID: \"d5debcf2-9629-4bb2-9133-f4b81748ff7d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-qpvdt" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.392042 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d5debcf2-9629-4bb2-9133-f4b81748ff7d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d6bf9b6-qpvdt\" (UID: \"d5debcf2-9629-4bb2-9133-f4b81748ff7d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-qpvdt" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 
15:21:39.392099 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/408628c0-0b2c-48f9-b849-ee1b124499e1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d6bf9b6-vrfvr\" (UID: \"408628c0-0b2c-48f9-b849-ee1b124499e1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-vrfvr" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.404760 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/408628c0-0b2c-48f9-b849-ee1b124499e1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d6bf9b6-vrfvr\" (UID: \"408628c0-0b2c-48f9-b849-ee1b124499e1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-vrfvr" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.411777 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/408628c0-0b2c-48f9-b849-ee1b124499e1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d6bf9b6-vrfvr\" (UID: \"408628c0-0b2c-48f9-b849-ee1b124499e1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-vrfvr" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.436728 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-dk9c4"] Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.437407 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-dk9c4" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.439029 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-6jkkz" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.439848 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.440036 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-zhpbs" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.471847 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-dk9c4"] Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.493023 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfwlv\" (UniqueName: \"kubernetes.io/projected/8bdf030e-92d8-45dc-ab6c-a7b241444677-kube-api-access-nfwlv\") pod \"observability-operator-59bdc8b94-dk9c4\" (UID: \"8bdf030e-92d8-45dc-ab6c-a7b241444677\") " pod="openshift-operators/observability-operator-59bdc8b94-dk9c4" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.493123 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d5debcf2-9629-4bb2-9133-f4b81748ff7d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d6bf9b6-qpvdt\" (UID: \"d5debcf2-9629-4bb2-9133-f4b81748ff7d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-qpvdt" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.493176 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d5debcf2-9629-4bb2-9133-f4b81748ff7d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d6bf9b6-qpvdt\" (UID: \"d5debcf2-9629-4bb2-9133-f4b81748ff7d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-qpvdt" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.493196 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bdf030e-92d8-45dc-ab6c-a7b241444677-observability-operator-tls\") pod \"observability-operator-59bdc8b94-dk9c4\" (UID: \"8bdf030e-92d8-45dc-ab6c-a7b241444677\") " pod="openshift-operators/observability-operator-59bdc8b94-dk9c4" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.496927 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d5debcf2-9629-4bb2-9133-f4b81748ff7d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d6bf9b6-qpvdt\" (UID: \"d5debcf2-9629-4bb2-9133-f4b81748ff7d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-qpvdt" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.497148 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d5debcf2-9629-4bb2-9133-f4b81748ff7d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d6bf9b6-qpvdt\" (UID: \"d5debcf2-9629-4bb2-9133-f4b81748ff7d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-qpvdt" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.574663 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-vrfvr" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.594135 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfwlv\" (UniqueName: \"kubernetes.io/projected/8bdf030e-92d8-45dc-ab6c-a7b241444677-kube-api-access-nfwlv\") pod \"observability-operator-59bdc8b94-dk9c4\" (UID: \"8bdf030e-92d8-45dc-ab6c-a7b241444677\") " pod="openshift-operators/observability-operator-59bdc8b94-dk9c4" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.594622 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bdf030e-92d8-45dc-ab6c-a7b241444677-observability-operator-tls\") pod \"observability-operator-59bdc8b94-dk9c4\" (UID: \"8bdf030e-92d8-45dc-ab6c-a7b241444677\") " pod="openshift-operators/observability-operator-59bdc8b94-dk9c4" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.598959 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bdf030e-92d8-45dc-ab6c-a7b241444677-observability-operator-tls\") pod \"observability-operator-59bdc8b94-dk9c4\" (UID: \"8bdf030e-92d8-45dc-ab6c-a7b241444677\") " pod="openshift-operators/observability-operator-59bdc8b94-dk9c4" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.613396 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-qpvdt" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.615532 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfwlv\" (UniqueName: \"kubernetes.io/projected/8bdf030e-92d8-45dc-ab6c-a7b241444677-kube-api-access-nfwlv\") pod \"observability-operator-59bdc8b94-dk9c4\" (UID: \"8bdf030e-92d8-45dc-ab6c-a7b241444677\") " pod="openshift-operators/observability-operator-59bdc8b94-dk9c4" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.655425 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-2fdxm"] Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.658226 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-2fdxm" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.660353 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-575f4" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.670067 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-2fdxm"] Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.692109 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-6jkkz"] Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.699852 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/c5968625-c396-4ae0-9846-c2ceb6baf655-openshift-service-ca\") pod \"perses-operator-5bf474d74f-2fdxm\" (UID: \"c5968625-c396-4ae0-9846-c2ceb6baf655\") " pod="openshift-operators/perses-operator-5bf474d74f-2fdxm" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.699895 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qj8z\" (UniqueName: \"kubernetes.io/projected/c5968625-c396-4ae0-9846-c2ceb6baf655-kube-api-access-4qj8z\") pod \"perses-operator-5bf474d74f-2fdxm\" (UID: \"c5968625-c396-4ae0-9846-c2ceb6baf655\") " pod="openshift-operators/perses-operator-5bf474d74f-2fdxm" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.753628 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-dk9c4" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.807101 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/c5968625-c396-4ae0-9846-c2ceb6baf655-openshift-service-ca\") pod \"perses-operator-5bf474d74f-2fdxm\" (UID: \"c5968625-c396-4ae0-9846-c2ceb6baf655\") " pod="openshift-operators/perses-operator-5bf474d74f-2fdxm" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.807170 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qj8z\" (UniqueName: \"kubernetes.io/projected/c5968625-c396-4ae0-9846-c2ceb6baf655-kube-api-access-4qj8z\") pod \"perses-operator-5bf474d74f-2fdxm\" (UID: \"c5968625-c396-4ae0-9846-c2ceb6baf655\") " pod="openshift-operators/perses-operator-5bf474d74f-2fdxm" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.810158 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/c5968625-c396-4ae0-9846-c2ceb6baf655-openshift-service-ca\") pod \"perses-operator-5bf474d74f-2fdxm\" (UID: \"c5968625-c396-4ae0-9846-c2ceb6baf655\") " pod="openshift-operators/perses-operator-5bf474d74f-2fdxm" Feb 19 15:21:39 crc kubenswrapper[4810]: I0219 15:21:39.854366 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qj8z\" (UniqueName: \"kubernetes.io/projected/c5968625-c396-4ae0-9846-c2ceb6baf655-kube-api-access-4qj8z\") pod \"perses-operator-5bf474d74f-2fdxm\" (UID: \"c5968625-c396-4ae0-9846-c2ceb6baf655\") " pod="openshift-operators/perses-operator-5bf474d74f-2fdxm" Feb 19 15:21:40 crc kubenswrapper[4810]: I0219 15:21:40.029839 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-2fdxm" Feb 19 15:21:40 crc kubenswrapper[4810]: I0219 15:21:40.066661 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-dk9c4"] Feb 19 15:21:40 crc kubenswrapper[4810]: W0219 15:21:40.088218 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bdf030e_92d8_45dc_ab6c_a7b241444677.slice/crio-48bd0ccfe86f887e37af92da9fce2d1dba7ea8e193d3d9b20e0cfa2118f0e40b WatchSource:0}: Error finding container 48bd0ccfe86f887e37af92da9fce2d1dba7ea8e193d3d9b20e0cfa2118f0e40b: Status 404 returned error can't find the container with id 48bd0ccfe86f887e37af92da9fce2d1dba7ea8e193d3d9b20e0cfa2118f0e40b Feb 19 15:21:40 crc kubenswrapper[4810]: I0219 15:21:40.164644 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-vrfvr"] Feb 19 15:21:40 crc kubenswrapper[4810]: I0219 15:21:40.175064 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-qpvdt"] Feb 19 15:21:40 crc kubenswrapper[4810]: W0219 15:21:40.192686 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5debcf2_9629_4bb2_9133_f4b81748ff7d.slice/crio-2bf939b44566aa5e4a0eeba65b734a5d16a8861ce489e6c630fb34f0d49e62a5 WatchSource:0}: Error finding container 2bf939b44566aa5e4a0eeba65b734a5d16a8861ce489e6c630fb34f0d49e62a5: Status 404 returned error can't find the container with id 2bf939b44566aa5e4a0eeba65b734a5d16a8861ce489e6c630fb34f0d49e62a5 Feb 19 15:21:40 crc kubenswrapper[4810]: I0219 15:21:40.240319 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-2fdxm"] Feb 19 15:21:40 crc kubenswrapper[4810]: I0219 15:21:40.458750 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-dk9c4" event={"ID":"8bdf030e-92d8-45dc-ab6c-a7b241444677","Type":"ContainerStarted","Data":"48bd0ccfe86f887e37af92da9fce2d1dba7ea8e193d3d9b20e0cfa2118f0e40b"} Feb 19 15:21:40 crc kubenswrapper[4810]: I0219 15:21:40.459932 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-qpvdt" event={"ID":"d5debcf2-9629-4bb2-9133-f4b81748ff7d","Type":"ContainerStarted","Data":"2bf939b44566aa5e4a0eeba65b734a5d16a8861ce489e6c630fb34f0d49e62a5"} Feb 19 15:21:40 crc kubenswrapper[4810]: I0219 15:21:40.461896 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-6jkkz" event={"ID":"1656f52d-7771-4bbb-9642-b296d16b791e","Type":"ContainerStarted","Data":"df30e505ce981ea33f8fe527d2b7f8540bf4e32963710f4f8a4e3352e1d50c54"} Feb 19 15:21:40 crc kubenswrapper[4810]: I0219 15:21:40.463390 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-2fdxm" event={"ID":"c5968625-c396-4ae0-9846-c2ceb6baf655","Type":"ContainerStarted","Data":"5e2fc5928a3aa091ddb850b2d963a57cd85a0a72b462a1e672fec6d1ec5add72"} Feb 19 15:21:40 crc kubenswrapper[4810]: I0219 15:21:40.464739 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-vrfvr" 
event={"ID":"408628c0-0b2c-48f9-b849-ee1b124499e1","Type":"ContainerStarted","Data":"443fc0f49ea2c02e6953693b08c3e8548561e2b0b271706a0a6a1d71103ec0e7"} Feb 19 15:21:49 crc kubenswrapper[4810]: I0219 15:21:49.537035 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:21:49 crc kubenswrapper[4810]: I0219 15:21:49.537364 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:21:50 crc kubenswrapper[4810]: I0219 15:21:50.541621 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-2fdxm" event={"ID":"c5968625-c396-4ae0-9846-c2ceb6baf655","Type":"ContainerStarted","Data":"e7dbe5fab8e78e629119fbda79854b2e4fa55be862c6926e2b039cf5c8814234"} Feb 19 15:21:50 crc kubenswrapper[4810]: I0219 15:21:50.541884 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-2fdxm" Feb 19 15:21:50 crc kubenswrapper[4810]: I0219 15:21:50.543077 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-vrfvr" event={"ID":"408628c0-0b2c-48f9-b849-ee1b124499e1","Type":"ContainerStarted","Data":"590b9446c65c57df99bac890da7a85c8b41483862b9f3afe4e2d065c52146bd0"} Feb 19 15:21:50 crc kubenswrapper[4810]: I0219 15:21:50.544368 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-dk9c4" event={"ID":"8bdf030e-92d8-45dc-ab6c-a7b241444677","Type":"ContainerStarted","Data":"a3a4520aad2042a32553e90a58bd4aadcc4654046a05f16d4c1b60ffd59ab716"} Feb 19 15:21:50 crc kubenswrapper[4810]: I0219 15:21:50.544581 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-dk9c4" Feb 19 15:21:50 crc kubenswrapper[4810]: I0219 15:21:50.545650 4810 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-dk9c4 container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.35:8081/healthz\": dial tcp 10.217.0.35:8081: connect: connection refused" start-of-body= Feb 19 15:21:50 crc kubenswrapper[4810]: I0219 15:21:50.545686 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-dk9c4" podUID="8bdf030e-92d8-45dc-ab6c-a7b241444677" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.35:8081/healthz\": dial tcp 10.217.0.35:8081: connect: connection refused" Feb 19 15:21:50 crc kubenswrapper[4810]: I0219 15:21:50.545701 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-qpvdt" event={"ID":"d5debcf2-9629-4bb2-9133-f4b81748ff7d","Type":"ContainerStarted","Data":"b18e65b91e8f057e3c3b6ae93bd4918488d4e8b38a80531b1e6a27366a4cc66c"} Feb 19 15:21:50 crc kubenswrapper[4810]: I0219 15:21:50.547143 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-6jkkz" event={"ID":"1656f52d-7771-4bbb-9642-b296d16b791e","Type":"ContainerStarted","Data":"50d98e73af9efeaa986f94b0e2869529dba239ae8a9d8832f6c0d197a582c3de"} Feb 19 15:21:50 crc kubenswrapper[4810]: I0219 15:21:50.561250 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-2fdxm" podStartSLOduration=1.652078628 podStartE2EDuration="11.561230833s" podCreationTimestamp="2026-02-19 15:21:39 +0000 UTC" firstStartedPulling="2026-02-19 15:21:40.253567842 +0000 UTC m=+729.735597966" lastFinishedPulling="2026-02-19 15:21:50.162720047 +0000 UTC m=+739.644750171" observedRunningTime="2026-02-19 15:21:50.55747254 +0000 UTC m=+740.039502664" watchObservedRunningTime="2026-02-19 15:21:50.561230833 +0000 UTC m=+740.043260957" Feb 19 15:21:50 crc kubenswrapper[4810]: I0219 15:21:50.588318 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-qpvdt" podStartSLOduration=1.621156424 podStartE2EDuration="11.588302782s" podCreationTimestamp="2026-02-19 15:21:39 +0000 UTC" firstStartedPulling="2026-02-19 15:21:40.195244071 +0000 UTC m=+729.677274195" lastFinishedPulling="2026-02-19 15:21:50.162390429 +0000 UTC m=+739.644420553" observedRunningTime="2026-02-19 15:21:50.586836485 +0000 UTC m=+740.068866609" watchObservedRunningTime="2026-02-19 15:21:50.588302782 +0000 UTC m=+740.070332896" Feb 19 15:21:50 crc kubenswrapper[4810]: I0219 15:21:50.618410 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-6jkkz" podStartSLOduration=1.188823873 podStartE2EDuration="11.618391795s" podCreationTimestamp="2026-02-19 15:21:39 +0000 UTC" firstStartedPulling="2026-02-19 15:21:39.732662023 +0000 UTC m=+729.214692147" lastFinishedPulling="2026-02-19 15:21:50.162229925 +0000 UTC m=+739.644260069" observedRunningTime="2026-02-19 15:21:50.61416077 +0000 UTC m=+740.096190894" watchObservedRunningTime="2026-02-19 15:21:50.618391795 +0000 UTC m=+740.100421919" Feb 19 15:21:50 crc kubenswrapper[4810]: I0219 15:21:50.645868 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-dk9c4" podStartSLOduration=1.534729938 podStartE2EDuration="11.645852973s" podCreationTimestamp="2026-02-19 15:21:39 +0000 UTC" firstStartedPulling="2026-02-19 15:21:40.090989866 +0000 UTC m=+729.573019990" lastFinishedPulling="2026-02-19 15:21:50.202112901 +0000 UTC m=+739.684143025" observedRunningTime="2026-02-19 15:21:50.643140776 +0000 UTC m=+740.125170900" watchObservedRunningTime="2026-02-19 15:21:50.645852973 +0000 UTC m=+740.127883097" Feb 19 15:21:50 crc kubenswrapper[4810]: I0219 15:21:50.668824 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d6bf9b6-vrfvr" podStartSLOduration=1.6254323400000001 podStartE2EDuration="11.668807201s" podCreationTimestamp="2026-02-19 15:21:39 +0000 UTC" firstStartedPulling="2026-02-19 15:21:40.179425241 +0000 UTC m=+729.661455365" lastFinishedPulling="2026-02-19 15:21:50.222800102 +0000 UTC m=+739.704830226" observedRunningTime="2026-02-19 15:21:50.66758223 +0000 UTC m=+740.149612374" watchObservedRunningTime="2026-02-19 15:21:50.668807201 +0000 UTC m=+740.150837325" Feb 19 15:21:51 crc kubenswrapper[4810]: I0219 15:21:51.582876 4810 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-dk9c4" Feb 19 15:22:00 crc kubenswrapper[4810]: I0219 15:22:00.032355 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-2fdxm" Feb 19 15:22:06 crc kubenswrapper[4810]: I0219 15:22:06.692263 4810 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 19 15:22:15 crc kubenswrapper[4810]: I0219 15:22:15.998350 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578"] Feb 19 15:22:16 crc kubenswrapper[4810]: I0219 15:22:16.000790 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578" Feb 19 15:22:16 crc kubenswrapper[4810]: I0219 15:22:16.004469 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578"] Feb 19 15:22:16 crc kubenswrapper[4810]: I0219 15:22:16.006683 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 19 15:22:16 crc kubenswrapper[4810]: I0219 15:22:16.167500 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/815191f4-9d3a-4003-a32f-de4f76c9c15f-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578\" (UID: \"815191f4-9d3a-4003-a32f-de4f76c9c15f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578" Feb 19 15:22:16 crc kubenswrapper[4810]: I0219 15:22:16.167566 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/815191f4-9d3a-4003-a32f-de4f76c9c15f-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578\" (UID: \"815191f4-9d3a-4003-a32f-de4f76c9c15f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578" Feb 19 15:22:16 crc kubenswrapper[4810]: I0219 15:22:16.167661 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd6ks\" (UniqueName: \"kubernetes.io/projected/815191f4-9d3a-4003-a32f-de4f76c9c15f-kube-api-access-xd6ks\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578\" (UID: \"815191f4-9d3a-4003-a32f-de4f76c9c15f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578" Feb 19 15:22:16 crc kubenswrapper[4810]: I0219 15:22:16.268439 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/815191f4-9d3a-4003-a32f-de4f76c9c15f-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578\" (UID: \"815191f4-9d3a-4003-a32f-de4f76c9c15f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578" Feb 19 15:22:16 crc kubenswrapper[4810]: I0219 15:22:16.268494 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/815191f4-9d3a-4003-a32f-de4f76c9c15f-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578\" (UID: 
\"815191f4-9d3a-4003-a32f-de4f76c9c15f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578" Feb 19 15:22:16 crc kubenswrapper[4810]: I0219 15:22:16.268568 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd6ks\" (UniqueName: \"kubernetes.io/projected/815191f4-9d3a-4003-a32f-de4f76c9c15f-kube-api-access-xd6ks\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578\" (UID: \"815191f4-9d3a-4003-a32f-de4f76c9c15f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578" Feb 19 15:22:16 crc kubenswrapper[4810]: I0219 15:22:16.269255 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/815191f4-9d3a-4003-a32f-de4f76c9c15f-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578\" (UID: \"815191f4-9d3a-4003-a32f-de4f76c9c15f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578" Feb 19 15:22:16 crc kubenswrapper[4810]: I0219 15:22:16.269401 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/815191f4-9d3a-4003-a32f-de4f76c9c15f-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578\" (UID: \"815191f4-9d3a-4003-a32f-de4f76c9c15f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578" Feb 19 15:22:16 crc kubenswrapper[4810]: I0219 15:22:16.293880 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd6ks\" (UniqueName: \"kubernetes.io/projected/815191f4-9d3a-4003-a32f-de4f76c9c15f-kube-api-access-xd6ks\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578\" (UID: \"815191f4-9d3a-4003-a32f-de4f76c9c15f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578" Feb 19 15:22:16 crc kubenswrapper[4810]: I0219 15:22:16.321053 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578" Feb 19 15:22:16 crc kubenswrapper[4810]: I0219 15:22:16.801006 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578"] Feb 19 15:22:17 crc kubenswrapper[4810]: I0219 15:22:17.714447 4810 generic.go:334] "Generic (PLEG): container finished" podID="815191f4-9d3a-4003-a32f-de4f76c9c15f" containerID="a640c129b436c481969955036241e624b9a8fdc35ec10a9e96ec135a65005697" exitCode=0 Feb 19 15:22:17 crc kubenswrapper[4810]: I0219 15:22:17.714496 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578" event={"ID":"815191f4-9d3a-4003-a32f-de4f76c9c15f","Type":"ContainerDied","Data":"a640c129b436c481969955036241e624b9a8fdc35ec10a9e96ec135a65005697"} Feb 19 15:22:17 crc kubenswrapper[4810]: I0219 15:22:17.714527 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578" event={"ID":"815191f4-9d3a-4003-a32f-de4f76c9c15f","Type":"ContainerStarted","Data":"1f4ec315c7d6f364ebdb7e18d891ff326300f300ee5ca683540cb86943d9a50a"} Feb 19 15:22:18 crc kubenswrapper[4810]: I0219 15:22:18.369691 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2mft7"] Feb 19 15:22:18 crc kubenswrapper[4810]: I0219 15:22:18.372238 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2mft7" Feb 19 15:22:18 crc kubenswrapper[4810]: I0219 15:22:18.379230 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2mft7"] Feb 19 15:22:18 crc kubenswrapper[4810]: I0219 15:22:18.528240 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/132beac7-bb25-4b52-863e-0a113dc6799b-catalog-content\") pod \"redhat-operators-2mft7\" (UID: \"132beac7-bb25-4b52-863e-0a113dc6799b\") " pod="openshift-marketplace/redhat-operators-2mft7" Feb 19 15:22:18 crc kubenswrapper[4810]: I0219 15:22:18.528386 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/132beac7-bb25-4b52-863e-0a113dc6799b-utilities\") pod \"redhat-operators-2mft7\" (UID: \"132beac7-bb25-4b52-863e-0a113dc6799b\") " pod="openshift-marketplace/redhat-operators-2mft7" Feb 19 15:22:18 crc kubenswrapper[4810]: I0219 15:22:18.528633 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbj82\" (UniqueName: \"kubernetes.io/projected/132beac7-bb25-4b52-863e-0a113dc6799b-kube-api-access-wbj82\") pod \"redhat-operators-2mft7\" (UID: \"132beac7-bb25-4b52-863e-0a113dc6799b\") " pod="openshift-marketplace/redhat-operators-2mft7" Feb 19 15:22:18 crc kubenswrapper[4810]: I0219 15:22:18.630401 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbj82\" (UniqueName: \"kubernetes.io/projected/132beac7-bb25-4b52-863e-0a113dc6799b-kube-api-access-wbj82\") pod \"redhat-operators-2mft7\" (UID: \"132beac7-bb25-4b52-863e-0a113dc6799b\") " pod="openshift-marketplace/redhat-operators-2mft7" Feb 19 15:22:18 crc kubenswrapper[4810]: I0219 15:22:18.630524 4810 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/132beac7-bb25-4b52-863e-0a113dc6799b-catalog-content\") pod \"redhat-operators-2mft7\" (UID: \"132beac7-bb25-4b52-863e-0a113dc6799b\") " pod="openshift-marketplace/redhat-operators-2mft7" Feb 19 15:22:18 crc kubenswrapper[4810]: I0219 15:22:18.630550 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/132beac7-bb25-4b52-863e-0a113dc6799b-utilities\") pod \"redhat-operators-2mft7\" (UID: \"132beac7-bb25-4b52-863e-0a113dc6799b\") " pod="openshift-marketplace/redhat-operators-2mft7" Feb 19 15:22:18 crc kubenswrapper[4810]: I0219 15:22:18.631249 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/132beac7-bb25-4b52-863e-0a113dc6799b-utilities\") pod \"redhat-operators-2mft7\" (UID: \"132beac7-bb25-4b52-863e-0a113dc6799b\") " pod="openshift-marketplace/redhat-operators-2mft7" Feb 19 15:22:18 crc kubenswrapper[4810]: I0219 15:22:18.631386 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/132beac7-bb25-4b52-863e-0a113dc6799b-catalog-content\") pod \"redhat-operators-2mft7\" (UID: \"132beac7-bb25-4b52-863e-0a113dc6799b\") " pod="openshift-marketplace/redhat-operators-2mft7" Feb 19 15:22:18 crc kubenswrapper[4810]: I0219 15:22:18.666955 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbj82\" (UniqueName: \"kubernetes.io/projected/132beac7-bb25-4b52-863e-0a113dc6799b-kube-api-access-wbj82\") pod \"redhat-operators-2mft7\" (UID: \"132beac7-bb25-4b52-863e-0a113dc6799b\") " pod="openshift-marketplace/redhat-operators-2mft7" Feb 19 15:22:18 crc kubenswrapper[4810]: I0219 15:22:18.742281 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2mft7" Feb 19 15:22:18 crc kubenswrapper[4810]: I0219 15:22:18.968774 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2mft7"] Feb 19 15:22:18 crc kubenswrapper[4810]: W0219 15:22:18.975695 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod132beac7_bb25_4b52_863e_0a113dc6799b.slice/crio-13841f7eec8b89d2fb39be72b7796fc00732025d2bc4c95c5e8d5b290f1c7821 WatchSource:0}: Error finding container 13841f7eec8b89d2fb39be72b7796fc00732025d2bc4c95c5e8d5b290f1c7821: Status 404 returned error can't find the container with id 13841f7eec8b89d2fb39be72b7796fc00732025d2bc4c95c5e8d5b290f1c7821 Feb 19 15:22:19 crc kubenswrapper[4810]: I0219 15:22:19.538927 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:22:19 crc kubenswrapper[4810]: I0219 15:22:19.539021 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:22:19 crc kubenswrapper[4810]: I0219 15:22:19.539095 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 15:22:19 crc kubenswrapper[4810]: I0219 15:22:19.540061 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"946b41284ed03248aebd830c7fb80426be59078e4ea2a93cd09930514fedec98"} pod="openshift-machine-config-operator/machine-config-daemon-t499d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 15:22:19 crc kubenswrapper[4810]: I0219 15:22:19.540181 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" containerID="cri-o://946b41284ed03248aebd830c7fb80426be59078e4ea2a93cd09930514fedec98" gracePeriod=600 Feb 19 15:22:19 crc kubenswrapper[4810]: I0219 15:22:19.727197 4810 generic.go:334] "Generic (PLEG): container finished" podID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerID="946b41284ed03248aebd830c7fb80426be59078e4ea2a93cd09930514fedec98" exitCode=0 Feb 19 15:22:19 crc kubenswrapper[4810]: I0219 15:22:19.727268 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerDied","Data":"946b41284ed03248aebd830c7fb80426be59078e4ea2a93cd09930514fedec98"} Feb 19 15:22:19 crc kubenswrapper[4810]: I0219 15:22:19.727316 4810 scope.go:117] "RemoveContainer" containerID="b445a122b966a7403b8df4638cf97036239996a24e8ace1fab9b55e591849bf5" Feb 19 15:22:19 crc kubenswrapper[4810]: I0219 15:22:19.729248 4810 generic.go:334] "Generic (PLEG): container finished" podID="815191f4-9d3a-4003-a32f-de4f76c9c15f" 
containerID="28fd19fef824811305dd55bf91f97cdcd9584e932d6d41db8a7f50154ca5ad1b" exitCode=0 Feb 19 15:22:19 crc kubenswrapper[4810]: I0219 15:22:19.729361 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578" event={"ID":"815191f4-9d3a-4003-a32f-de4f76c9c15f","Type":"ContainerDied","Data":"28fd19fef824811305dd55bf91f97cdcd9584e932d6d41db8a7f50154ca5ad1b"} Feb 19 15:22:19 crc kubenswrapper[4810]: I0219 15:22:19.731261 4810 generic.go:334] "Generic (PLEG): container finished" podID="132beac7-bb25-4b52-863e-0a113dc6799b" containerID="b5538334e24ef6c83330097d1f5b71ee7b748811fc18ccd063319fa24e2faa21" exitCode=0 Feb 19 15:22:19 crc kubenswrapper[4810]: I0219 15:22:19.731306 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mft7" event={"ID":"132beac7-bb25-4b52-863e-0a113dc6799b","Type":"ContainerDied","Data":"b5538334e24ef6c83330097d1f5b71ee7b748811fc18ccd063319fa24e2faa21"} Feb 19 15:22:19 crc kubenswrapper[4810]: I0219 15:22:19.731352 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mft7" event={"ID":"132beac7-bb25-4b52-863e-0a113dc6799b","Type":"ContainerStarted","Data":"13841f7eec8b89d2fb39be72b7796fc00732025d2bc4c95c5e8d5b290f1c7821"} Feb 19 15:22:22 crc kubenswrapper[4810]: I0219 15:22:22.755400 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerStarted","Data":"6c88e0127771a4aa28c6261d9a83da29a3f930023146271a9d942e738f8152ff"} Feb 19 15:22:22 crc kubenswrapper[4810]: I0219 15:22:22.759732 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578" event={"ID":"815191f4-9d3a-4003-a32f-de4f76c9c15f","Type":"ContainerStarted","Data":"bb0e4b414f182e075cb0d5b61693af2b1627fe337800fe9cf5cfd83cacbbe05c"} Feb 19 15:22:22 crc kubenswrapper[4810]: I0219 15:22:22.802033 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578" podStartSLOduration=6.621601321 podStartE2EDuration="7.802002583s" podCreationTimestamp="2026-02-19 15:22:15 +0000 UTC" firstStartedPulling="2026-02-19 15:22:17.716615348 +0000 UTC m=+767.198645472" lastFinishedPulling="2026-02-19 15:22:18.89701662 +0000 UTC m=+768.379046734" observedRunningTime="2026-02-19 15:22:22.79621889 +0000 UTC m=+772.278249054" watchObservedRunningTime="2026-02-19 15:22:22.802002583 +0000 UTC m=+772.284032747" Feb 19 15:22:23 crc kubenswrapper[4810]: I0219 15:22:23.772084 4810 generic.go:334] "Generic (PLEG): container finished" podID="815191f4-9d3a-4003-a32f-de4f76c9c15f" containerID="bb0e4b414f182e075cb0d5b61693af2b1627fe337800fe9cf5cfd83cacbbe05c" exitCode=0 Feb 19 15:22:23 crc kubenswrapper[4810]: I0219 15:22:23.772217 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578" event={"ID":"815191f4-9d3a-4003-a32f-de4f76c9c15f","Type":"ContainerDied","Data":"bb0e4b414f182e075cb0d5b61693af2b1627fe337800fe9cf5cfd83cacbbe05c"} Feb 19 15:22:23 crc kubenswrapper[4810]: I0219 15:22:23.775832 4810 generic.go:334] "Generic (PLEG): container finished" podID="132beac7-bb25-4b52-863e-0a113dc6799b" 
containerID="585ee91571796fba78ba8e4f41233d19b09ebef57aa467f35d92b53d4fb2aa16" exitCode=0 Feb 19 15:22:23 crc kubenswrapper[4810]: I0219 15:22:23.775898 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mft7" event={"ID":"132beac7-bb25-4b52-863e-0a113dc6799b","Type":"ContainerDied","Data":"585ee91571796fba78ba8e4f41233d19b09ebef57aa467f35d92b53d4fb2aa16"} Feb 19 15:22:24 crc kubenswrapper[4810]: I0219 15:22:24.785836 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mft7" event={"ID":"132beac7-bb25-4b52-863e-0a113dc6799b","Type":"ContainerStarted","Data":"50094d034a84c3ff10e55fa2dc439e3f2fd43638ff98866ab58aa48ac171229c"} Feb 19 15:22:24 crc kubenswrapper[4810]: I0219 15:22:24.812932 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2mft7" podStartSLOduration=2.362759411 podStartE2EDuration="6.812910202s" podCreationTimestamp="2026-02-19 15:22:18 +0000 UTC" firstStartedPulling="2026-02-19 15:22:19.733231059 +0000 UTC m=+769.215261183" lastFinishedPulling="2026-02-19 15:22:24.18338185 +0000 UTC m=+773.665411974" observedRunningTime="2026-02-19 15:22:24.811072756 +0000 UTC m=+774.293102920" watchObservedRunningTime="2026-02-19 15:22:24.812910202 +0000 UTC m=+774.294940346" Feb 19 15:22:25 crc kubenswrapper[4810]: I0219 15:22:25.107262 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578" Feb 19 15:22:25 crc kubenswrapper[4810]: I0219 15:22:25.224213 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd6ks\" (UniqueName: \"kubernetes.io/projected/815191f4-9d3a-4003-a32f-de4f76c9c15f-kube-api-access-xd6ks\") pod \"815191f4-9d3a-4003-a32f-de4f76c9c15f\" (UID: \"815191f4-9d3a-4003-a32f-de4f76c9c15f\") " Feb 19 15:22:25 crc kubenswrapper[4810]: I0219 15:22:25.224323 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/815191f4-9d3a-4003-a32f-de4f76c9c15f-bundle\") pod \"815191f4-9d3a-4003-a32f-de4f76c9c15f\" (UID: \"815191f4-9d3a-4003-a32f-de4f76c9c15f\") " Feb 19 15:22:25 crc kubenswrapper[4810]: I0219 15:22:25.224812 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/815191f4-9d3a-4003-a32f-de4f76c9c15f-bundle" (OuterVolumeSpecName: "bundle") pod "815191f4-9d3a-4003-a32f-de4f76c9c15f" (UID: "815191f4-9d3a-4003-a32f-de4f76c9c15f"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:22:25 crc kubenswrapper[4810]: I0219 15:22:25.225219 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/815191f4-9d3a-4003-a32f-de4f76c9c15f-util\") pod \"815191f4-9d3a-4003-a32f-de4f76c9c15f\" (UID: \"815191f4-9d3a-4003-a32f-de4f76c9c15f\") " Feb 19 15:22:25 crc kubenswrapper[4810]: I0219 15:22:25.225620 4810 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/815191f4-9d3a-4003-a32f-de4f76c9c15f-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:22:25 crc kubenswrapper[4810]: I0219 15:22:25.234686 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/815191f4-9d3a-4003-a32f-de4f76c9c15f-kube-api-access-xd6ks" (OuterVolumeSpecName: "kube-api-access-xd6ks") pod "815191f4-9d3a-4003-a32f-de4f76c9c15f" (UID: "815191f4-9d3a-4003-a32f-de4f76c9c15f"). InnerVolumeSpecName "kube-api-access-xd6ks". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:22:25 crc kubenswrapper[4810]: I0219 15:22:25.235645 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/815191f4-9d3a-4003-a32f-de4f76c9c15f-util" (OuterVolumeSpecName: "util") pod "815191f4-9d3a-4003-a32f-de4f76c9c15f" (UID: "815191f4-9d3a-4003-a32f-de4f76c9c15f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:22:25 crc kubenswrapper[4810]: I0219 15:22:25.326876 4810 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/815191f4-9d3a-4003-a32f-de4f76c9c15f-util\") on node \"crc\" DevicePath \"\"" Feb 19 15:22:25 crc kubenswrapper[4810]: I0219 15:22:25.326929 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd6ks\" (UniqueName: \"kubernetes.io/projected/815191f4-9d3a-4003-a32f-de4f76c9c15f-kube-api-access-xd6ks\") on node \"crc\" DevicePath \"\"" Feb 19 15:22:25 crc kubenswrapper[4810]: I0219 15:22:25.793232 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578" event={"ID":"815191f4-9d3a-4003-a32f-de4f76c9c15f","Type":"ContainerDied","Data":"1f4ec315c7d6f364ebdb7e18d891ff326300f300ee5ca683540cb86943d9a50a"} Feb 19 15:22:25 crc kubenswrapper[4810]: I0219 15:22:25.793272 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578" Feb 19 15:22:25 crc kubenswrapper[4810]: I0219 15:22:25.793277 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f4ec315c7d6f364ebdb7e18d891ff326300f300ee5ca683540cb86943d9a50a" Feb 19 15:22:26 crc kubenswrapper[4810]: I0219 15:22:26.748265 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-p4hwg"] Feb 19 15:22:26 crc kubenswrapper[4810]: E0219 15:22:26.748925 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815191f4-9d3a-4003-a32f-de4f76c9c15f" containerName="pull" Feb 19 15:22:26 crc kubenswrapper[4810]: I0219 15:22:26.748941 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="815191f4-9d3a-4003-a32f-de4f76c9c15f" containerName="pull" Feb 19 15:22:26 crc kubenswrapper[4810]: E0219 15:22:26.748956 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815191f4-9d3a-4003-a32f-de4f76c9c15f" containerName="extract" Feb 19 15:22:26 crc kubenswrapper[4810]: I0219 15:22:26.748963 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="815191f4-9d3a-4003-a32f-de4f76c9c15f" containerName="extract" Feb 19 15:22:26 crc kubenswrapper[4810]: E0219 15:22:26.748980 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815191f4-9d3a-4003-a32f-de4f76c9c15f" containerName="util" Feb 19 15:22:26 crc kubenswrapper[4810]: I0219 15:22:26.748988 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="815191f4-9d3a-4003-a32f-de4f76c9c15f" containerName="util" Feb 19 15:22:26 crc kubenswrapper[4810]: I0219 15:22:26.749109 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="815191f4-9d3a-4003-a32f-de4f76c9c15f" containerName="extract" Feb 19 15:22:26 crc kubenswrapper[4810]: I0219 15:22:26.759629 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-p4hwg" Feb 19 15:22:26 crc kubenswrapper[4810]: I0219 15:22:26.763621 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 19 15:22:26 crc kubenswrapper[4810]: I0219 15:22:26.763894 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 19 15:22:26 crc kubenswrapper[4810]: I0219 15:22:26.763961 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-vhxj2" Feb 19 15:22:26 crc kubenswrapper[4810]: I0219 15:22:26.779477 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-p4hwg"] Feb 19 15:22:26 crc kubenswrapper[4810]: I0219 15:22:26.845173 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s8s4\" (UniqueName: \"kubernetes.io/projected/f8300a06-7526-4da5-89a6-7fff8ff284c9-kube-api-access-5s8s4\") pod \"nmstate-operator-694c9596b7-p4hwg\" (UID: \"f8300a06-7526-4da5-89a6-7fff8ff284c9\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-p4hwg" Feb 19 15:22:26 crc kubenswrapper[4810]: I0219 15:22:26.945887 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s8s4\" (UniqueName: \"kubernetes.io/projected/f8300a06-7526-4da5-89a6-7fff8ff284c9-kube-api-access-5s8s4\") pod \"nmstate-operator-694c9596b7-p4hwg\" (UID: \"f8300a06-7526-4da5-89a6-7fff8ff284c9\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-p4hwg" Feb 19 15:22:26 crc kubenswrapper[4810]: I0219 15:22:26.968889 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s8s4\" (UniqueName: \"kubernetes.io/projected/f8300a06-7526-4da5-89a6-7fff8ff284c9-kube-api-access-5s8s4\") pod \"nmstate-operator-694c9596b7-p4hwg\" (UID: \"f8300a06-7526-4da5-89a6-7fff8ff284c9\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-p4hwg" Feb 19 15:22:27 crc kubenswrapper[4810]: I0219 15:22:27.077359 4810 util.go:30] "No sandbox for pod can be found. 
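# Editor's note (Python sketch, illustrative): a schematic model of the
# RemoveStaleState pattern visible above -- once a pod is gone, each resource
# manager drops any per-container assignments it still holds. This mirrors
# only the log's shape; the real kubelet cpu/memory managers are Go and far
# richer than this toy.
def remove_stale_state(pod_uid, containers, cpu_assignments, memory_state):
    for name in containers:
        key = (pod_uid, name)
        if key in cpu_assignments:
            print(f'RemoveStaleState: removing container podUID="{pod_uid}" containerName="{name}"')
            del cpu_assignments[key]        # "Deleted CPUSet assignment"
        memory_state.pop(key, None)         # "RemoveStaleState removing state"

uid = "815191f4-9d3a-4003-a32f-de4f76c9c15f"
cpu = {(uid, c): "0-3" for c in ("pull", "extract", "util")}
remove_stale_state(uid, ["pull", "extract", "util"], cpu, {})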
Feb 19 15:22:27 crc kubenswrapper[4810]: I0219 15:22:27.504073 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-p4hwg"]
Feb 19 15:22:27 crc kubenswrapper[4810]: W0219 15:22:27.506403 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8300a06_7526_4da5_89a6_7fff8ff284c9.slice/crio-2067d8a741f7f2a527f960baa170366ccceafe1a7e87baec91b176022f78a46a WatchSource:0}: Error finding container 2067d8a741f7f2a527f960baa170366ccceafe1a7e87baec91b176022f78a46a: Status 404 returned error can't find the container with id 2067d8a741f7f2a527f960baa170366ccceafe1a7e87baec91b176022f78a46a
Feb 19 15:22:27 crc kubenswrapper[4810]: I0219 15:22:27.805805 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-p4hwg" event={"ID":"f8300a06-7526-4da5-89a6-7fff8ff284c9","Type":"ContainerStarted","Data":"2067d8a741f7f2a527f960baa170366ccceafe1a7e87baec91b176022f78a46a"}
Feb 19 15:22:28 crc kubenswrapper[4810]: I0219 15:22:28.743282 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2mft7"
Feb 19 15:22:28 crc kubenswrapper[4810]: I0219 15:22:28.743656 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2mft7"
Feb 19 15:22:29 crc kubenswrapper[4810]: I0219 15:22:29.785257 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2mft7" podUID="132beac7-bb25-4b52-863e-0a113dc6799b" containerName="registry-server" probeResult="failure" output=<
Feb 19 15:22:29 crc kubenswrapper[4810]: timeout: failed to connect service ":50051" within 1s
Feb 19 15:22:29 crc kubenswrapper[4810]: >
Feb 19 15:22:30 crc kubenswrapper[4810]: I0219 15:22:30.824252 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-p4hwg" event={"ID":"f8300a06-7526-4da5-89a6-7fff8ff284c9","Type":"ContainerStarted","Data":"831f71f6a9fb06cf0c9a8008a94f0058b5d372ffaa13ac7aedf8cc1032fa923d"}
Feb 19 15:22:30 crc kubenswrapper[4810]: I0219 15:22:30.851555 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-p4hwg" podStartSLOduration=2.157005727 podStartE2EDuration="4.851532755s" podCreationTimestamp="2026-02-19 15:22:26 +0000 UTC" firstStartedPulling="2026-02-19 15:22:27.508912617 +0000 UTC m=+776.990942741" lastFinishedPulling="2026-02-19 15:22:30.203439605 +0000 UTC m=+779.685469769" observedRunningTime="2026-02-19 15:22:30.845835974 +0000 UTC m=+780.327866108" watchObservedRunningTime="2026-02-19 15:22:30.851532755 +0000 UTC m=+780.333562889"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.301690 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-4g952"]
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.302897 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-4g952"
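# Editor's note (Python sketch, illustrative): the startup-probe failure above
# says the registry's port could not be reached within 1s. The snippet below
# approximates that check as a bare TCP connect with a 1-second timeout; the
# real probe is a gRPC health check against :50051, so this stands in only for
# the connectivity half of it.
import socket

def probe_port(host="127.0.0.1", port=50051, timeout=1.0):
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return "success"
    except OSError:
        return f'failure: timeout: failed to connect service ":{port}" within {timeout:g}s'

print(probe_port())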
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.309435 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-mcdnc"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.326164 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-ckvvq"]
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.327593 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-ckvvq"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.330997 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.338272 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-bhhvv"]
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.338979 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-bhhvv"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.347953 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-4g952"]
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.373612 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-ckvvq"]
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.382353 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c0eb0835-6df5-4a21-b309-f178a032d027-nmstate-lock\") pod \"nmstate-handler-bhhvv\" (UID: \"c0eb0835-6df5-4a21-b309-f178a032d027\") " pod="openshift-nmstate/nmstate-handler-bhhvv"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.382403 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c0eb0835-6df5-4a21-b309-f178a032d027-dbus-socket\") pod \"nmstate-handler-bhhvv\" (UID: \"c0eb0835-6df5-4a21-b309-f178a032d027\") " pod="openshift-nmstate/nmstate-handler-bhhvv"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.382444 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shtmz\" (UniqueName: \"kubernetes.io/projected/c0eb0835-6df5-4a21-b309-f178a032d027-kube-api-access-shtmz\") pod \"nmstate-handler-bhhvv\" (UID: \"c0eb0835-6df5-4a21-b309-f178a032d027\") " pod="openshift-nmstate/nmstate-handler-bhhvv"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.382476 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbnkc\" (UniqueName: \"kubernetes.io/projected/ce589619-7c2f-43db-ae4f-fb43be7b07f4-kube-api-access-wbnkc\") pod \"nmstate-metrics-58c85c668d-4g952\" (UID: \"ce589619-7c2f-43db-ae4f-fb43be7b07f4\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-4g952"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.382508 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/db05e782-a3d7-4cbe-be3f-f6226d894864-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-ckvvq\" (UID: \"db05e782-a3d7-4cbe-be3f-f6226d894864\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-ckvvq"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.382545 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c0eb0835-6df5-4a21-b309-f178a032d027-ovs-socket\") pod \"nmstate-handler-bhhvv\" (UID: \"c0eb0835-6df5-4a21-b309-f178a032d027\") " pod="openshift-nmstate/nmstate-handler-bhhvv"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.382591 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxtpt\" (UniqueName: \"kubernetes.io/projected/db05e782-a3d7-4cbe-be3f-f6226d894864-kube-api-access-cxtpt\") pod \"nmstate-webhook-866bcb46dc-ckvvq\" (UID: \"db05e782-a3d7-4cbe-be3f-f6226d894864\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-ckvvq"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.449413 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-kdwwx"]
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.450080 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-kdwwx"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.453537 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.453598 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-lp2rk"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.453699 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.460889 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-kdwwx"]
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.484167 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c0eb0835-6df5-4a21-b309-f178a032d027-dbus-socket\") pod \"nmstate-handler-bhhvv\" (UID: \"c0eb0835-6df5-4a21-b309-f178a032d027\") " pod="openshift-nmstate/nmstate-handler-bhhvv"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.484235 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shtmz\" (UniqueName: \"kubernetes.io/projected/c0eb0835-6df5-4a21-b309-f178a032d027-kube-api-access-shtmz\") pod \"nmstate-handler-bhhvv\" (UID: \"c0eb0835-6df5-4a21-b309-f178a032d027\") " pod="openshift-nmstate/nmstate-handler-bhhvv"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.484285 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbnkc\" (UniqueName: \"kubernetes.io/projected/ce589619-7c2f-43db-ae4f-fb43be7b07f4-kube-api-access-wbnkc\") pod \"nmstate-metrics-58c85c668d-4g952\" (UID: \"ce589619-7c2f-43db-ae4f-fb43be7b07f4\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-4g952"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.484318 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/db05e782-a3d7-4cbe-be3f-f6226d894864-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-ckvvq\" (UID: \"db05e782-a3d7-4cbe-be3f-f6226d894864\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-ckvvq"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.484373 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75vs2\" (UniqueName: \"kubernetes.io/projected/35fc682a-0cc9-4922-a2f2-60da1ddb1eb9-kube-api-access-75vs2\") pod \"nmstate-console-plugin-5c78fc5d65-kdwwx\" (UID: \"35fc682a-0cc9-4922-a2f2-60da1ddb1eb9\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-kdwwx"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.484402 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/35fc682a-0cc9-4922-a2f2-60da1ddb1eb9-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-kdwwx\" (UID: \"35fc682a-0cc9-4922-a2f2-60da1ddb1eb9\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-kdwwx"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.484457 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c0eb0835-6df5-4a21-b309-f178a032d027-ovs-socket\") pod \"nmstate-handler-bhhvv\" (UID: \"c0eb0835-6df5-4a21-b309-f178a032d027\") " pod="openshift-nmstate/nmstate-handler-bhhvv"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.484578 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c0eb0835-6df5-4a21-b309-f178a032d027-dbus-socket\") pod \"nmstate-handler-bhhvv\" (UID: \"c0eb0835-6df5-4a21-b309-f178a032d027\") " pod="openshift-nmstate/nmstate-handler-bhhvv"
Feb 19 15:22:37 crc kubenswrapper[4810]: E0219 15:22:37.484588 4810 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.484624 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c0eb0835-6df5-4a21-b309-f178a032d027-ovs-socket\") pod \"nmstate-handler-bhhvv\" (UID: \"c0eb0835-6df5-4a21-b309-f178a032d027\") " pod="openshift-nmstate/nmstate-handler-bhhvv"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.484919 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/35fc682a-0cc9-4922-a2f2-60da1ddb1eb9-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-kdwwx\" (UID: \"35fc682a-0cc9-4922-a2f2-60da1ddb1eb9\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-kdwwx"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.485027 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxtpt\" (UniqueName: \"kubernetes.io/projected/db05e782-a3d7-4cbe-be3f-f6226d894864-kube-api-access-cxtpt\") pod \"nmstate-webhook-866bcb46dc-ckvvq\" (UID: \"db05e782-a3d7-4cbe-be3f-f6226d894864\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-ckvvq"
Feb 19 15:22:37 crc kubenswrapper[4810]: E0219 15:22:37.485073 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db05e782-a3d7-4cbe-be3f-f6226d894864-tls-key-pair podName:db05e782-a3d7-4cbe-be3f-f6226d894864 nodeName:}" failed. No retries permitted until 2026-02-19 15:22:37.985050362 +0000 UTC m=+787.467080496 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/db05e782-a3d7-4cbe-be3f-f6226d894864-tls-key-pair") pod "nmstate-webhook-866bcb46dc-ckvvq" (UID: "db05e782-a3d7-4cbe-be3f-f6226d894864") : secret "openshift-nmstate-webhook" not found
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.485126 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c0eb0835-6df5-4a21-b309-f178a032d027-nmstate-lock\") pod \"nmstate-handler-bhhvv\" (UID: \"c0eb0835-6df5-4a21-b309-f178a032d027\") " pod="openshift-nmstate/nmstate-handler-bhhvv"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.485238 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c0eb0835-6df5-4a21-b309-f178a032d027-nmstate-lock\") pod \"nmstate-handler-bhhvv\" (UID: \"c0eb0835-6df5-4a21-b309-f178a032d027\") " pod="openshift-nmstate/nmstate-handler-bhhvv"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.503366 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shtmz\" (UniqueName: \"kubernetes.io/projected/c0eb0835-6df5-4a21-b309-f178a032d027-kube-api-access-shtmz\") pod \"nmstate-handler-bhhvv\" (UID: \"c0eb0835-6df5-4a21-b309-f178a032d027\") " pod="openshift-nmstate/nmstate-handler-bhhvv"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.503546 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxtpt\" (UniqueName: \"kubernetes.io/projected/db05e782-a3d7-4cbe-be3f-f6226d894864-kube-api-access-cxtpt\") pod \"nmstate-webhook-866bcb46dc-ckvvq\" (UID: \"db05e782-a3d7-4cbe-be3f-f6226d894864\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-ckvvq"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.514415 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbnkc\" (UniqueName: \"kubernetes.io/projected/ce589619-7c2f-43db-ae4f-fb43be7b07f4-kube-api-access-wbnkc\") pod \"nmstate-metrics-58c85c668d-4g952\" (UID: \"ce589619-7c2f-43db-ae4f-fb43be7b07f4\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-4g952"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.585711 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75vs2\" (UniqueName: \"kubernetes.io/projected/35fc682a-0cc9-4922-a2f2-60da1ddb1eb9-kube-api-access-75vs2\") pod \"nmstate-console-plugin-5c78fc5d65-kdwwx\" (UID: \"35fc682a-0cc9-4922-a2f2-60da1ddb1eb9\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-kdwwx"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.586032 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/35fc682a-0cc9-4922-a2f2-60da1ddb1eb9-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-kdwwx\" (UID: \"35fc682a-0cc9-4922-a2f2-60da1ddb1eb9\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-kdwwx"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.586059 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/35fc682a-0cc9-4922-a2f2-60da1ddb1eb9-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-kdwwx\" (UID: \"35fc682a-0cc9-4922-a2f2-60da1ddb1eb9\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-kdwwx"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.587228 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/35fc682a-0cc9-4922-a2f2-60da1ddb1eb9-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-kdwwx\" (UID: \"35fc682a-0cc9-4922-a2f2-60da1ddb1eb9\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-kdwwx"
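# Editor's note (Python sketch, illustrative): the nestedpendingoperations
# error above records a failure time plus durationBeforeRetry=500ms and
# refuses the same operation until that deadline passes (37.485073 + 0.5s =
# 37.985050). A minimal gate with the same observable behaviour -- the real
# kubelet code is Go and backs off progressively:
import time

class RetryGate:
    def __init__(self, delay=0.5):            # durationBeforeRetry, in seconds
        self.delay = delay
        self.not_before = 0.0

    def attempt(self, op):
        now = time.monotonic()
        if now < self.not_before:
            raise RuntimeError(f"No retries permitted for another {self.not_before - now:.3f}s")
        try:
            return op()
        except Exception:
            self.not_before = now + self.delay   # push the next attempt out
            raise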
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.589041 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/35fc682a-0cc9-4922-a2f2-60da1ddb1eb9-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-kdwwx\" (UID: \"35fc682a-0cc9-4922-a2f2-60da1ddb1eb9\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-kdwwx"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.608088 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75vs2\" (UniqueName: \"kubernetes.io/projected/35fc682a-0cc9-4922-a2f2-60da1ddb1eb9-kube-api-access-75vs2\") pod \"nmstate-console-plugin-5c78fc5d65-kdwwx\" (UID: \"35fc682a-0cc9-4922-a2f2-60da1ddb1eb9\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-kdwwx"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.623654 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-4g952"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.648061 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6748b4f7c7-98l2r"]
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.648933 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6748b4f7c7-98l2r"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.662171 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6748b4f7c7-98l2r"]
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.664479 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-bhhvv"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.686637 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/652c3d69-0b00-46b4-a0d7-752de7f222aa-service-ca\") pod \"console-6748b4f7c7-98l2r\" (UID: \"652c3d69-0b00-46b4-a0d7-752de7f222aa\") " pod="openshift-console/console-6748b4f7c7-98l2r"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.686706 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/652c3d69-0b00-46b4-a0d7-752de7f222aa-console-config\") pod \"console-6748b4f7c7-98l2r\" (UID: \"652c3d69-0b00-46b4-a0d7-752de7f222aa\") " pod="openshift-console/console-6748b4f7c7-98l2r"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.686738 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw7f7\" (UniqueName: \"kubernetes.io/projected/652c3d69-0b00-46b4-a0d7-752de7f222aa-kube-api-access-fw7f7\") pod \"console-6748b4f7c7-98l2r\" (UID: \"652c3d69-0b00-46b4-a0d7-752de7f222aa\") " pod="openshift-console/console-6748b4f7c7-98l2r"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.686768 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/652c3d69-0b00-46b4-a0d7-752de7f222aa-console-oauth-config\") pod \"console-6748b4f7c7-98l2r\" (UID: \"652c3d69-0b00-46b4-a0d7-752de7f222aa\") " pod="openshift-console/console-6748b4f7c7-98l2r"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.686790 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/652c3d69-0b00-46b4-a0d7-752de7f222aa-trusted-ca-bundle\") pod \"console-6748b4f7c7-98l2r\" (UID: \"652c3d69-0b00-46b4-a0d7-752de7f222aa\") " pod="openshift-console/console-6748b4f7c7-98l2r"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.686820 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/652c3d69-0b00-46b4-a0d7-752de7f222aa-console-serving-cert\") pod \"console-6748b4f7c7-98l2r\" (UID: \"652c3d69-0b00-46b4-a0d7-752de7f222aa\") " pod="openshift-console/console-6748b4f7c7-98l2r"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.686851 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/652c3d69-0b00-46b4-a0d7-752de7f222aa-oauth-serving-cert\") pod \"console-6748b4f7c7-98l2r\" (UID: \"652c3d69-0b00-46b4-a0d7-752de7f222aa\") " pod="openshift-console/console-6748b4f7c7-98l2r"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.765909 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-kdwwx"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.788017 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/652c3d69-0b00-46b4-a0d7-752de7f222aa-console-serving-cert\") pod \"console-6748b4f7c7-98l2r\" (UID: \"652c3d69-0b00-46b4-a0d7-752de7f222aa\") " pod="openshift-console/console-6748b4f7c7-98l2r"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.788345 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/652c3d69-0b00-46b4-a0d7-752de7f222aa-oauth-serving-cert\") pod \"console-6748b4f7c7-98l2r\" (UID: \"652c3d69-0b00-46b4-a0d7-752de7f222aa\") " pod="openshift-console/console-6748b4f7c7-98l2r"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.788401 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/652c3d69-0b00-46b4-a0d7-752de7f222aa-service-ca\") pod \"console-6748b4f7c7-98l2r\" (UID: \"652c3d69-0b00-46b4-a0d7-752de7f222aa\") " pod="openshift-console/console-6748b4f7c7-98l2r"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.788439 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/652c3d69-0b00-46b4-a0d7-752de7f222aa-console-config\") pod \"console-6748b4f7c7-98l2r\" (UID: \"652c3d69-0b00-46b4-a0d7-752de7f222aa\") " pod="openshift-console/console-6748b4f7c7-98l2r"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.788484 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw7f7\" (UniqueName: \"kubernetes.io/projected/652c3d69-0b00-46b4-a0d7-752de7f222aa-kube-api-access-fw7f7\") pod \"console-6748b4f7c7-98l2r\" (UID: \"652c3d69-0b00-46b4-a0d7-752de7f222aa\") " pod="openshift-console/console-6748b4f7c7-98l2r"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.788513 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/652c3d69-0b00-46b4-a0d7-752de7f222aa-console-oauth-config\") pod \"console-6748b4f7c7-98l2r\" (UID: \"652c3d69-0b00-46b4-a0d7-752de7f222aa\") " pod="openshift-console/console-6748b4f7c7-98l2r"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.788545 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/652c3d69-0b00-46b4-a0d7-752de7f222aa-trusted-ca-bundle\") pod \"console-6748b4f7c7-98l2r\" (UID: \"652c3d69-0b00-46b4-a0d7-752de7f222aa\") " pod="openshift-console/console-6748b4f7c7-98l2r"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.789727 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/652c3d69-0b00-46b4-a0d7-752de7f222aa-oauth-serving-cert\") pod \"console-6748b4f7c7-98l2r\" (UID: \"652c3d69-0b00-46b4-a0d7-752de7f222aa\") " pod="openshift-console/console-6748b4f7c7-98l2r"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.789948 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/652c3d69-0b00-46b4-a0d7-752de7f222aa-service-ca\") pod \"console-6748b4f7c7-98l2r\" (UID: \"652c3d69-0b00-46b4-a0d7-752de7f222aa\") " pod="openshift-console/console-6748b4f7c7-98l2r"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.791464 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/652c3d69-0b00-46b4-a0d7-752de7f222aa-console-config\") pod \"console-6748b4f7c7-98l2r\" (UID: \"652c3d69-0b00-46b4-a0d7-752de7f222aa\") " pod="openshift-console/console-6748b4f7c7-98l2r"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.791483 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/652c3d69-0b00-46b4-a0d7-752de7f222aa-trusted-ca-bundle\") pod \"console-6748b4f7c7-98l2r\" (UID: \"652c3d69-0b00-46b4-a0d7-752de7f222aa\") " pod="openshift-console/console-6748b4f7c7-98l2r"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.795032 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/652c3d69-0b00-46b4-a0d7-752de7f222aa-console-serving-cert\") pod \"console-6748b4f7c7-98l2r\" (UID: \"652c3d69-0b00-46b4-a0d7-752de7f222aa\") " pod="openshift-console/console-6748b4f7c7-98l2r"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.795764 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/652c3d69-0b00-46b4-a0d7-752de7f222aa-console-oauth-config\") pod \"console-6748b4f7c7-98l2r\" (UID: \"652c3d69-0b00-46b4-a0d7-752de7f222aa\") " pod="openshift-console/console-6748b4f7c7-98l2r"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.803640 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw7f7\" (UniqueName: \"kubernetes.io/projected/652c3d69-0b00-46b4-a0d7-752de7f222aa-kube-api-access-fw7f7\") pod \"console-6748b4f7c7-98l2r\" (UID: \"652c3d69-0b00-46b4-a0d7-752de7f222aa\") " pod="openshift-console/console-6748b4f7c7-98l2r"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.877014 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-bhhvv" event={"ID":"c0eb0835-6df5-4a21-b309-f178a032d027","Type":"ContainerStarted","Data":"3a2c5b0a2db448eec40a84472f94f2c5eb429dcca9879fce0a98035848292174"}
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.943507 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-kdwwx"]
Feb 19 15:22:37 crc kubenswrapper[4810]: W0219 15:22:37.949614 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35fc682a_0cc9_4922_a2f2_60da1ddb1eb9.slice/crio-b859ff50f0ab64c1c062a20f499cf15998f09e60a2f0007efdcb544b17f21d7c WatchSource:0}: Error finding container b859ff50f0ab64c1c062a20f499cf15998f09e60a2f0007efdcb544b17f21d7c: Status 404 returned error can't find the container with id b859ff50f0ab64c1c062a20f499cf15998f09e60a2f0007efdcb544b17f21d7c
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.991080 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/db05e782-a3d7-4cbe-be3f-f6226d894864-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-ckvvq\" (UID: \"db05e782-a3d7-4cbe-be3f-f6226d894864\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-ckvvq"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.991669 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6748b4f7c7-98l2r"
Feb 19 15:22:37 crc kubenswrapper[4810]: I0219 15:22:37.994391 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/db05e782-a3d7-4cbe-be3f-f6226d894864-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-ckvvq\" (UID: \"db05e782-a3d7-4cbe-be3f-f6226d894864\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-ckvvq"
Feb 19 15:22:38 crc kubenswrapper[4810]: I0219 15:22:38.087348 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-4g952"]
Feb 19 15:22:38 crc kubenswrapper[4810]: W0219 15:22:38.094938 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce589619_7c2f_43db_ae4f_fb43be7b07f4.slice/crio-91edc095f5fdd489f1314a6a7986527e74d1fe6916196880ba67c61055949b21 WatchSource:0}: Error finding container 91edc095f5fdd489f1314a6a7986527e74d1fe6916196880ba67c61055949b21: Status 404 returned error can't find the container with id 91edc095f5fdd489f1314a6a7986527e74d1fe6916196880ba67c61055949b21
Feb 19 15:22:38 crc kubenswrapper[4810]: I0219 15:22:38.174228 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6748b4f7c7-98l2r"]
Feb 19 15:22:38 crc kubenswrapper[4810]: W0219 15:22:38.179555 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod652c3d69_0b00_46b4_a0d7_752de7f222aa.slice/crio-c028658fc40979a28aacaf03d89e9437caaf30d64d412dcca542628fc9fe1e8d WatchSource:0}: Error finding container c028658fc40979a28aacaf03d89e9437caaf30d64d412dcca542628fc9fe1e8d: Status 404 returned error can't find the container with id c028658fc40979a28aacaf03d89e9437caaf30d64d412dcca542628fc9fe1e8d
Feb 19 15:22:38 crc kubenswrapper[4810]: I0219 15:22:38.245474 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-ckvvq"
Feb 19 15:22:38 crc kubenswrapper[4810]: I0219 15:22:38.662593 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-ckvvq"]
Feb 19 15:22:38 crc kubenswrapper[4810]: I0219 15:22:38.799036 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2mft7"
Feb 19 15:22:38 crc kubenswrapper[4810]: I0219 15:22:38.850009 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2mft7"
Feb 19 15:22:38 crc kubenswrapper[4810]: I0219 15:22:38.888317 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-kdwwx" event={"ID":"35fc682a-0cc9-4922-a2f2-60da1ddb1eb9","Type":"ContainerStarted","Data":"b859ff50f0ab64c1c062a20f499cf15998f09e60a2f0007efdcb544b17f21d7c"}
Feb 19 15:22:38 crc kubenswrapper[4810]: I0219 15:22:38.889835 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-ckvvq" event={"ID":"db05e782-a3d7-4cbe-be3f-f6226d894864","Type":"ContainerStarted","Data":"683e3f66aa41271ffb4a031d473334039956ab097ea62cc75f22695b9ff0d6ba"}
Feb 19 15:22:38 crc kubenswrapper[4810]: I0219 15:22:38.891157 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6748b4f7c7-98l2r" event={"ID":"652c3d69-0b00-46b4-a0d7-752de7f222aa","Type":"ContainerStarted","Data":"297b1e417d23fdaad5a457fea57406025723128b0f338f99f0d3982125ef9eba"}
Feb 19 15:22:38 crc kubenswrapper[4810]: I0219 15:22:38.891193 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6748b4f7c7-98l2r" event={"ID":"652c3d69-0b00-46b4-a0d7-752de7f222aa","Type":"ContainerStarted","Data":"c028658fc40979a28aacaf03d89e9437caaf30d64d412dcca542628fc9fe1e8d"}
Feb 19 15:22:38 crc kubenswrapper[4810]: I0219 15:22:38.893494 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-4g952" event={"ID":"ce589619-7c2f-43db-ae4f-fb43be7b07f4","Type":"ContainerStarted","Data":"91edc095f5fdd489f1314a6a7986527e74d1fe6916196880ba67c61055949b21"}
Feb 19 15:22:38 crc kubenswrapper[4810]: I0219 15:22:38.919195 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6748b4f7c7-98l2r" podStartSLOduration=1.919171703 podStartE2EDuration="1.919171703s" podCreationTimestamp="2026-02-19 15:22:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:22:38.906912941 +0000 UTC m=+788.388943105" watchObservedRunningTime="2026-02-19 15:22:38.919171703 +0000 UTC m=+788.401201827"
Feb 19 15:22:39 crc kubenswrapper[4810]: I0219 15:22:39.038351 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2mft7"]
Feb 19 15:22:39 crc kubenswrapper[4810]: I0219 15:22:39.906899 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2mft7" podUID="132beac7-bb25-4b52-863e-0a113dc6799b" containerName="registry-server" containerID="cri-o://50094d034a84c3ff10e55fa2dc439e3f2fd43638ff98866ab58aa48ac171229c" gracePeriod=2
Feb 19 15:22:40 crc kubenswrapper[4810]: I0219 15:22:40.601164 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2mft7"
Feb 19 15:22:40 crc kubenswrapper[4810]: I0219 15:22:40.729584 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/132beac7-bb25-4b52-863e-0a113dc6799b-catalog-content\") pod \"132beac7-bb25-4b52-863e-0a113dc6799b\" (UID: \"132beac7-bb25-4b52-863e-0a113dc6799b\") "
Feb 19 15:22:40 crc kubenswrapper[4810]: I0219 15:22:40.729644 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/132beac7-bb25-4b52-863e-0a113dc6799b-utilities\") pod \"132beac7-bb25-4b52-863e-0a113dc6799b\" (UID: \"132beac7-bb25-4b52-863e-0a113dc6799b\") "
Feb 19 15:22:40 crc kubenswrapper[4810]: I0219 15:22:40.729694 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbj82\" (UniqueName: \"kubernetes.io/projected/132beac7-bb25-4b52-863e-0a113dc6799b-kube-api-access-wbj82\") pod \"132beac7-bb25-4b52-863e-0a113dc6799b\" (UID: \"132beac7-bb25-4b52-863e-0a113dc6799b\") "
Feb 19 15:22:40 crc kubenswrapper[4810]: I0219 15:22:40.730684 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/132beac7-bb25-4b52-863e-0a113dc6799b-utilities" (OuterVolumeSpecName: "utilities") pod "132beac7-bb25-4b52-863e-0a113dc6799b" (UID: "132beac7-bb25-4b52-863e-0a113dc6799b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 15:22:40 crc kubenswrapper[4810]: I0219 15:22:40.733872 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/132beac7-bb25-4b52-863e-0a113dc6799b-kube-api-access-wbj82" (OuterVolumeSpecName: "kube-api-access-wbj82") pod "132beac7-bb25-4b52-863e-0a113dc6799b" (UID: "132beac7-bb25-4b52-863e-0a113dc6799b"). InnerVolumeSpecName "kube-api-access-wbj82". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:22:40 crc kubenswrapper[4810]: I0219 15:22:40.830653 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/132beac7-bb25-4b52-863e-0a113dc6799b-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 15:22:40 crc kubenswrapper[4810]: I0219 15:22:40.830947 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbj82\" (UniqueName: \"kubernetes.io/projected/132beac7-bb25-4b52-863e-0a113dc6799b-kube-api-access-wbj82\") on node \"crc\" DevicePath \"\""
Feb 19 15:22:40 crc kubenswrapper[4810]: I0219 15:22:40.854447 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/132beac7-bb25-4b52-863e-0a113dc6799b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "132beac7-bb25-4b52-863e-0a113dc6799b" (UID: "132beac7-bb25-4b52-863e-0a113dc6799b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 15:22:40 crc kubenswrapper[4810]: I0219 15:22:40.914863 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-4g952" event={"ID":"ce589619-7c2f-43db-ae4f-fb43be7b07f4","Type":"ContainerStarted","Data":"4f64332aafd522674ad097a4a805656cde7bc8e23c7de594c7efe16720e0256c"}
Feb 19 15:22:40 crc kubenswrapper[4810]: I0219 15:22:40.916639 4810 generic.go:334] "Generic (PLEG): container finished" podID="132beac7-bb25-4b52-863e-0a113dc6799b" containerID="50094d034a84c3ff10e55fa2dc439e3f2fd43638ff98866ab58aa48ac171229c" exitCode=0
Feb 19 15:22:40 crc kubenswrapper[4810]: I0219 15:22:40.916668 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mft7" event={"ID":"132beac7-bb25-4b52-863e-0a113dc6799b","Type":"ContainerDied","Data":"50094d034a84c3ff10e55fa2dc439e3f2fd43638ff98866ab58aa48ac171229c"}
Feb 19 15:22:40 crc kubenswrapper[4810]: I0219 15:22:40.916697 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mft7" event={"ID":"132beac7-bb25-4b52-863e-0a113dc6799b","Type":"ContainerDied","Data":"13841f7eec8b89d2fb39be72b7796fc00732025d2bc4c95c5e8d5b290f1c7821"}
Feb 19 15:22:40 crc kubenswrapper[4810]: I0219 15:22:40.916719 4810 scope.go:117] "RemoveContainer" containerID="50094d034a84c3ff10e55fa2dc439e3f2fd43638ff98866ab58aa48ac171229c"
Feb 19 15:22:40 crc kubenswrapper[4810]: I0219 15:22:40.916727 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2mft7"
Feb 19 15:22:40 crc kubenswrapper[4810]: I0219 15:22:40.920856 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-bhhvv" event={"ID":"c0eb0835-6df5-4a21-b309-f178a032d027","Type":"ContainerStarted","Data":"1cd51c9297a58b9b01ee7ad33c1f12591f770b85f4b30d5cc206be5a7fd5e422"}
Feb 19 15:22:40 crc kubenswrapper[4810]: I0219 15:22:40.921564 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-bhhvv"
Feb 19 15:22:40 crc kubenswrapper[4810]: I0219 15:22:40.924900 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-kdwwx" event={"ID":"35fc682a-0cc9-4922-a2f2-60da1ddb1eb9","Type":"ContainerStarted","Data":"13741502a9cb047e876cde5c8b5c953c930ebb28b193ad72aab1ab20875477e3"}
Feb 19 15:22:40 crc kubenswrapper[4810]: I0219 15:22:40.925037 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-ckvvq" event={"ID":"db05e782-a3d7-4cbe-be3f-f6226d894864","Type":"ContainerStarted","Data":"f18ec1c07060a2de47dafe6583ef0b98dceb7586fd9e2945c09eafd1d25b9a40"}
Feb 19 15:22:40 crc kubenswrapper[4810]: I0219 15:22:40.925194 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-ckvvq"
Feb 19 15:22:40 crc kubenswrapper[4810]: I0219 15:22:40.931918 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/132beac7-bb25-4b52-863e-0a113dc6799b-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 15:22:40 crc kubenswrapper[4810]: I0219 15:22:40.938763 4810 scope.go:117] "RemoveContainer" containerID="585ee91571796fba78ba8e4f41233d19b09ebef57aa467f35d92b53d4fb2aa16"
Feb 19 15:22:40 crc kubenswrapper[4810]: I0219 15:22:40.943965 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-bhhvv" podStartSLOduration=1.02416616 podStartE2EDuration="3.943936674s" podCreationTimestamp="2026-02-19 15:22:37 +0000 UTC" firstStartedPulling="2026-02-19 15:22:37.698706686 +0000 UTC m=+787.180736810" lastFinishedPulling="2026-02-19 15:22:40.6184772 +0000 UTC m=+790.100507324" observedRunningTime="2026-02-19 15:22:40.937784532 +0000 UTC m=+790.419814696" watchObservedRunningTime="2026-02-19 15:22:40.943936674 +0000 UTC m=+790.425966848"
"Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-bhhvv" podStartSLOduration=1.02416616 podStartE2EDuration="3.943936674s" podCreationTimestamp="2026-02-19 15:22:37 +0000 UTC" firstStartedPulling="2026-02-19 15:22:37.698706686 +0000 UTC m=+787.180736810" lastFinishedPulling="2026-02-19 15:22:40.6184772 +0000 UTC m=+790.100507324" observedRunningTime="2026-02-19 15:22:40.937784532 +0000 UTC m=+790.419814696" watchObservedRunningTime="2026-02-19 15:22:40.943936674 +0000 UTC m=+790.425966848" Feb 19 15:22:40 crc kubenswrapper[4810]: I0219 15:22:40.954821 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-kdwwx" podStartSLOduration=1.287712536 podStartE2EDuration="3.954800042s" podCreationTimestamp="2026-02-19 15:22:37 +0000 UTC" firstStartedPulling="2026-02-19 15:22:37.951517197 +0000 UTC m=+787.433547321" lastFinishedPulling="2026-02-19 15:22:40.618604703 +0000 UTC m=+790.100634827" observedRunningTime="2026-02-19 15:22:40.953978992 +0000 UTC m=+790.436009166" watchObservedRunningTime="2026-02-19 15:22:40.954800042 +0000 UTC m=+790.436830196" Feb 19 15:22:40 crc kubenswrapper[4810]: I0219 15:22:40.977740 4810 scope.go:117] "RemoveContainer" containerID="b5538334e24ef6c83330097d1f5b71ee7b748811fc18ccd063319fa24e2faa21" Feb 19 15:22:40 crc kubenswrapper[4810]: I0219 15:22:40.993860 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-ckvvq" podStartSLOduration=2.04986959 podStartE2EDuration="3.993841926s" podCreationTimestamp="2026-02-19 15:22:37 +0000 UTC" firstStartedPulling="2026-02-19 15:22:38.676972285 +0000 UTC m=+788.159002409" lastFinishedPulling="2026-02-19 15:22:40.620944601 +0000 UTC m=+790.102974745" observedRunningTime="2026-02-19 15:22:40.97820116 +0000 UTC m=+790.460231294" watchObservedRunningTime="2026-02-19 15:22:40.993841926 +0000 UTC m=+790.475872060" Feb 19 15:22:40 crc kubenswrapper[4810]: I0219 15:22:40.995997 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2mft7"] Feb 19 15:22:41 crc kubenswrapper[4810]: I0219 15:22:41.001274 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2mft7"] Feb 19 15:22:41 crc kubenswrapper[4810]: I0219 15:22:41.013828 4810 scope.go:117] "RemoveContainer" containerID="50094d034a84c3ff10e55fa2dc439e3f2fd43638ff98866ab58aa48ac171229c" Feb 19 15:22:41 crc kubenswrapper[4810]: E0219 15:22:41.014495 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50094d034a84c3ff10e55fa2dc439e3f2fd43638ff98866ab58aa48ac171229c\": container with ID starting with 50094d034a84c3ff10e55fa2dc439e3f2fd43638ff98866ab58aa48ac171229c not found: ID does not exist" containerID="50094d034a84c3ff10e55fa2dc439e3f2fd43638ff98866ab58aa48ac171229c" Feb 19 15:22:41 crc kubenswrapper[4810]: I0219 15:22:41.014742 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50094d034a84c3ff10e55fa2dc439e3f2fd43638ff98866ab58aa48ac171229c"} err="failed to get container status \"50094d034a84c3ff10e55fa2dc439e3f2fd43638ff98866ab58aa48ac171229c\": rpc error: code = NotFound desc = could not find container \"50094d034a84c3ff10e55fa2dc439e3f2fd43638ff98866ab58aa48ac171229c\": container with ID starting with 50094d034a84c3ff10e55fa2dc439e3f2fd43638ff98866ab58aa48ac171229c not found: ID does not exist" Feb 19 
15:22:41 crc kubenswrapper[4810]: I0219 15:22:41.014964 4810 scope.go:117] "RemoveContainer" containerID="585ee91571796fba78ba8e4f41233d19b09ebef57aa467f35d92b53d4fb2aa16" Feb 19 15:22:41 crc kubenswrapper[4810]: E0219 15:22:41.015543 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"585ee91571796fba78ba8e4f41233d19b09ebef57aa467f35d92b53d4fb2aa16\": container with ID starting with 585ee91571796fba78ba8e4f41233d19b09ebef57aa467f35d92b53d4fb2aa16 not found: ID does not exist" containerID="585ee91571796fba78ba8e4f41233d19b09ebef57aa467f35d92b53d4fb2aa16" Feb 19 15:22:41 crc kubenswrapper[4810]: I0219 15:22:41.015582 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"585ee91571796fba78ba8e4f41233d19b09ebef57aa467f35d92b53d4fb2aa16"} err="failed to get container status \"585ee91571796fba78ba8e4f41233d19b09ebef57aa467f35d92b53d4fb2aa16\": rpc error: code = NotFound desc = could not find container \"585ee91571796fba78ba8e4f41233d19b09ebef57aa467f35d92b53d4fb2aa16\": container with ID starting with 585ee91571796fba78ba8e4f41233d19b09ebef57aa467f35d92b53d4fb2aa16 not found: ID does not exist" Feb 19 15:22:41 crc kubenswrapper[4810]: I0219 15:22:41.015609 4810 scope.go:117] "RemoveContainer" containerID="b5538334e24ef6c83330097d1f5b71ee7b748811fc18ccd063319fa24e2faa21" Feb 19 15:22:41 crc kubenswrapper[4810]: E0219 15:22:41.016371 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5538334e24ef6c83330097d1f5b71ee7b748811fc18ccd063319fa24e2faa21\": container with ID starting with b5538334e24ef6c83330097d1f5b71ee7b748811fc18ccd063319fa24e2faa21 not found: ID does not exist" containerID="b5538334e24ef6c83330097d1f5b71ee7b748811fc18ccd063319fa24e2faa21" Feb 19 15:22:41 crc kubenswrapper[4810]: I0219 15:22:41.016418 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5538334e24ef6c83330097d1f5b71ee7b748811fc18ccd063319fa24e2faa21"} err="failed to get container status \"b5538334e24ef6c83330097d1f5b71ee7b748811fc18ccd063319fa24e2faa21\": rpc error: code = NotFound desc = could not find container \"b5538334e24ef6c83330097d1f5b71ee7b748811fc18ccd063319fa24e2faa21\": container with ID starting with b5538334e24ef6c83330097d1f5b71ee7b748811fc18ccd063319fa24e2faa21 not found: ID does not exist" Feb 19 15:22:41 crc kubenswrapper[4810]: I0219 15:22:41.460685 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="132beac7-bb25-4b52-863e-0a113dc6799b" path="/var/lib/kubelet/pods/132beac7-bb25-4b52-863e-0a113dc6799b/volumes" Feb 19 15:22:43 crc kubenswrapper[4810]: I0219 15:22:43.951928 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-4g952" event={"ID":"ce589619-7c2f-43db-ae4f-fb43be7b07f4","Type":"ContainerStarted","Data":"5643a79363cc749a73847da14fca76d6e3b9082bb426d8952b68f72c613cc7a7"} Feb 19 15:22:43 crc kubenswrapper[4810]: I0219 15:22:43.982172 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-4g952" podStartSLOduration=1.942382976 podStartE2EDuration="6.982148612s" podCreationTimestamp="2026-02-19 15:22:37 +0000 UTC" firstStartedPulling="2026-02-19 15:22:38.096837004 +0000 UTC m=+787.578867148" lastFinishedPulling="2026-02-19 15:22:43.13660262 +0000 UTC m=+792.618632784" observedRunningTime="2026-02-19 
15:22:43.978133533 +0000 UTC m=+793.460163697" watchObservedRunningTime="2026-02-19 15:22:43.982148612 +0000 UTC m=+793.464178736" Feb 19 15:22:47 crc kubenswrapper[4810]: I0219 15:22:47.700230 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-bhhvv" Feb 19 15:22:47 crc kubenswrapper[4810]: I0219 15:22:47.992251 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6748b4f7c7-98l2r" Feb 19 15:22:47 crc kubenswrapper[4810]: I0219 15:22:47.992354 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6748b4f7c7-98l2r" Feb 19 15:22:48 crc kubenswrapper[4810]: I0219 15:22:48.000194 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6748b4f7c7-98l2r" Feb 19 15:22:48 crc kubenswrapper[4810]: I0219 15:22:48.996948 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6748b4f7c7-98l2r" Feb 19 15:22:49 crc kubenswrapper[4810]: I0219 15:22:49.079798 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-4hddt"] Feb 19 15:22:58 crc kubenswrapper[4810]: I0219 15:22:58.251140 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-ckvvq" Feb 19 15:23:11 crc kubenswrapper[4810]: I0219 15:23:11.946232 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb"] Feb 19 15:23:11 crc kubenswrapper[4810]: E0219 15:23:11.948464 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="132beac7-bb25-4b52-863e-0a113dc6799b" containerName="extract-content" Feb 19 15:23:11 crc kubenswrapper[4810]: I0219 15:23:11.948478 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="132beac7-bb25-4b52-863e-0a113dc6799b" containerName="extract-content" Feb 19 15:23:11 crc kubenswrapper[4810]: E0219 15:23:11.948506 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="132beac7-bb25-4b52-863e-0a113dc6799b" containerName="extract-utilities" Feb 19 15:23:11 crc kubenswrapper[4810]: I0219 15:23:11.948513 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="132beac7-bb25-4b52-863e-0a113dc6799b" containerName="extract-utilities" Feb 19 15:23:11 crc kubenswrapper[4810]: E0219 15:23:11.948531 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="132beac7-bb25-4b52-863e-0a113dc6799b" containerName="registry-server" Feb 19 15:23:11 crc kubenswrapper[4810]: I0219 15:23:11.948537 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="132beac7-bb25-4b52-863e-0a113dc6799b" containerName="registry-server" Feb 19 15:23:11 crc kubenswrapper[4810]: I0219 15:23:11.948648 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="132beac7-bb25-4b52-863e-0a113dc6799b" containerName="registry-server" Feb 19 15:23:11 crc kubenswrapper[4810]: I0219 15:23:11.949393 4810 util.go:30] "No sandbox for pod can be found. 
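# Editor's note (Python sketch, illustrative): the RemoveContainer /
# "ContainerStatus from runtime service failed ... NotFound" pairs above show
# cleanup retried against a container that is already gone. Treating NotFound
# as success keeps the removal idempotent; a minimal version of that policy,
# with a hypothetical runtime client standing in for the CRI:
class NotFound(Exception):
    pass

def remove_container(runtime, container_id):
    try:
        runtime.container_status(container_id)   # hypothetical CRI-like call
    except NotFound:
        return "already removed"                 # NotFound == nothing left to do
    runtime.remove(container_id)                 # hypothetical CRI-like call
    return "removed"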
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb" Feb 19 15:23:11 crc kubenswrapper[4810]: I0219 15:23:11.953149 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 19 15:23:11 crc kubenswrapper[4810]: I0219 15:23:11.958027 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb"] Feb 19 15:23:11 crc kubenswrapper[4810]: I0219 15:23:11.984954 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a861f8a3-be34-4fc0-96cb-42502d0a3bab-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb\" (UID: \"a861f8a3-be34-4fc0-96cb-42502d0a3bab\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb" Feb 19 15:23:11 crc kubenswrapper[4810]: I0219 15:23:11.985013 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a861f8a3-be34-4fc0-96cb-42502d0a3bab-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb\" (UID: \"a861f8a3-be34-4fc0-96cb-42502d0a3bab\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb" Feb 19 15:23:11 crc kubenswrapper[4810]: I0219 15:23:11.985047 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bpbt\" (UniqueName: \"kubernetes.io/projected/a861f8a3-be34-4fc0-96cb-42502d0a3bab-kube-api-access-6bpbt\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb\" (UID: \"a861f8a3-be34-4fc0-96cb-42502d0a3bab\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb" Feb 19 15:23:12 crc kubenswrapper[4810]: I0219 15:23:12.086624 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a861f8a3-be34-4fc0-96cb-42502d0a3bab-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb\" (UID: \"a861f8a3-be34-4fc0-96cb-42502d0a3bab\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb" Feb 19 15:23:12 crc kubenswrapper[4810]: I0219 15:23:12.086924 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bpbt\" (UniqueName: \"kubernetes.io/projected/a861f8a3-be34-4fc0-96cb-42502d0a3bab-kube-api-access-6bpbt\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb\" (UID: \"a861f8a3-be34-4fc0-96cb-42502d0a3bab\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb" Feb 19 15:23:12 crc kubenswrapper[4810]: I0219 15:23:12.087053 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a861f8a3-be34-4fc0-96cb-42502d0a3bab-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb\" (UID: \"a861f8a3-be34-4fc0-96cb-42502d0a3bab\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb" Feb 19 15:23:12 crc kubenswrapper[4810]: I0219 15:23:12.087357 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/a861f8a3-be34-4fc0-96cb-42502d0a3bab-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb\" (UID: \"a861f8a3-be34-4fc0-96cb-42502d0a3bab\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb" Feb 19 15:23:12 crc kubenswrapper[4810]: I0219 15:23:12.087447 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a861f8a3-be34-4fc0-96cb-42502d0a3bab-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb\" (UID: \"a861f8a3-be34-4fc0-96cb-42502d0a3bab\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb" Feb 19 15:23:12 crc kubenswrapper[4810]: I0219 15:23:12.135208 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bpbt\" (UniqueName: \"kubernetes.io/projected/a861f8a3-be34-4fc0-96cb-42502d0a3bab-kube-api-access-6bpbt\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb\" (UID: \"a861f8a3-be34-4fc0-96cb-42502d0a3bab\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb" Feb 19 15:23:12 crc kubenswrapper[4810]: I0219 15:23:12.267281 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb" Feb 19 15:23:12 crc kubenswrapper[4810]: I0219 15:23:12.717608 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb"] Feb 19 15:23:13 crc kubenswrapper[4810]: I0219 15:23:13.176027 4810 generic.go:334] "Generic (PLEG): container finished" podID="a861f8a3-be34-4fc0-96cb-42502d0a3bab" containerID="9b2fd4a0249047278a14bf1d594fa14cb2acf71a341fef933ac568cfa7ce7a75" exitCode=0 Feb 19 15:23:13 crc kubenswrapper[4810]: I0219 15:23:13.176091 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb" event={"ID":"a861f8a3-be34-4fc0-96cb-42502d0a3bab","Type":"ContainerDied","Data":"9b2fd4a0249047278a14bf1d594fa14cb2acf71a341fef933ac568cfa7ce7a75"} Feb 19 15:23:13 crc kubenswrapper[4810]: I0219 15:23:13.176409 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb" event={"ID":"a861f8a3-be34-4fc0-96cb-42502d0a3bab","Type":"ContainerStarted","Data":"274e568a39b3ec5f5262ddb4db28a79855f75a688c00d404905429361e70fd1e"} Feb 19 15:23:14 crc kubenswrapper[4810]: I0219 15:23:14.147079 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-4hddt" podUID="362cd55c-b576-44bd-843c-078bf26b3b1e" containerName="console" containerID="cri-o://f08283b34555217bd3055e9709afdf48fe7bc7bac3a42fcf9d68868beda18106" gracePeriod=15 Feb 19 15:23:14 crc kubenswrapper[4810]: I0219 15:23:14.608833 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-4hddt_362cd55c-b576-44bd-843c-078bf26b3b1e/console/0.log" Feb 19 15:23:14 crc kubenswrapper[4810]: I0219 15:23:14.608901 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-4hddt" Feb 19 15:23:14 crc kubenswrapper[4810]: I0219 15:23:14.625257 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/362cd55c-b576-44bd-843c-078bf26b3b1e-console-serving-cert\") pod \"362cd55c-b576-44bd-843c-078bf26b3b1e\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " Feb 19 15:23:14 crc kubenswrapper[4810]: I0219 15:23:14.625349 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/362cd55c-b576-44bd-843c-078bf26b3b1e-service-ca\") pod \"362cd55c-b576-44bd-843c-078bf26b3b1e\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " Feb 19 15:23:14 crc kubenswrapper[4810]: I0219 15:23:14.625383 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/362cd55c-b576-44bd-843c-078bf26b3b1e-oauth-serving-cert\") pod \"362cd55c-b576-44bd-843c-078bf26b3b1e\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " Feb 19 15:23:14 crc kubenswrapper[4810]: I0219 15:23:14.625438 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/362cd55c-b576-44bd-843c-078bf26b3b1e-console-oauth-config\") pod \"362cd55c-b576-44bd-843c-078bf26b3b1e\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " Feb 19 15:23:14 crc kubenswrapper[4810]: I0219 15:23:14.625485 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnqb7\" (UniqueName: \"kubernetes.io/projected/362cd55c-b576-44bd-843c-078bf26b3b1e-kube-api-access-pnqb7\") pod \"362cd55c-b576-44bd-843c-078bf26b3b1e\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " Feb 19 15:23:14 crc kubenswrapper[4810]: I0219 15:23:14.625528 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/362cd55c-b576-44bd-843c-078bf26b3b1e-trusted-ca-bundle\") pod \"362cd55c-b576-44bd-843c-078bf26b3b1e\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " Feb 19 15:23:14 crc kubenswrapper[4810]: I0219 15:23:14.625573 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/362cd55c-b576-44bd-843c-078bf26b3b1e-console-config\") pod \"362cd55c-b576-44bd-843c-078bf26b3b1e\" (UID: \"362cd55c-b576-44bd-843c-078bf26b3b1e\") " Feb 19 15:23:14 crc kubenswrapper[4810]: I0219 15:23:14.626369 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/362cd55c-b576-44bd-843c-078bf26b3b1e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "362cd55c-b576-44bd-843c-078bf26b3b1e" (UID: "362cd55c-b576-44bd-843c-078bf26b3b1e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:23:14 crc kubenswrapper[4810]: I0219 15:23:14.626413 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/362cd55c-b576-44bd-843c-078bf26b3b1e-service-ca" (OuterVolumeSpecName: "service-ca") pod "362cd55c-b576-44bd-843c-078bf26b3b1e" (UID: "362cd55c-b576-44bd-843c-078bf26b3b1e"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:23:14 crc kubenswrapper[4810]: I0219 15:23:14.626611 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/362cd55c-b576-44bd-843c-078bf26b3b1e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "362cd55c-b576-44bd-843c-078bf26b3b1e" (UID: "362cd55c-b576-44bd-843c-078bf26b3b1e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:23:14 crc kubenswrapper[4810]: I0219 15:23:14.626920 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/362cd55c-b576-44bd-843c-078bf26b3b1e-console-config" (OuterVolumeSpecName: "console-config") pod "362cd55c-b576-44bd-843c-078bf26b3b1e" (UID: "362cd55c-b576-44bd-843c-078bf26b3b1e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:23:14 crc kubenswrapper[4810]: I0219 15:23:14.631161 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/362cd55c-b576-44bd-843c-078bf26b3b1e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "362cd55c-b576-44bd-843c-078bf26b3b1e" (UID: "362cd55c-b576-44bd-843c-078bf26b3b1e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:23:14 crc kubenswrapper[4810]: I0219 15:23:14.633072 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/362cd55c-b576-44bd-843c-078bf26b3b1e-kube-api-access-pnqb7" (OuterVolumeSpecName: "kube-api-access-pnqb7") pod "362cd55c-b576-44bd-843c-078bf26b3b1e" (UID: "362cd55c-b576-44bd-843c-078bf26b3b1e"). InnerVolumeSpecName "kube-api-access-pnqb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:23:14 crc kubenswrapper[4810]: I0219 15:23:14.639415 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/362cd55c-b576-44bd-843c-078bf26b3b1e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "362cd55c-b576-44bd-843c-078bf26b3b1e" (UID: "362cd55c-b576-44bd-843c-078bf26b3b1e"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:23:14 crc kubenswrapper[4810]: I0219 15:23:14.727622 4810 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/362cd55c-b576-44bd-843c-078bf26b3b1e-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:23:14 crc kubenswrapper[4810]: I0219 15:23:14.727675 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnqb7\" (UniqueName: \"kubernetes.io/projected/362cd55c-b576-44bd-843c-078bf26b3b1e-kube-api-access-pnqb7\") on node \"crc\" DevicePath \"\"" Feb 19 15:23:14 crc kubenswrapper[4810]: I0219 15:23:14.727695 4810 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/362cd55c-b576-44bd-843c-078bf26b3b1e-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:23:14 crc kubenswrapper[4810]: I0219 15:23:14.727712 4810 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/362cd55c-b576-44bd-843c-078bf26b3b1e-console-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:23:14 crc kubenswrapper[4810]: I0219 15:23:14.727730 4810 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/362cd55c-b576-44bd-843c-078bf26b3b1e-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:23:14 crc kubenswrapper[4810]: I0219 15:23:14.727748 4810 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/362cd55c-b576-44bd-843c-078bf26b3b1e-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 15:23:14 crc kubenswrapper[4810]: I0219 15:23:14.727763 4810 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/362cd55c-b576-44bd-843c-078bf26b3b1e-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 15:23:15 crc kubenswrapper[4810]: I0219 15:23:15.198517 4810 generic.go:334] "Generic (PLEG): container finished" podID="a861f8a3-be34-4fc0-96cb-42502d0a3bab" containerID="b6e2bcd802e63f5a2e306f8014614133f88f2a3f2f3d28a5c3f931f341252672" exitCode=0 Feb 19 15:23:15 crc kubenswrapper[4810]: I0219 15:23:15.198676 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb" event={"ID":"a861f8a3-be34-4fc0-96cb-42502d0a3bab","Type":"ContainerDied","Data":"b6e2bcd802e63f5a2e306f8014614133f88f2a3f2f3d28a5c3f931f341252672"} Feb 19 15:23:15 crc kubenswrapper[4810]: I0219 15:23:15.202015 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-4hddt_362cd55c-b576-44bd-843c-078bf26b3b1e/console/0.log" Feb 19 15:23:15 crc kubenswrapper[4810]: I0219 15:23:15.202102 4810 generic.go:334] "Generic (PLEG): container finished" podID="362cd55c-b576-44bd-843c-078bf26b3b1e" containerID="f08283b34555217bd3055e9709afdf48fe7bc7bac3a42fcf9d68868beda18106" exitCode=2 Feb 19 15:23:15 crc kubenswrapper[4810]: I0219 15:23:15.202156 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4hddt" event={"ID":"362cd55c-b576-44bd-843c-078bf26b3b1e","Type":"ContainerDied","Data":"f08283b34555217bd3055e9709afdf48fe7bc7bac3a42fcf9d68868beda18106"} Feb 19 15:23:15 crc kubenswrapper[4810]: I0219 15:23:15.202240 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4hddt" 
event={"ID":"362cd55c-b576-44bd-843c-078bf26b3b1e","Type":"ContainerDied","Data":"a1d8a2975e22eb56e23640790355f60287c10a0504259d614d431ce0dc78edbb"} Feb 19 15:23:15 crc kubenswrapper[4810]: I0219 15:23:15.202242 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4hddt" Feb 19 15:23:15 crc kubenswrapper[4810]: I0219 15:23:15.202278 4810 scope.go:117] "RemoveContainer" containerID="f08283b34555217bd3055e9709afdf48fe7bc7bac3a42fcf9d68868beda18106" Feb 19 15:23:15 crc kubenswrapper[4810]: I0219 15:23:15.241642 4810 scope.go:117] "RemoveContainer" containerID="f08283b34555217bd3055e9709afdf48fe7bc7bac3a42fcf9d68868beda18106" Feb 19 15:23:15 crc kubenswrapper[4810]: E0219 15:23:15.242516 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f08283b34555217bd3055e9709afdf48fe7bc7bac3a42fcf9d68868beda18106\": container with ID starting with f08283b34555217bd3055e9709afdf48fe7bc7bac3a42fcf9d68868beda18106 not found: ID does not exist" containerID="f08283b34555217bd3055e9709afdf48fe7bc7bac3a42fcf9d68868beda18106" Feb 19 15:23:15 crc kubenswrapper[4810]: I0219 15:23:15.242799 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f08283b34555217bd3055e9709afdf48fe7bc7bac3a42fcf9d68868beda18106"} err="failed to get container status \"f08283b34555217bd3055e9709afdf48fe7bc7bac3a42fcf9d68868beda18106\": rpc error: code = NotFound desc = could not find container \"f08283b34555217bd3055e9709afdf48fe7bc7bac3a42fcf9d68868beda18106\": container with ID starting with f08283b34555217bd3055e9709afdf48fe7bc7bac3a42fcf9d68868beda18106 not found: ID does not exist" Feb 19 15:23:15 crc kubenswrapper[4810]: I0219 15:23:15.260794 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-4hddt"] Feb 19 15:23:15 crc kubenswrapper[4810]: I0219 15:23:15.264522 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-4hddt"] Feb 19 15:23:15 crc kubenswrapper[4810]: I0219 15:23:15.453979 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="362cd55c-b576-44bd-843c-078bf26b3b1e" path="/var/lib/kubelet/pods/362cd55c-b576-44bd-843c-078bf26b3b1e/volumes" Feb 19 15:23:16 crc kubenswrapper[4810]: I0219 15:23:16.216396 4810 generic.go:334] "Generic (PLEG): container finished" podID="a861f8a3-be34-4fc0-96cb-42502d0a3bab" containerID="814a9f55e6d8349a2b1db9b86dbab3e606bdea70dc2cb74e2614b5f286c6442a" exitCode=0 Feb 19 15:23:16 crc kubenswrapper[4810]: I0219 15:23:16.216535 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb" event={"ID":"a861f8a3-be34-4fc0-96cb-42502d0a3bab","Type":"ContainerDied","Data":"814a9f55e6d8349a2b1db9b86dbab3e606bdea70dc2cb74e2614b5f286c6442a"} Feb 19 15:23:17 crc kubenswrapper[4810]: I0219 15:23:17.479263 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb" Feb 19 15:23:17 crc kubenswrapper[4810]: I0219 15:23:17.568452 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a861f8a3-be34-4fc0-96cb-42502d0a3bab-util\") pod \"a861f8a3-be34-4fc0-96cb-42502d0a3bab\" (UID: \"a861f8a3-be34-4fc0-96cb-42502d0a3bab\") " Feb 19 15:23:17 crc kubenswrapper[4810]: I0219 15:23:17.568545 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a861f8a3-be34-4fc0-96cb-42502d0a3bab-bundle\") pod \"a861f8a3-be34-4fc0-96cb-42502d0a3bab\" (UID: \"a861f8a3-be34-4fc0-96cb-42502d0a3bab\") " Feb 19 15:23:17 crc kubenswrapper[4810]: I0219 15:23:17.568616 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bpbt\" (UniqueName: \"kubernetes.io/projected/a861f8a3-be34-4fc0-96cb-42502d0a3bab-kube-api-access-6bpbt\") pod \"a861f8a3-be34-4fc0-96cb-42502d0a3bab\" (UID: \"a861f8a3-be34-4fc0-96cb-42502d0a3bab\") " Feb 19 15:23:17 crc kubenswrapper[4810]: I0219 15:23:17.570183 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a861f8a3-be34-4fc0-96cb-42502d0a3bab-bundle" (OuterVolumeSpecName: "bundle") pod "a861f8a3-be34-4fc0-96cb-42502d0a3bab" (UID: "a861f8a3-be34-4fc0-96cb-42502d0a3bab"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:23:17 crc kubenswrapper[4810]: I0219 15:23:17.575167 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a861f8a3-be34-4fc0-96cb-42502d0a3bab-kube-api-access-6bpbt" (OuterVolumeSpecName: "kube-api-access-6bpbt") pod "a861f8a3-be34-4fc0-96cb-42502d0a3bab" (UID: "a861f8a3-be34-4fc0-96cb-42502d0a3bab"). InnerVolumeSpecName "kube-api-access-6bpbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:23:17 crc kubenswrapper[4810]: I0219 15:23:17.600278 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a861f8a3-be34-4fc0-96cb-42502d0a3bab-util" (OuterVolumeSpecName: "util") pod "a861f8a3-be34-4fc0-96cb-42502d0a3bab" (UID: "a861f8a3-be34-4fc0-96cb-42502d0a3bab"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:23:17 crc kubenswrapper[4810]: I0219 15:23:17.670291 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bpbt\" (UniqueName: \"kubernetes.io/projected/a861f8a3-be34-4fc0-96cb-42502d0a3bab-kube-api-access-6bpbt\") on node \"crc\" DevicePath \"\"" Feb 19 15:23:17 crc kubenswrapper[4810]: I0219 15:23:17.670361 4810 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a861f8a3-be34-4fc0-96cb-42502d0a3bab-util\") on node \"crc\" DevicePath \"\"" Feb 19 15:23:17 crc kubenswrapper[4810]: I0219 15:23:17.670380 4810 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a861f8a3-be34-4fc0-96cb-42502d0a3bab-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:23:18 crc kubenswrapper[4810]: I0219 15:23:18.248473 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb" event={"ID":"a861f8a3-be34-4fc0-96cb-42502d0a3bab","Type":"ContainerDied","Data":"274e568a39b3ec5f5262ddb4db28a79855f75a688c00d404905429361e70fd1e"} Feb 19 15:23:18 crc kubenswrapper[4810]: I0219 15:23:18.248557 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="274e568a39b3ec5f5262ddb4db28a79855f75a688c00d404905429361e70fd1e" Feb 19 15:23:18 crc kubenswrapper[4810]: I0219 15:23:18.248729 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.449738 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-75f48c59dc-m5vm8"] Feb 19 15:23:28 crc kubenswrapper[4810]: E0219 15:23:28.450552 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a861f8a3-be34-4fc0-96cb-42502d0a3bab" containerName="extract" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.450569 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a861f8a3-be34-4fc0-96cb-42502d0a3bab" containerName="extract" Feb 19 15:23:28 crc kubenswrapper[4810]: E0219 15:23:28.450584 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a861f8a3-be34-4fc0-96cb-42502d0a3bab" containerName="pull" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.450591 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a861f8a3-be34-4fc0-96cb-42502d0a3bab" containerName="pull" Feb 19 15:23:28 crc kubenswrapper[4810]: E0219 15:23:28.450605 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="362cd55c-b576-44bd-843c-078bf26b3b1e" containerName="console" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.450613 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="362cd55c-b576-44bd-843c-078bf26b3b1e" containerName="console" Feb 19 15:23:28 crc kubenswrapper[4810]: E0219 15:23:28.450624 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a861f8a3-be34-4fc0-96cb-42502d0a3bab" containerName="util" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.450630 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a861f8a3-be34-4fc0-96cb-42502d0a3bab" containerName="util" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.450732 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="362cd55c-b576-44bd-843c-078bf26b3b1e" containerName="console" Feb 
19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.450745 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="a861f8a3-be34-4fc0-96cb-42502d0a3bab" containerName="extract" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.451154 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-75f48c59dc-m5vm8" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.453411 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.455807 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.455979 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-fp427" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.456058 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.456112 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.513687 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-75f48c59dc-m5vm8"] Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.535433 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f26047c7-b8cc-4ce2-8a48-4b380ab225c0-apiservice-cert\") pod \"metallb-operator-controller-manager-75f48c59dc-m5vm8\" (UID: \"f26047c7-b8cc-4ce2-8a48-4b380ab225c0\") " pod="metallb-system/metallb-operator-controller-manager-75f48c59dc-m5vm8" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.535541 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z875r\" (UniqueName: \"kubernetes.io/projected/f26047c7-b8cc-4ce2-8a48-4b380ab225c0-kube-api-access-z875r\") pod \"metallb-operator-controller-manager-75f48c59dc-m5vm8\" (UID: \"f26047c7-b8cc-4ce2-8a48-4b380ab225c0\") " pod="metallb-system/metallb-operator-controller-manager-75f48c59dc-m5vm8" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.535565 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f26047c7-b8cc-4ce2-8a48-4b380ab225c0-webhook-cert\") pod \"metallb-operator-controller-manager-75f48c59dc-m5vm8\" (UID: \"f26047c7-b8cc-4ce2-8a48-4b380ab225c0\") " pod="metallb-system/metallb-operator-controller-manager-75f48c59dc-m5vm8" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.636521 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z875r\" (UniqueName: \"kubernetes.io/projected/f26047c7-b8cc-4ce2-8a48-4b380ab225c0-kube-api-access-z875r\") pod \"metallb-operator-controller-manager-75f48c59dc-m5vm8\" (UID: \"f26047c7-b8cc-4ce2-8a48-4b380ab225c0\") " pod="metallb-system/metallb-operator-controller-manager-75f48c59dc-m5vm8" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.636576 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/f26047c7-b8cc-4ce2-8a48-4b380ab225c0-webhook-cert\") pod \"metallb-operator-controller-manager-75f48c59dc-m5vm8\" (UID: \"f26047c7-b8cc-4ce2-8a48-4b380ab225c0\") " pod="metallb-system/metallb-operator-controller-manager-75f48c59dc-m5vm8" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.636614 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f26047c7-b8cc-4ce2-8a48-4b380ab225c0-apiservice-cert\") pod \"metallb-operator-controller-manager-75f48c59dc-m5vm8\" (UID: \"f26047c7-b8cc-4ce2-8a48-4b380ab225c0\") " pod="metallb-system/metallb-operator-controller-manager-75f48c59dc-m5vm8" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.642955 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f26047c7-b8cc-4ce2-8a48-4b380ab225c0-apiservice-cert\") pod \"metallb-operator-controller-manager-75f48c59dc-m5vm8\" (UID: \"f26047c7-b8cc-4ce2-8a48-4b380ab225c0\") " pod="metallb-system/metallb-operator-controller-manager-75f48c59dc-m5vm8" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.652138 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f26047c7-b8cc-4ce2-8a48-4b380ab225c0-webhook-cert\") pod \"metallb-operator-controller-manager-75f48c59dc-m5vm8\" (UID: \"f26047c7-b8cc-4ce2-8a48-4b380ab225c0\") " pod="metallb-system/metallb-operator-controller-manager-75f48c59dc-m5vm8" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.675840 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z875r\" (UniqueName: \"kubernetes.io/projected/f26047c7-b8cc-4ce2-8a48-4b380ab225c0-kube-api-access-z875r\") pod \"metallb-operator-controller-manager-75f48c59dc-m5vm8\" (UID: \"f26047c7-b8cc-4ce2-8a48-4b380ab225c0\") " pod="metallb-system/metallb-operator-controller-manager-75f48c59dc-m5vm8" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.768069 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-75f48c59dc-m5vm8" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.794667 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-595d5f7545-vfb4c"] Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.797049 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-595d5f7545-vfb4c" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.800522 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-gt529" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.800827 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.805101 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.812690 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-595d5f7545-vfb4c"] Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.838894 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3d62866f-b047-419d-8eb0-848b0df84e63-apiservice-cert\") pod \"metallb-operator-webhook-server-595d5f7545-vfb4c\" (UID: \"3d62866f-b047-419d-8eb0-848b0df84e63\") " pod="metallb-system/metallb-operator-webhook-server-595d5f7545-vfb4c" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.838943 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt545\" (UniqueName: \"kubernetes.io/projected/3d62866f-b047-419d-8eb0-848b0df84e63-kube-api-access-xt545\") pod \"metallb-operator-webhook-server-595d5f7545-vfb4c\" (UID: \"3d62866f-b047-419d-8eb0-848b0df84e63\") " pod="metallb-system/metallb-operator-webhook-server-595d5f7545-vfb4c" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.838999 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3d62866f-b047-419d-8eb0-848b0df84e63-webhook-cert\") pod \"metallb-operator-webhook-server-595d5f7545-vfb4c\" (UID: \"3d62866f-b047-419d-8eb0-848b0df84e63\") " pod="metallb-system/metallb-operator-webhook-server-595d5f7545-vfb4c" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.940817 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3d62866f-b047-419d-8eb0-848b0df84e63-apiservice-cert\") pod \"metallb-operator-webhook-server-595d5f7545-vfb4c\" (UID: \"3d62866f-b047-419d-8eb0-848b0df84e63\") " pod="metallb-system/metallb-operator-webhook-server-595d5f7545-vfb4c" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.940860 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt545\" (UniqueName: \"kubernetes.io/projected/3d62866f-b047-419d-8eb0-848b0df84e63-kube-api-access-xt545\") pod \"metallb-operator-webhook-server-595d5f7545-vfb4c\" (UID: \"3d62866f-b047-419d-8eb0-848b0df84e63\") " pod="metallb-system/metallb-operator-webhook-server-595d5f7545-vfb4c" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.940914 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3d62866f-b047-419d-8eb0-848b0df84e63-webhook-cert\") pod \"metallb-operator-webhook-server-595d5f7545-vfb4c\" (UID: \"3d62866f-b047-419d-8eb0-848b0df84e63\") " pod="metallb-system/metallb-operator-webhook-server-595d5f7545-vfb4c" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 
15:23:28.957675 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3d62866f-b047-419d-8eb0-848b0df84e63-webhook-cert\") pod \"metallb-operator-webhook-server-595d5f7545-vfb4c\" (UID: \"3d62866f-b047-419d-8eb0-848b0df84e63\") " pod="metallb-system/metallb-operator-webhook-server-595d5f7545-vfb4c" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.961000 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt545\" (UniqueName: \"kubernetes.io/projected/3d62866f-b047-419d-8eb0-848b0df84e63-kube-api-access-xt545\") pod \"metallb-operator-webhook-server-595d5f7545-vfb4c\" (UID: \"3d62866f-b047-419d-8eb0-848b0df84e63\") " pod="metallb-system/metallb-operator-webhook-server-595d5f7545-vfb4c" Feb 19 15:23:28 crc kubenswrapper[4810]: I0219 15:23:28.962916 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3d62866f-b047-419d-8eb0-848b0df84e63-apiservice-cert\") pod \"metallb-operator-webhook-server-595d5f7545-vfb4c\" (UID: \"3d62866f-b047-419d-8eb0-848b0df84e63\") " pod="metallb-system/metallb-operator-webhook-server-595d5f7545-vfb4c" Feb 19 15:23:29 crc kubenswrapper[4810]: I0219 15:23:29.084785 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-75f48c59dc-m5vm8"] Feb 19 15:23:29 crc kubenswrapper[4810]: I0219 15:23:29.147888 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-595d5f7545-vfb4c" Feb 19 15:23:29 crc kubenswrapper[4810]: I0219 15:23:29.323508 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-75f48c59dc-m5vm8" event={"ID":"f26047c7-b8cc-4ce2-8a48-4b380ab225c0","Type":"ContainerStarted","Data":"5dd9f3097919a706ca8eb8ca45cd8e19a5d942ce0b5dadcdec6bc31b84aebcc0"} Feb 19 15:23:29 crc kubenswrapper[4810]: I0219 15:23:29.355501 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-595d5f7545-vfb4c"] Feb 19 15:23:30 crc kubenswrapper[4810]: I0219 15:23:30.331313 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-595d5f7545-vfb4c" event={"ID":"3d62866f-b047-419d-8eb0-848b0df84e63","Type":"ContainerStarted","Data":"131c744b5c6df9c014ae6acbb2107c63839e5f4efb0421c5ef2239d7606c84be"} Feb 19 15:23:34 crc kubenswrapper[4810]: I0219 15:23:34.359882 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-75f48c59dc-m5vm8" event={"ID":"f26047c7-b8cc-4ce2-8a48-4b380ab225c0","Type":"ContainerStarted","Data":"810b857650a6a1110481e1e3a731ff3c0ad5950ad9aea829f16d3edcc1491599"} Feb 19 15:23:34 crc kubenswrapper[4810]: I0219 15:23:34.360488 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-75f48c59dc-m5vm8" Feb 19 15:23:34 crc kubenswrapper[4810]: I0219 15:23:34.361722 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-595d5f7545-vfb4c" event={"ID":"3d62866f-b047-419d-8eb0-848b0df84e63","Type":"ContainerStarted","Data":"f54462b9ff2ee7096d94d8c2b86235acd7f13dd74d2f35042e868d77f5ec3490"} Feb 19 15:23:34 crc kubenswrapper[4810]: I0219 15:23:34.361891 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-webhook-server-595d5f7545-vfb4c" Feb 19 15:23:34 crc kubenswrapper[4810]: I0219 15:23:34.383743 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-75f48c59dc-m5vm8" podStartSLOduration=1.4508706980000001 podStartE2EDuration="6.383721085s" podCreationTimestamp="2026-02-19 15:23:28 +0000 UTC" firstStartedPulling="2026-02-19 15:23:29.092746808 +0000 UTC m=+838.574776932" lastFinishedPulling="2026-02-19 15:23:34.025597185 +0000 UTC m=+843.507627319" observedRunningTime="2026-02-19 15:23:34.376684711 +0000 UTC m=+843.858714835" watchObservedRunningTime="2026-02-19 15:23:34.383721085 +0000 UTC m=+843.865751209" Feb 19 15:23:34 crc kubenswrapper[4810]: I0219 15:23:34.404871 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-595d5f7545-vfb4c" podStartSLOduration=1.722910632 podStartE2EDuration="6.404834696s" podCreationTimestamp="2026-02-19 15:23:28 +0000 UTC" firstStartedPulling="2026-02-19 15:23:29.362877635 +0000 UTC m=+838.844907759" lastFinishedPulling="2026-02-19 15:23:34.044801689 +0000 UTC m=+843.526831823" observedRunningTime="2026-02-19 15:23:34.39770452 +0000 UTC m=+843.879734664" watchObservedRunningTime="2026-02-19 15:23:34.404834696 +0000 UTC m=+843.886864820" Feb 19 15:23:49 crc kubenswrapper[4810]: I0219 15:23:49.152783 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-595d5f7545-vfb4c" Feb 19 15:24:08 crc kubenswrapper[4810]: I0219 15:24:08.773374 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-75f48c59dc-m5vm8" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.536797 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-7rbxk"] Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.552098 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-cwj24"] Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.552854 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-cwj24" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.553416 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-7rbxk" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.558728 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.558907 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-5d75x" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.559035 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.561541 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.569404 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-cwj24"] Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.643841 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-hllgd"] Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.644954 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-hllgd" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.650025 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.650642 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.650850 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-hzx2c" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.652680 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.667209 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-jngcz"] Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.668529 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-jngcz" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.670552 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.698506 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-jngcz"] Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.705435 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8csk\" (UniqueName: \"kubernetes.io/projected/1ee9f8f3-05a8-4648-b48d-4975285346d7-kube-api-access-v8csk\") pod \"frr-k8s-webhook-server-78b44bf5bb-cwj24\" (UID: \"1ee9f8f3-05a8-4648-b48d-4975285346d7\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-cwj24" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.705521 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/66c7e596-ffa3-4687-8c80-21acecbd8075-frr-conf\") pod \"frr-k8s-7rbxk\" (UID: \"66c7e596-ffa3-4687-8c80-21acecbd8075\") " pod="metallb-system/frr-k8s-7rbxk" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.705552 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/66c7e596-ffa3-4687-8c80-21acecbd8075-frr-startup\") pod \"frr-k8s-7rbxk\" (UID: \"66c7e596-ffa3-4687-8c80-21acecbd8075\") " pod="metallb-system/frr-k8s-7rbxk" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.705572 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/66c7e596-ffa3-4687-8c80-21acecbd8075-reloader\") pod \"frr-k8s-7rbxk\" (UID: \"66c7e596-ffa3-4687-8c80-21acecbd8075\") " pod="metallb-system/frr-k8s-7rbxk" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.705588 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/66c7e596-ffa3-4687-8c80-21acecbd8075-metrics\") pod \"frr-k8s-7rbxk\" (UID: \"66c7e596-ffa3-4687-8c80-21acecbd8075\") " pod="metallb-system/frr-k8s-7rbxk" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.705612 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5kk5\" (UniqueName: \"kubernetes.io/projected/66c7e596-ffa3-4687-8c80-21acecbd8075-kube-api-access-w5kk5\") pod \"frr-k8s-7rbxk\" (UID: \"66c7e596-ffa3-4687-8c80-21acecbd8075\") " pod="metallb-system/frr-k8s-7rbxk" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.705656 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/66c7e596-ffa3-4687-8c80-21acecbd8075-frr-sockets\") pod \"frr-k8s-7rbxk\" (UID: \"66c7e596-ffa3-4687-8c80-21acecbd8075\") " pod="metallb-system/frr-k8s-7rbxk" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.705679 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66c7e596-ffa3-4687-8c80-21acecbd8075-metrics-certs\") pod \"frr-k8s-7rbxk\" (UID: \"66c7e596-ffa3-4687-8c80-21acecbd8075\") " pod="metallb-system/frr-k8s-7rbxk" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 
15:24:09.705697 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ee9f8f3-05a8-4648-b48d-4975285346d7-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-cwj24\" (UID: \"1ee9f8f3-05a8-4648-b48d-4975285346d7\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-cwj24" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.807198 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/66c7e596-ffa3-4687-8c80-21acecbd8075-frr-sockets\") pod \"frr-k8s-7rbxk\" (UID: \"66c7e596-ffa3-4687-8c80-21acecbd8075\") " pod="metallb-system/frr-k8s-7rbxk" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.807264 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66c7e596-ffa3-4687-8c80-21acecbd8075-metrics-certs\") pod \"frr-k8s-7rbxk\" (UID: \"66c7e596-ffa3-4687-8c80-21acecbd8075\") " pod="metallb-system/frr-k8s-7rbxk" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.807296 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/781d467e-8522-43a3-a552-1ceebc40cddd-metrics-certs\") pod \"controller-69bbfbf88f-jngcz\" (UID: \"781d467e-8522-43a3-a552-1ceebc40cddd\") " pod="metallb-system/controller-69bbfbf88f-jngcz" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.807373 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ee9f8f3-05a8-4648-b48d-4975285346d7-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-cwj24\" (UID: \"1ee9f8f3-05a8-4648-b48d-4975285346d7\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-cwj24" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.807406 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8csk\" (UniqueName: \"kubernetes.io/projected/1ee9f8f3-05a8-4648-b48d-4975285346d7-kube-api-access-v8csk\") pod \"frr-k8s-webhook-server-78b44bf5bb-cwj24\" (UID: \"1ee9f8f3-05a8-4648-b48d-4975285346d7\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-cwj24" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.807436 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl7wn\" (UniqueName: \"kubernetes.io/projected/c9d97974-67d2-42e5-89fe-b6db106a47c4-kube-api-access-gl7wn\") pod \"speaker-hllgd\" (UID: \"c9d97974-67d2-42e5-89fe-b6db106a47c4\") " pod="metallb-system/speaker-hllgd" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.807467 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/66c7e596-ffa3-4687-8c80-21acecbd8075-frr-conf\") pod \"frr-k8s-7rbxk\" (UID: \"66c7e596-ffa3-4687-8c80-21acecbd8075\") " pod="metallb-system/frr-k8s-7rbxk" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.807490 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqkg5\" (UniqueName: \"kubernetes.io/projected/781d467e-8522-43a3-a552-1ceebc40cddd-kube-api-access-zqkg5\") pod \"controller-69bbfbf88f-jngcz\" (UID: \"781d467e-8522-43a3-a552-1ceebc40cddd\") " pod="metallb-system/controller-69bbfbf88f-jngcz" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.807513 4810 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/66c7e596-ffa3-4687-8c80-21acecbd8075-frr-startup\") pod \"frr-k8s-7rbxk\" (UID: \"66c7e596-ffa3-4687-8c80-21acecbd8075\") " pod="metallb-system/frr-k8s-7rbxk" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.807539 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/66c7e596-ffa3-4687-8c80-21acecbd8075-reloader\") pod \"frr-k8s-7rbxk\" (UID: \"66c7e596-ffa3-4687-8c80-21acecbd8075\") " pod="metallb-system/frr-k8s-7rbxk" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.807561 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c9d97974-67d2-42e5-89fe-b6db106a47c4-metallb-excludel2\") pod \"speaker-hllgd\" (UID: \"c9d97974-67d2-42e5-89fe-b6db106a47c4\") " pod="metallb-system/speaker-hllgd" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.807583 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/66c7e596-ffa3-4687-8c80-21acecbd8075-metrics\") pod \"frr-k8s-7rbxk\" (UID: \"66c7e596-ffa3-4687-8c80-21acecbd8075\") " pod="metallb-system/frr-k8s-7rbxk" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.807618 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5kk5\" (UniqueName: \"kubernetes.io/projected/66c7e596-ffa3-4687-8c80-21acecbd8075-kube-api-access-w5kk5\") pod \"frr-k8s-7rbxk\" (UID: \"66c7e596-ffa3-4687-8c80-21acecbd8075\") " pod="metallb-system/frr-k8s-7rbxk" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.807648 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/781d467e-8522-43a3-a552-1ceebc40cddd-cert\") pod \"controller-69bbfbf88f-jngcz\" (UID: \"781d467e-8522-43a3-a552-1ceebc40cddd\") " pod="metallb-system/controller-69bbfbf88f-jngcz" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.807673 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c9d97974-67d2-42e5-89fe-b6db106a47c4-memberlist\") pod \"speaker-hllgd\" (UID: \"c9d97974-67d2-42e5-89fe-b6db106a47c4\") " pod="metallb-system/speaker-hllgd" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.807697 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9d97974-67d2-42e5-89fe-b6db106a47c4-metrics-certs\") pod \"speaker-hllgd\" (UID: \"c9d97974-67d2-42e5-89fe-b6db106a47c4\") " pod="metallb-system/speaker-hllgd" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.808603 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/66c7e596-ffa3-4687-8c80-21acecbd8075-frr-sockets\") pod \"frr-k8s-7rbxk\" (UID: \"66c7e596-ffa3-4687-8c80-21acecbd8075\") " pod="metallb-system/frr-k8s-7rbxk" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.808662 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/66c7e596-ffa3-4687-8c80-21acecbd8075-metrics\") pod \"frr-k8s-7rbxk\" (UID: \"66c7e596-ffa3-4687-8c80-21acecbd8075\") " 
pod="metallb-system/frr-k8s-7rbxk" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.808753 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/66c7e596-ffa3-4687-8c80-21acecbd8075-frr-conf\") pod \"frr-k8s-7rbxk\" (UID: \"66c7e596-ffa3-4687-8c80-21acecbd8075\") " pod="metallb-system/frr-k8s-7rbxk" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.808788 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/66c7e596-ffa3-4687-8c80-21acecbd8075-reloader\") pod \"frr-k8s-7rbxk\" (UID: \"66c7e596-ffa3-4687-8c80-21acecbd8075\") " pod="metallb-system/frr-k8s-7rbxk" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.809133 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/66c7e596-ffa3-4687-8c80-21acecbd8075-frr-startup\") pod \"frr-k8s-7rbxk\" (UID: \"66c7e596-ffa3-4687-8c80-21acecbd8075\") " pod="metallb-system/frr-k8s-7rbxk" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.817055 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ee9f8f3-05a8-4648-b48d-4975285346d7-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-cwj24\" (UID: \"1ee9f8f3-05a8-4648-b48d-4975285346d7\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-cwj24" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.829852 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66c7e596-ffa3-4687-8c80-21acecbd8075-metrics-certs\") pod \"frr-k8s-7rbxk\" (UID: \"66c7e596-ffa3-4687-8c80-21acecbd8075\") " pod="metallb-system/frr-k8s-7rbxk" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.832465 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5kk5\" (UniqueName: \"kubernetes.io/projected/66c7e596-ffa3-4687-8c80-21acecbd8075-kube-api-access-w5kk5\") pod \"frr-k8s-7rbxk\" (UID: \"66c7e596-ffa3-4687-8c80-21acecbd8075\") " pod="metallb-system/frr-k8s-7rbxk" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.834004 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8csk\" (UniqueName: \"kubernetes.io/projected/1ee9f8f3-05a8-4648-b48d-4975285346d7-kube-api-access-v8csk\") pod \"frr-k8s-webhook-server-78b44bf5bb-cwj24\" (UID: \"1ee9f8f3-05a8-4648-b48d-4975285346d7\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-cwj24" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.882751 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-cwj24" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.894948 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-7rbxk" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.913355 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/781d467e-8522-43a3-a552-1ceebc40cddd-metrics-certs\") pod \"controller-69bbfbf88f-jngcz\" (UID: \"781d467e-8522-43a3-a552-1ceebc40cddd\") " pod="metallb-system/controller-69bbfbf88f-jngcz" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.913405 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl7wn\" (UniqueName: \"kubernetes.io/projected/c9d97974-67d2-42e5-89fe-b6db106a47c4-kube-api-access-gl7wn\") pod \"speaker-hllgd\" (UID: \"c9d97974-67d2-42e5-89fe-b6db106a47c4\") " pod="metallb-system/speaker-hllgd" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.913437 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqkg5\" (UniqueName: \"kubernetes.io/projected/781d467e-8522-43a3-a552-1ceebc40cddd-kube-api-access-zqkg5\") pod \"controller-69bbfbf88f-jngcz\" (UID: \"781d467e-8522-43a3-a552-1ceebc40cddd\") " pod="metallb-system/controller-69bbfbf88f-jngcz" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.913471 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c9d97974-67d2-42e5-89fe-b6db106a47c4-metallb-excludel2\") pod \"speaker-hllgd\" (UID: \"c9d97974-67d2-42e5-89fe-b6db106a47c4\") " pod="metallb-system/speaker-hllgd" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.913511 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/781d467e-8522-43a3-a552-1ceebc40cddd-cert\") pod \"controller-69bbfbf88f-jngcz\" (UID: \"781d467e-8522-43a3-a552-1ceebc40cddd\") " pod="metallb-system/controller-69bbfbf88f-jngcz" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.913531 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c9d97974-67d2-42e5-89fe-b6db106a47c4-memberlist\") pod \"speaker-hllgd\" (UID: \"c9d97974-67d2-42e5-89fe-b6db106a47c4\") " pod="metallb-system/speaker-hllgd" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.913550 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9d97974-67d2-42e5-89fe-b6db106a47c4-metrics-certs\") pod \"speaker-hllgd\" (UID: \"c9d97974-67d2-42e5-89fe-b6db106a47c4\") " pod="metallb-system/speaker-hllgd" Feb 19 15:24:09 crc kubenswrapper[4810]: E0219 15:24:09.913947 4810 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 19 15:24:09 crc kubenswrapper[4810]: E0219 15:24:09.914031 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9d97974-67d2-42e5-89fe-b6db106a47c4-memberlist podName:c9d97974-67d2-42e5-89fe-b6db106a47c4 nodeName:}" failed. No retries permitted until 2026-02-19 15:24:10.414008828 +0000 UTC m=+879.896038952 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c9d97974-67d2-42e5-89fe-b6db106a47c4-memberlist") pod "speaker-hllgd" (UID: "c9d97974-67d2-42e5-89fe-b6db106a47c4") : secret "metallb-memberlist" not found Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.917209 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9d97974-67d2-42e5-89fe-b6db106a47c4-metrics-certs\") pod \"speaker-hllgd\" (UID: \"c9d97974-67d2-42e5-89fe-b6db106a47c4\") " pod="metallb-system/speaker-hllgd" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.919191 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/781d467e-8522-43a3-a552-1ceebc40cddd-metrics-certs\") pod \"controller-69bbfbf88f-jngcz\" (UID: \"781d467e-8522-43a3-a552-1ceebc40cddd\") " pod="metallb-system/controller-69bbfbf88f-jngcz" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.920055 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.922120 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c9d97974-67d2-42e5-89fe-b6db106a47c4-metallb-excludel2\") pod \"speaker-hllgd\" (UID: \"c9d97974-67d2-42e5-89fe-b6db106a47c4\") " pod="metallb-system/speaker-hllgd" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.926923 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/781d467e-8522-43a3-a552-1ceebc40cddd-cert\") pod \"controller-69bbfbf88f-jngcz\" (UID: \"781d467e-8522-43a3-a552-1ceebc40cddd\") " pod="metallb-system/controller-69bbfbf88f-jngcz" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.931674 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl7wn\" (UniqueName: \"kubernetes.io/projected/c9d97974-67d2-42e5-89fe-b6db106a47c4-kube-api-access-gl7wn\") pod \"speaker-hllgd\" (UID: \"c9d97974-67d2-42e5-89fe-b6db106a47c4\") " pod="metallb-system/speaker-hllgd" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.932159 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqkg5\" (UniqueName: \"kubernetes.io/projected/781d467e-8522-43a3-a552-1ceebc40cddd-kube-api-access-zqkg5\") pod \"controller-69bbfbf88f-jngcz\" (UID: \"781d467e-8522-43a3-a552-1ceebc40cddd\") " pod="metallb-system/controller-69bbfbf88f-jngcz" Feb 19 15:24:09 crc kubenswrapper[4810]: I0219 15:24:09.984443 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-jngcz" Feb 19 15:24:10 crc kubenswrapper[4810]: I0219 15:24:10.300485 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-cwj24"] Feb 19 15:24:10 crc kubenswrapper[4810]: W0219 15:24:10.309390 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ee9f8f3_05a8_4648_b48d_4975285346d7.slice/crio-274b78254a5e1abda79bd4f33f25dc39c7075df4aacc7058ceb43d19c42c0fac WatchSource:0}: Error finding container 274b78254a5e1abda79bd4f33f25dc39c7075df4aacc7058ceb43d19c42c0fac: Status 404 returned error can't find the container with id 274b78254a5e1abda79bd4f33f25dc39c7075df4aacc7058ceb43d19c42c0fac Feb 19 15:24:10 crc kubenswrapper[4810]: I0219 15:24:10.399577 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-jngcz"] Feb 19 15:24:10 crc kubenswrapper[4810]: W0219 15:24:10.407131 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod781d467e_8522_43a3_a552_1ceebc40cddd.slice/crio-37ae2f1281734abb3dd1fb3e9f5ecbe4af3939d71b62f2c26fff801e9564e799 WatchSource:0}: Error finding container 37ae2f1281734abb3dd1fb3e9f5ecbe4af3939d71b62f2c26fff801e9564e799: Status 404 returned error can't find the container with id 37ae2f1281734abb3dd1fb3e9f5ecbe4af3939d71b62f2c26fff801e9564e799 Feb 19 15:24:10 crc kubenswrapper[4810]: I0219 15:24:10.420561 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c9d97974-67d2-42e5-89fe-b6db106a47c4-memberlist\") pod \"speaker-hllgd\" (UID: \"c9d97974-67d2-42e5-89fe-b6db106a47c4\") " pod="metallb-system/speaker-hllgd" Feb 19 15:24:10 crc kubenswrapper[4810]: E0219 15:24:10.420737 4810 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 19 15:24:10 crc kubenswrapper[4810]: E0219 15:24:10.420811 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9d97974-67d2-42e5-89fe-b6db106a47c4-memberlist podName:c9d97974-67d2-42e5-89fe-b6db106a47c4 nodeName:}" failed. No retries permitted until 2026-02-19 15:24:11.420789898 +0000 UTC m=+880.902820022 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c9d97974-67d2-42e5-89fe-b6db106a47c4-memberlist") pod "speaker-hllgd" (UID: "c9d97974-67d2-42e5-89fe-b6db106a47c4") : secret "metallb-memberlist" not found Feb 19 15:24:10 crc kubenswrapper[4810]: I0219 15:24:10.609422 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-jngcz" event={"ID":"781d467e-8522-43a3-a552-1ceebc40cddd","Type":"ContainerStarted","Data":"9c2c5945aa86b126530081c5a05f63b9e950122cbc46898cd1ca1628bff224f8"} Feb 19 15:24:10 crc kubenswrapper[4810]: I0219 15:24:10.609964 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-jngcz" event={"ID":"781d467e-8522-43a3-a552-1ceebc40cddd","Type":"ContainerStarted","Data":"37ae2f1281734abb3dd1fb3e9f5ecbe4af3939d71b62f2c26fff801e9564e799"} Feb 19 15:24:10 crc kubenswrapper[4810]: I0219 15:24:10.610564 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-cwj24" event={"ID":"1ee9f8f3-05a8-4648-b48d-4975285346d7","Type":"ContainerStarted","Data":"274b78254a5e1abda79bd4f33f25dc39c7075df4aacc7058ceb43d19c42c0fac"} Feb 19 15:24:10 crc kubenswrapper[4810]: I0219 15:24:10.611621 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7rbxk" event={"ID":"66c7e596-ffa3-4687-8c80-21acecbd8075","Type":"ContainerStarted","Data":"8fa2c12028d5ed3f09c76f7e66de9e316bec80a75cb33de0fe7b0642d2605373"} Feb 19 15:24:11 crc kubenswrapper[4810]: I0219 15:24:11.436448 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c9d97974-67d2-42e5-89fe-b6db106a47c4-memberlist\") pod \"speaker-hllgd\" (UID: \"c9d97974-67d2-42e5-89fe-b6db106a47c4\") " pod="metallb-system/speaker-hllgd" Feb 19 15:24:11 crc kubenswrapper[4810]: I0219 15:24:11.456791 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c9d97974-67d2-42e5-89fe-b6db106a47c4-memberlist\") pod \"speaker-hllgd\" (UID: \"c9d97974-67d2-42e5-89fe-b6db106a47c4\") " pod="metallb-system/speaker-hllgd" Feb 19 15:24:11 crc kubenswrapper[4810]: I0219 15:24:11.462646 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-hllgd" Feb 19 15:24:11 crc kubenswrapper[4810]: W0219 15:24:11.531579 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9d97974_67d2_42e5_89fe_b6db106a47c4.slice/crio-eb3c0b5353d5abfc159f28110e22d7cba48bff35f08c3420e5afa93f8904566c WatchSource:0}: Error finding container eb3c0b5353d5abfc159f28110e22d7cba48bff35f08c3420e5afa93f8904566c: Status 404 returned error can't find the container with id eb3c0b5353d5abfc159f28110e22d7cba48bff35f08c3420e5afa93f8904566c Feb 19 15:24:11 crc kubenswrapper[4810]: I0219 15:24:11.619728 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-jngcz" event={"ID":"781d467e-8522-43a3-a552-1ceebc40cddd","Type":"ContainerStarted","Data":"14156917ac66334fd3bc1a6be64cb2045dd223ce40ab384af4688fef71fa71f8"} Feb 19 15:24:11 crc kubenswrapper[4810]: I0219 15:24:11.619987 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-jngcz" Feb 19 15:24:11 crc kubenswrapper[4810]: I0219 15:24:11.621290 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-hllgd" event={"ID":"c9d97974-67d2-42e5-89fe-b6db106a47c4","Type":"ContainerStarted","Data":"eb3c0b5353d5abfc159f28110e22d7cba48bff35f08c3420e5afa93f8904566c"} Feb 19 15:24:11 crc kubenswrapper[4810]: I0219 15:24:11.638639 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-jngcz" podStartSLOduration=2.638622121 podStartE2EDuration="2.638622121s" podCreationTimestamp="2026-02-19 15:24:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:24:11.636635992 +0000 UTC m=+881.118666116" watchObservedRunningTime="2026-02-19 15:24:11.638622121 +0000 UTC m=+881.120652245" Feb 19 15:24:12 crc kubenswrapper[4810]: I0219 15:24:12.631984 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-hllgd" event={"ID":"c9d97974-67d2-42e5-89fe-b6db106a47c4","Type":"ContainerStarted","Data":"134104236f48a53956d890285b237a5508919997d691fed7f558b3fd42bca024"} Feb 19 15:24:12 crc kubenswrapper[4810]: I0219 15:24:12.632040 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-hllgd" event={"ID":"c9d97974-67d2-42e5-89fe-b6db106a47c4","Type":"ContainerStarted","Data":"47a20248c1a45b8d96de33c2bc69ba29568be3e2e976fc1017e8757777318ad6"} Feb 19 15:24:12 crc kubenswrapper[4810]: I0219 15:24:12.632128 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-hllgd" Feb 19 15:24:12 crc kubenswrapper[4810]: I0219 15:24:12.657898 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-hllgd" podStartSLOduration=3.657880041 podStartE2EDuration="3.657880041s" podCreationTimestamp="2026-02-19 15:24:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:24:12.652342814 +0000 UTC m=+882.134372928" watchObservedRunningTime="2026-02-19 15:24:12.657880041 +0000 UTC m=+882.139910165" Feb 19 15:24:18 crc kubenswrapper[4810]: I0219 15:24:18.682820 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-cwj24" 
event={"ID":"1ee9f8f3-05a8-4648-b48d-4975285346d7","Type":"ContainerStarted","Data":"73cf8d14acab7446a45fab0ac5fbc8426dcfb7b622c6a2e7262335444e0db051"} Feb 19 15:24:18 crc kubenswrapper[4810]: I0219 15:24:18.683604 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-cwj24" Feb 19 15:24:18 crc kubenswrapper[4810]: I0219 15:24:18.685156 4810 generic.go:334] "Generic (PLEG): container finished" podID="66c7e596-ffa3-4687-8c80-21acecbd8075" containerID="cbc70f1e7e14eb1ce82644cc3a99ef44920da0cedb66b4a74d35e9510b2f0d1d" exitCode=0 Feb 19 15:24:18 crc kubenswrapper[4810]: I0219 15:24:18.685242 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7rbxk" event={"ID":"66c7e596-ffa3-4687-8c80-21acecbd8075","Type":"ContainerDied","Data":"cbc70f1e7e14eb1ce82644cc3a99ef44920da0cedb66b4a74d35e9510b2f0d1d"} Feb 19 15:24:18 crc kubenswrapper[4810]: I0219 15:24:18.707672 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-cwj24" podStartSLOduration=2.341561547 podStartE2EDuration="9.707642259s" podCreationTimestamp="2026-02-19 15:24:09 +0000 UTC" firstStartedPulling="2026-02-19 15:24:10.312348771 +0000 UTC m=+879.794378895" lastFinishedPulling="2026-02-19 15:24:17.678429453 +0000 UTC m=+887.160459607" observedRunningTime="2026-02-19 15:24:18.702021691 +0000 UTC m=+888.184051815" watchObservedRunningTime="2026-02-19 15:24:18.707642259 +0000 UTC m=+888.189672423" Feb 19 15:24:19 crc kubenswrapper[4810]: I0219 15:24:19.701694 4810 generic.go:334] "Generic (PLEG): container finished" podID="66c7e596-ffa3-4687-8c80-21acecbd8075" containerID="02578ae2ec427a212d8c044ddf2bfaa326b5d3dd70c9a09659d1aee617f1535f" exitCode=0 Feb 19 15:24:19 crc kubenswrapper[4810]: I0219 15:24:19.701809 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7rbxk" event={"ID":"66c7e596-ffa3-4687-8c80-21acecbd8075","Type":"ContainerDied","Data":"02578ae2ec427a212d8c044ddf2bfaa326b5d3dd70c9a09659d1aee617f1535f"} Feb 19 15:24:20 crc kubenswrapper[4810]: I0219 15:24:20.710396 4810 generic.go:334] "Generic (PLEG): container finished" podID="66c7e596-ffa3-4687-8c80-21acecbd8075" containerID="8e693cf93437c9c05e8bdc88a4a36c64be3f787f3c70969abb84108fa8d23c9f" exitCode=0 Feb 19 15:24:20 crc kubenswrapper[4810]: I0219 15:24:20.710468 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7rbxk" event={"ID":"66c7e596-ffa3-4687-8c80-21acecbd8075","Type":"ContainerDied","Data":"8e693cf93437c9c05e8bdc88a4a36c64be3f787f3c70969abb84108fa8d23c9f"} Feb 19 15:24:21 crc kubenswrapper[4810]: I0219 15:24:21.469438 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-hllgd" Feb 19 15:24:21 crc kubenswrapper[4810]: I0219 15:24:21.734411 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7rbxk" event={"ID":"66c7e596-ffa3-4687-8c80-21acecbd8075","Type":"ContainerStarted","Data":"8b7e47e70a6e8055d8f542504afd337e8aa4c60b72d221db875feb5537eeca43"} Feb 19 15:24:21 crc kubenswrapper[4810]: I0219 15:24:21.734522 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7rbxk" event={"ID":"66c7e596-ffa3-4687-8c80-21acecbd8075","Type":"ContainerStarted","Data":"f901ccd9397f453fcd92de419c2489db750603670aca79d61f797ca4456f1425"} Feb 19 15:24:21 crc kubenswrapper[4810]: I0219 15:24:21.734535 4810 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="metallb-system/frr-k8s-7rbxk" event={"ID":"66c7e596-ffa3-4687-8c80-21acecbd8075","Type":"ContainerStarted","Data":"a5242bdba01e77c89facdb5cf616d70b6ab21d75b87ff0c920dc2e5212136f48"} Feb 19 15:24:21 crc kubenswrapper[4810]: I0219 15:24:21.734546 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7rbxk" event={"ID":"66c7e596-ffa3-4687-8c80-21acecbd8075","Type":"ContainerStarted","Data":"5c507bc92e0aea14a09cbb57e68e961eaf45800b0d0ad346020542c85064b99d"} Feb 19 15:24:21 crc kubenswrapper[4810]: I0219 15:24:21.734558 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7rbxk" event={"ID":"66c7e596-ffa3-4687-8c80-21acecbd8075","Type":"ContainerStarted","Data":"ac46469a5393bc2843e4d6c82f3e6bc0722ade1cee9ec77f728700e026b8f654"} Feb 19 15:24:22 crc kubenswrapper[4810]: I0219 15:24:22.746955 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7rbxk" event={"ID":"66c7e596-ffa3-4687-8c80-21acecbd8075","Type":"ContainerStarted","Data":"d90869cb3c398041b0dfc6260521c79c55956aadb206f542e3a7393d75810e32"} Feb 19 15:24:22 crc kubenswrapper[4810]: I0219 15:24:22.747151 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-7rbxk" Feb 19 15:24:22 crc kubenswrapper[4810]: I0219 15:24:22.777074 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-7rbxk" podStartSLOduration=6.0981763 podStartE2EDuration="13.777043082s" podCreationTimestamp="2026-02-19 15:24:09 +0000 UTC" firstStartedPulling="2026-02-19 15:24:10.0339808 +0000 UTC m=+879.516010914" lastFinishedPulling="2026-02-19 15:24:17.712847572 +0000 UTC m=+887.194877696" observedRunningTime="2026-02-19 15:24:22.769974998 +0000 UTC m=+892.252005142" watchObservedRunningTime="2026-02-19 15:24:22.777043082 +0000 UTC m=+892.259073256" Feb 19 15:24:25 crc kubenswrapper[4810]: I0219 15:24:25.119586 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-7rbxk" Feb 19 15:24:25 crc kubenswrapper[4810]: I0219 15:24:25.176513 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-7rbxk" Feb 19 15:24:25 crc kubenswrapper[4810]: I0219 15:24:25.643420 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-sk8xn"] Feb 19 15:24:25 crc kubenswrapper[4810]: I0219 15:24:25.644456 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-sk8xn" Feb 19 15:24:25 crc kubenswrapper[4810]: I0219 15:24:25.647719 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-k54t5" Feb 19 15:24:25 crc kubenswrapper[4810]: I0219 15:24:25.653031 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 19 15:24:25 crc kubenswrapper[4810]: I0219 15:24:25.654124 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 19 15:24:25 crc kubenswrapper[4810]: I0219 15:24:25.665125 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-sk8xn"] Feb 19 15:24:25 crc kubenswrapper[4810]: I0219 15:24:25.726308 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtc74\" (UniqueName: \"kubernetes.io/projected/41d0bc43-e85a-4f9f-afd4-084e6f44e4ce-kube-api-access-mtc74\") pod \"openstack-operator-index-sk8xn\" (UID: \"41d0bc43-e85a-4f9f-afd4-084e6f44e4ce\") " pod="openstack-operators/openstack-operator-index-sk8xn" Feb 19 15:24:25 crc kubenswrapper[4810]: I0219 15:24:25.827641 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtc74\" (UniqueName: \"kubernetes.io/projected/41d0bc43-e85a-4f9f-afd4-084e6f44e4ce-kube-api-access-mtc74\") pod \"openstack-operator-index-sk8xn\" (UID: \"41d0bc43-e85a-4f9f-afd4-084e6f44e4ce\") " pod="openstack-operators/openstack-operator-index-sk8xn" Feb 19 15:24:25 crc kubenswrapper[4810]: I0219 15:24:25.862216 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtc74\" (UniqueName: \"kubernetes.io/projected/41d0bc43-e85a-4f9f-afd4-084e6f44e4ce-kube-api-access-mtc74\") pod \"openstack-operator-index-sk8xn\" (UID: \"41d0bc43-e85a-4f9f-afd4-084e6f44e4ce\") " pod="openstack-operators/openstack-operator-index-sk8xn" Feb 19 15:24:25 crc kubenswrapper[4810]: I0219 15:24:25.964211 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-sk8xn" Feb 19 15:24:26 crc kubenswrapper[4810]: I0219 15:24:26.397673 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-sk8xn"] Feb 19 15:24:27 crc kubenswrapper[4810]: I0219 15:24:27.153556 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-sk8xn" event={"ID":"41d0bc43-e85a-4f9f-afd4-084e6f44e4ce","Type":"ContainerStarted","Data":"2b0a57286b9b2353d026057f88088fed2faf2067e31ca739bd4323db681d6a21"} Feb 19 15:24:29 crc kubenswrapper[4810]: I0219 15:24:29.018871 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-sk8xn"] Feb 19 15:24:29 crc kubenswrapper[4810]: I0219 15:24:29.180164 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-sk8xn" event={"ID":"41d0bc43-e85a-4f9f-afd4-084e6f44e4ce","Type":"ContainerStarted","Data":"aa0ac433c527ef4ec3ac933184eb8fd41d99a848cd1d1d205b4fdfd2a61acede"} Feb 19 15:24:29 crc kubenswrapper[4810]: I0219 15:24:29.206818 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-sk8xn" podStartSLOduration=1.881144001 podStartE2EDuration="4.20678453s" podCreationTimestamp="2026-02-19 15:24:25 +0000 UTC" firstStartedPulling="2026-02-19 15:24:26.411755884 +0000 UTC m=+895.893786048" lastFinishedPulling="2026-02-19 15:24:28.737396453 +0000 UTC m=+898.219426577" observedRunningTime="2026-02-19 15:24:29.201812297 +0000 UTC m=+898.683842451" watchObservedRunningTime="2026-02-19 15:24:29.20678453 +0000 UTC m=+898.688814694" Feb 19 15:24:29 crc kubenswrapper[4810]: I0219 15:24:29.629700 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-gkft8"] Feb 19 15:24:29 crc kubenswrapper[4810]: I0219 15:24:29.631238 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-gkft8" Feb 19 15:24:29 crc kubenswrapper[4810]: I0219 15:24:29.650899 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-gkft8"] Feb 19 15:24:29 crc kubenswrapper[4810]: I0219 15:24:29.700257 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkjvv\" (UniqueName: \"kubernetes.io/projected/09f49ae7-b6fb-4ca5-9238-8bcf8d15ea95-kube-api-access-hkjvv\") pod \"openstack-operator-index-gkft8\" (UID: \"09f49ae7-b6fb-4ca5-9238-8bcf8d15ea95\") " pod="openstack-operators/openstack-operator-index-gkft8" Feb 19 15:24:29 crc kubenswrapper[4810]: I0219 15:24:29.801631 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkjvv\" (UniqueName: \"kubernetes.io/projected/09f49ae7-b6fb-4ca5-9238-8bcf8d15ea95-kube-api-access-hkjvv\") pod \"openstack-operator-index-gkft8\" (UID: \"09f49ae7-b6fb-4ca5-9238-8bcf8d15ea95\") " pod="openstack-operators/openstack-operator-index-gkft8" Feb 19 15:24:29 crc kubenswrapper[4810]: I0219 15:24:29.828160 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkjvv\" (UniqueName: \"kubernetes.io/projected/09f49ae7-b6fb-4ca5-9238-8bcf8d15ea95-kube-api-access-hkjvv\") pod \"openstack-operator-index-gkft8\" (UID: \"09f49ae7-b6fb-4ca5-9238-8bcf8d15ea95\") " pod="openstack-operators/openstack-operator-index-gkft8" Feb 19 15:24:29 crc kubenswrapper[4810]: I0219 15:24:29.889909 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-cwj24" Feb 19 15:24:29 crc kubenswrapper[4810]: I0219 15:24:29.958206 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-gkft8" Feb 19 15:24:29 crc kubenswrapper[4810]: I0219 15:24:29.987390 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-jngcz" Feb 19 15:24:30 crc kubenswrapper[4810]: I0219 15:24:30.187043 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-sk8xn" podUID="41d0bc43-e85a-4f9f-afd4-084e6f44e4ce" containerName="registry-server" containerID="cri-o://aa0ac433c527ef4ec3ac933184eb8fd41d99a848cd1d1d205b4fdfd2a61acede" gracePeriod=2 Feb 19 15:24:30 crc kubenswrapper[4810]: I0219 15:24:30.256932 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-gkft8"] Feb 19 15:24:30 crc kubenswrapper[4810]: W0219 15:24:30.279248 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09f49ae7_b6fb_4ca5_9238_8bcf8d15ea95.slice/crio-5c060bf458cdb24d082d26fcac41d5603a9a93c63bfecf77323ee460a9708b0a WatchSource:0}: Error finding container 5c060bf458cdb24d082d26fcac41d5603a9a93c63bfecf77323ee460a9708b0a: Status 404 returned error can't find the container with id 5c060bf458cdb24d082d26fcac41d5603a9a93c63bfecf77323ee460a9708b0a Feb 19 15:24:30 crc kubenswrapper[4810]: I0219 15:24:30.508444 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-sk8xn" Feb 19 15:24:30 crc kubenswrapper[4810]: I0219 15:24:30.627846 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtc74\" (UniqueName: \"kubernetes.io/projected/41d0bc43-e85a-4f9f-afd4-084e6f44e4ce-kube-api-access-mtc74\") pod \"41d0bc43-e85a-4f9f-afd4-084e6f44e4ce\" (UID: \"41d0bc43-e85a-4f9f-afd4-084e6f44e4ce\") " Feb 19 15:24:30 crc kubenswrapper[4810]: I0219 15:24:30.634614 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41d0bc43-e85a-4f9f-afd4-084e6f44e4ce-kube-api-access-mtc74" (OuterVolumeSpecName: "kube-api-access-mtc74") pod "41d0bc43-e85a-4f9f-afd4-084e6f44e4ce" (UID: "41d0bc43-e85a-4f9f-afd4-084e6f44e4ce"). InnerVolumeSpecName "kube-api-access-mtc74". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:24:30 crc kubenswrapper[4810]: I0219 15:24:30.730055 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtc74\" (UniqueName: \"kubernetes.io/projected/41d0bc43-e85a-4f9f-afd4-084e6f44e4ce-kube-api-access-mtc74\") on node \"crc\" DevicePath \"\"" Feb 19 15:24:31 crc kubenswrapper[4810]: I0219 15:24:31.195603 4810 generic.go:334] "Generic (PLEG): container finished" podID="41d0bc43-e85a-4f9f-afd4-084e6f44e4ce" containerID="aa0ac433c527ef4ec3ac933184eb8fd41d99a848cd1d1d205b4fdfd2a61acede" exitCode=0 Feb 19 15:24:31 crc kubenswrapper[4810]: I0219 15:24:31.195713 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-sk8xn" event={"ID":"41d0bc43-e85a-4f9f-afd4-084e6f44e4ce","Type":"ContainerDied","Data":"aa0ac433c527ef4ec3ac933184eb8fd41d99a848cd1d1d205b4fdfd2a61acede"} Feb 19 15:24:31 crc kubenswrapper[4810]: I0219 15:24:31.195725 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-sk8xn" Feb 19 15:24:31 crc kubenswrapper[4810]: I0219 15:24:31.195756 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-sk8xn" event={"ID":"41d0bc43-e85a-4f9f-afd4-084e6f44e4ce","Type":"ContainerDied","Data":"2b0a57286b9b2353d026057f88088fed2faf2067e31ca739bd4323db681d6a21"} Feb 19 15:24:31 crc kubenswrapper[4810]: I0219 15:24:31.195784 4810 scope.go:117] "RemoveContainer" containerID="aa0ac433c527ef4ec3ac933184eb8fd41d99a848cd1d1d205b4fdfd2a61acede" Feb 19 15:24:31 crc kubenswrapper[4810]: I0219 15:24:31.198085 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gkft8" event={"ID":"09f49ae7-b6fb-4ca5-9238-8bcf8d15ea95","Type":"ContainerStarted","Data":"7113f462f11087b05663de419743450115d9f73ef4aed7e53e46e383fc3d8299"} Feb 19 15:24:31 crc kubenswrapper[4810]: I0219 15:24:31.198154 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gkft8" event={"ID":"09f49ae7-b6fb-4ca5-9238-8bcf8d15ea95","Type":"ContainerStarted","Data":"5c060bf458cdb24d082d26fcac41d5603a9a93c63bfecf77323ee460a9708b0a"} Feb 19 15:24:31 crc kubenswrapper[4810]: I0219 15:24:31.215982 4810 scope.go:117] "RemoveContainer" containerID="aa0ac433c527ef4ec3ac933184eb8fd41d99a848cd1d1d205b4fdfd2a61acede" Feb 19 15:24:31 crc kubenswrapper[4810]: E0219 15:24:31.216603 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa0ac433c527ef4ec3ac933184eb8fd41d99a848cd1d1d205b4fdfd2a61acede\": container with ID starting with aa0ac433c527ef4ec3ac933184eb8fd41d99a848cd1d1d205b4fdfd2a61acede not found: ID does not exist" containerID="aa0ac433c527ef4ec3ac933184eb8fd41d99a848cd1d1d205b4fdfd2a61acede" Feb 19 15:24:31 crc kubenswrapper[4810]: I0219 15:24:31.216645 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa0ac433c527ef4ec3ac933184eb8fd41d99a848cd1d1d205b4fdfd2a61acede"} err="failed to get container status \"aa0ac433c527ef4ec3ac933184eb8fd41d99a848cd1d1d205b4fdfd2a61acede\": rpc error: code = NotFound desc = could not find container \"aa0ac433c527ef4ec3ac933184eb8fd41d99a848cd1d1d205b4fdfd2a61acede\": container with ID starting with aa0ac433c527ef4ec3ac933184eb8fd41d99a848cd1d1d205b4fdfd2a61acede not found: ID does not exist" Feb 19 15:24:31 crc kubenswrapper[4810]: I0219 15:24:31.231103 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-gkft8" podStartSLOduration=2.189969664 podStartE2EDuration="2.231080289s" podCreationTimestamp="2026-02-19 15:24:29 +0000 UTC" firstStartedPulling="2026-02-19 15:24:30.284131533 +0000 UTC m=+899.766161677" lastFinishedPulling="2026-02-19 15:24:30.325242188 +0000 UTC m=+899.807272302" observedRunningTime="2026-02-19 15:24:31.22990695 +0000 UTC m=+900.711937094" watchObservedRunningTime="2026-02-19 15:24:31.231080289 +0000 UTC m=+900.713110423" Feb 19 15:24:31 crc kubenswrapper[4810]: I0219 15:24:31.254459 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-sk8xn"] Feb 19 15:24:31 crc kubenswrapper[4810]: I0219 15:24:31.260878 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-sk8xn"] Feb 19 15:24:31 crc kubenswrapper[4810]: I0219 15:24:31.463432 4810 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="41d0bc43-e85a-4f9f-afd4-084e6f44e4ce" path="/var/lib/kubelet/pods/41d0bc43-e85a-4f9f-afd4-084e6f44e4ce/volumes" Feb 19 15:24:39 crc kubenswrapper[4810]: I0219 15:24:39.902981 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-7rbxk" Feb 19 15:24:39 crc kubenswrapper[4810]: I0219 15:24:39.958934 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-gkft8" Feb 19 15:24:39 crc kubenswrapper[4810]: I0219 15:24:39.958992 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-gkft8" Feb 19 15:24:40 crc kubenswrapper[4810]: I0219 15:24:40.004506 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-gkft8" Feb 19 15:24:40 crc kubenswrapper[4810]: I0219 15:24:40.294239 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-gkft8" Feb 19 15:24:45 crc kubenswrapper[4810]: I0219 15:24:45.883005 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4"] Feb 19 15:24:45 crc kubenswrapper[4810]: E0219 15:24:45.883594 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41d0bc43-e85a-4f9f-afd4-084e6f44e4ce" containerName="registry-server" Feb 19 15:24:45 crc kubenswrapper[4810]: I0219 15:24:45.883610 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d0bc43-e85a-4f9f-afd4-084e6f44e4ce" containerName="registry-server" Feb 19 15:24:45 crc kubenswrapper[4810]: I0219 15:24:45.883745 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="41d0bc43-e85a-4f9f-afd4-084e6f44e4ce" containerName="registry-server" Feb 19 15:24:45 crc kubenswrapper[4810]: I0219 15:24:45.884732 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4" Feb 19 15:24:45 crc kubenswrapper[4810]: I0219 15:24:45.886850 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-6rxxb" Feb 19 15:24:45 crc kubenswrapper[4810]: I0219 15:24:45.901142 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4"] Feb 19 15:24:46 crc kubenswrapper[4810]: I0219 15:24:46.047013 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqwln\" (UniqueName: \"kubernetes.io/projected/124e176a-b011-4a5c-8e7c-ca027d881aea-kube-api-access-dqwln\") pod \"ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4\" (UID: \"124e176a-b011-4a5c-8e7c-ca027d881aea\") " pod="openstack-operators/ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4" Feb 19 15:24:46 crc kubenswrapper[4810]: I0219 15:24:46.047413 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/124e176a-b011-4a5c-8e7c-ca027d881aea-util\") pod \"ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4\" (UID: \"124e176a-b011-4a5c-8e7c-ca027d881aea\") " pod="openstack-operators/ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4" Feb 19 15:24:46 crc kubenswrapper[4810]: I0219 15:24:46.047451 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/124e176a-b011-4a5c-8e7c-ca027d881aea-bundle\") pod \"ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4\" (UID: \"124e176a-b011-4a5c-8e7c-ca027d881aea\") " pod="openstack-operators/ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4" Feb 19 15:24:46 crc kubenswrapper[4810]: I0219 15:24:46.148353 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqwln\" (UniqueName: \"kubernetes.io/projected/124e176a-b011-4a5c-8e7c-ca027d881aea-kube-api-access-dqwln\") pod \"ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4\" (UID: \"124e176a-b011-4a5c-8e7c-ca027d881aea\") " pod="openstack-operators/ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4" Feb 19 15:24:46 crc kubenswrapper[4810]: I0219 15:24:46.148426 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/124e176a-b011-4a5c-8e7c-ca027d881aea-util\") pod \"ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4\" (UID: \"124e176a-b011-4a5c-8e7c-ca027d881aea\") " pod="openstack-operators/ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4" Feb 19 15:24:46 crc kubenswrapper[4810]: I0219 15:24:46.148456 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/124e176a-b011-4a5c-8e7c-ca027d881aea-bundle\") pod \"ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4\" (UID: \"124e176a-b011-4a5c-8e7c-ca027d881aea\") " pod="openstack-operators/ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4" Feb 19 15:24:46 crc kubenswrapper[4810]: I0219 15:24:46.148880 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/124e176a-b011-4a5c-8e7c-ca027d881aea-util\") pod \"ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4\" (UID: \"124e176a-b011-4a5c-8e7c-ca027d881aea\") " pod="openstack-operators/ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4" Feb 19 15:24:46 crc kubenswrapper[4810]: I0219 15:24:46.148916 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/124e176a-b011-4a5c-8e7c-ca027d881aea-bundle\") pod \"ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4\" (UID: \"124e176a-b011-4a5c-8e7c-ca027d881aea\") " pod="openstack-operators/ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4" Feb 19 15:24:46 crc kubenswrapper[4810]: I0219 15:24:46.166881 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqwln\" (UniqueName: \"kubernetes.io/projected/124e176a-b011-4a5c-8e7c-ca027d881aea-kube-api-access-dqwln\") pod \"ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4\" (UID: \"124e176a-b011-4a5c-8e7c-ca027d881aea\") " pod="openstack-operators/ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4" Feb 19 15:24:46 crc kubenswrapper[4810]: I0219 15:24:46.213951 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4" Feb 19 15:24:46 crc kubenswrapper[4810]: I0219 15:24:46.446764 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4"] Feb 19 15:24:46 crc kubenswrapper[4810]: W0219 15:24:46.454006 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod124e176a_b011_4a5c_8e7c_ca027d881aea.slice/crio-23f2682d37978e25a3571906c49f5160415d929f9c63b597b382a08e82f5f189 WatchSource:0}: Error finding container 23f2682d37978e25a3571906c49f5160415d929f9c63b597b382a08e82f5f189: Status 404 returned error can't find the container with id 23f2682d37978e25a3571906c49f5160415d929f9c63b597b382a08e82f5f189 Feb 19 15:24:47 crc kubenswrapper[4810]: I0219 15:24:47.322200 4810 generic.go:334] "Generic (PLEG): container finished" podID="124e176a-b011-4a5c-8e7c-ca027d881aea" containerID="246a4fb665235cc06f742344c48f787e5a561910911c86d2f9e860ab2e45da7a" exitCode=0 Feb 19 15:24:47 crc kubenswrapper[4810]: I0219 15:24:47.322269 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4" event={"ID":"124e176a-b011-4a5c-8e7c-ca027d881aea","Type":"ContainerDied","Data":"246a4fb665235cc06f742344c48f787e5a561910911c86d2f9e860ab2e45da7a"} Feb 19 15:24:47 crc kubenswrapper[4810]: I0219 15:24:47.322372 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4" event={"ID":"124e176a-b011-4a5c-8e7c-ca027d881aea","Type":"ContainerStarted","Data":"23f2682d37978e25a3571906c49f5160415d929f9c63b597b382a08e82f5f189"} Feb 19 15:24:48 crc kubenswrapper[4810]: I0219 15:24:48.334371 4810 generic.go:334] "Generic (PLEG): container finished" podID="124e176a-b011-4a5c-8e7c-ca027d881aea" containerID="5d864715a4378a444f94d842fa65a2ecb1f7da4d456dcd2ab74d325ed9f5a637" exitCode=0 Feb 19 15:24:48 crc kubenswrapper[4810]: I0219 15:24:48.334484 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4" event={"ID":"124e176a-b011-4a5c-8e7c-ca027d881aea","Type":"ContainerDied","Data":"5d864715a4378a444f94d842fa65a2ecb1f7da4d456dcd2ab74d325ed9f5a637"} Feb 19 15:24:49 crc kubenswrapper[4810]: I0219 15:24:49.343622 4810 generic.go:334] "Generic (PLEG): container finished" podID="124e176a-b011-4a5c-8e7c-ca027d881aea" containerID="80d37b029c8542d0cb36ee35550147079788b1d39ccb5e077286ac29e2925fc2" exitCode=0 Feb 19 15:24:49 crc kubenswrapper[4810]: I0219 15:24:49.343697 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4" event={"ID":"124e176a-b011-4a5c-8e7c-ca027d881aea","Type":"ContainerDied","Data":"80d37b029c8542d0cb36ee35550147079788b1d39ccb5e077286ac29e2925fc2"} Feb 19 15:24:49 crc kubenswrapper[4810]: I0219 15:24:49.537496 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:24:49 crc kubenswrapper[4810]: I0219 15:24:49.537618 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:24:50 crc kubenswrapper[4810]: I0219 15:24:50.615945 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4" Feb 19 15:24:50 crc kubenswrapper[4810]: I0219 15:24:50.720077 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/124e176a-b011-4a5c-8e7c-ca027d881aea-util\") pod \"124e176a-b011-4a5c-8e7c-ca027d881aea\" (UID: \"124e176a-b011-4a5c-8e7c-ca027d881aea\") " Feb 19 15:24:50 crc kubenswrapper[4810]: I0219 15:24:50.720300 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/124e176a-b011-4a5c-8e7c-ca027d881aea-bundle\") pod \"124e176a-b011-4a5c-8e7c-ca027d881aea\" (UID: \"124e176a-b011-4a5c-8e7c-ca027d881aea\") " Feb 19 15:24:50 crc kubenswrapper[4810]: I0219 15:24:50.720429 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqwln\" (UniqueName: \"kubernetes.io/projected/124e176a-b011-4a5c-8e7c-ca027d881aea-kube-api-access-dqwln\") pod \"124e176a-b011-4a5c-8e7c-ca027d881aea\" (UID: \"124e176a-b011-4a5c-8e7c-ca027d881aea\") " Feb 19 15:24:50 crc kubenswrapper[4810]: I0219 15:24:50.722007 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/124e176a-b011-4a5c-8e7c-ca027d881aea-bundle" (OuterVolumeSpecName: "bundle") pod "124e176a-b011-4a5c-8e7c-ca027d881aea" (UID: "124e176a-b011-4a5c-8e7c-ca027d881aea"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:24:50 crc kubenswrapper[4810]: I0219 15:24:50.727154 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/124e176a-b011-4a5c-8e7c-ca027d881aea-kube-api-access-dqwln" (OuterVolumeSpecName: "kube-api-access-dqwln") pod "124e176a-b011-4a5c-8e7c-ca027d881aea" (UID: "124e176a-b011-4a5c-8e7c-ca027d881aea"). InnerVolumeSpecName "kube-api-access-dqwln". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:24:50 crc kubenswrapper[4810]: I0219 15:24:50.734398 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/124e176a-b011-4a5c-8e7c-ca027d881aea-util" (OuterVolumeSpecName: "util") pod "124e176a-b011-4a5c-8e7c-ca027d881aea" (UID: "124e176a-b011-4a5c-8e7c-ca027d881aea"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:24:50 crc kubenswrapper[4810]: I0219 15:24:50.822016 4810 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/124e176a-b011-4a5c-8e7c-ca027d881aea-util\") on node \"crc\" DevicePath \"\"" Feb 19 15:24:50 crc kubenswrapper[4810]: I0219 15:24:50.822055 4810 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/124e176a-b011-4a5c-8e7c-ca027d881aea-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:24:50 crc kubenswrapper[4810]: I0219 15:24:50.822068 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqwln\" (UniqueName: \"kubernetes.io/projected/124e176a-b011-4a5c-8e7c-ca027d881aea-kube-api-access-dqwln\") on node \"crc\" DevicePath \"\"" Feb 19 15:24:51 crc kubenswrapper[4810]: I0219 15:24:51.358863 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4" event={"ID":"124e176a-b011-4a5c-8e7c-ca027d881aea","Type":"ContainerDied","Data":"23f2682d37978e25a3571906c49f5160415d929f9c63b597b382a08e82f5f189"} Feb 19 15:24:51 crc kubenswrapper[4810]: I0219 15:24:51.358917 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23f2682d37978e25a3571906c49f5160415d929f9c63b597b382a08e82f5f189" Feb 19 15:24:51 crc kubenswrapper[4810]: I0219 15:24:51.359007 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4" Feb 19 15:24:54 crc kubenswrapper[4810]: I0219 15:24:54.101901 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t5pmn"] Feb 19 15:24:54 crc kubenswrapper[4810]: E0219 15:24:54.102545 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124e176a-b011-4a5c-8e7c-ca027d881aea" containerName="extract" Feb 19 15:24:54 crc kubenswrapper[4810]: I0219 15:24:54.102560 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="124e176a-b011-4a5c-8e7c-ca027d881aea" containerName="extract" Feb 19 15:24:54 crc kubenswrapper[4810]: E0219 15:24:54.102581 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124e176a-b011-4a5c-8e7c-ca027d881aea" containerName="util" Feb 19 15:24:54 crc kubenswrapper[4810]: I0219 15:24:54.102589 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="124e176a-b011-4a5c-8e7c-ca027d881aea" containerName="util" Feb 19 15:24:54 crc kubenswrapper[4810]: E0219 15:24:54.102602 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124e176a-b011-4a5c-8e7c-ca027d881aea" containerName="pull" Feb 19 15:24:54 crc kubenswrapper[4810]: I0219 15:24:54.102611 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="124e176a-b011-4a5c-8e7c-ca027d881aea" containerName="pull" Feb 19 15:24:54 crc kubenswrapper[4810]: I0219 15:24:54.102763 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="124e176a-b011-4a5c-8e7c-ca027d881aea" containerName="extract" Feb 19 15:24:54 crc kubenswrapper[4810]: I0219 15:24:54.103833 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t5pmn" Feb 19 15:24:54 crc kubenswrapper[4810]: I0219 15:24:54.132063 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t5pmn"] Feb 19 15:24:54 crc kubenswrapper[4810]: I0219 15:24:54.169915 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8f1642a-52bc-4509-848c-535f0c43fe54-catalog-content\") pod \"community-operators-t5pmn\" (UID: \"f8f1642a-52bc-4509-848c-535f0c43fe54\") " pod="openshift-marketplace/community-operators-t5pmn" Feb 19 15:24:54 crc kubenswrapper[4810]: I0219 15:24:54.169965 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg7nt\" (UniqueName: \"kubernetes.io/projected/f8f1642a-52bc-4509-848c-535f0c43fe54-kube-api-access-cg7nt\") pod \"community-operators-t5pmn\" (UID: \"f8f1642a-52bc-4509-848c-535f0c43fe54\") " pod="openshift-marketplace/community-operators-t5pmn" Feb 19 15:24:54 crc kubenswrapper[4810]: I0219 15:24:54.169982 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8f1642a-52bc-4509-848c-535f0c43fe54-utilities\") pod \"community-operators-t5pmn\" (UID: \"f8f1642a-52bc-4509-848c-535f0c43fe54\") " pod="openshift-marketplace/community-operators-t5pmn" Feb 19 15:24:54 crc kubenswrapper[4810]: I0219 15:24:54.271293 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8f1642a-52bc-4509-848c-535f0c43fe54-catalog-content\") pod \"community-operators-t5pmn\" (UID: 
\"f8f1642a-52bc-4509-848c-535f0c43fe54\") " pod="openshift-marketplace/community-operators-t5pmn" Feb 19 15:24:54 crc kubenswrapper[4810]: I0219 15:24:54.271365 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg7nt\" (UniqueName: \"kubernetes.io/projected/f8f1642a-52bc-4509-848c-535f0c43fe54-kube-api-access-cg7nt\") pod \"community-operators-t5pmn\" (UID: \"f8f1642a-52bc-4509-848c-535f0c43fe54\") " pod="openshift-marketplace/community-operators-t5pmn" Feb 19 15:24:54 crc kubenswrapper[4810]: I0219 15:24:54.271390 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8f1642a-52bc-4509-848c-535f0c43fe54-utilities\") pod \"community-operators-t5pmn\" (UID: \"f8f1642a-52bc-4509-848c-535f0c43fe54\") " pod="openshift-marketplace/community-operators-t5pmn" Feb 19 15:24:54 crc kubenswrapper[4810]: I0219 15:24:54.271861 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8f1642a-52bc-4509-848c-535f0c43fe54-catalog-content\") pod \"community-operators-t5pmn\" (UID: \"f8f1642a-52bc-4509-848c-535f0c43fe54\") " pod="openshift-marketplace/community-operators-t5pmn" Feb 19 15:24:54 crc kubenswrapper[4810]: I0219 15:24:54.271944 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8f1642a-52bc-4509-848c-535f0c43fe54-utilities\") pod \"community-operators-t5pmn\" (UID: \"f8f1642a-52bc-4509-848c-535f0c43fe54\") " pod="openshift-marketplace/community-operators-t5pmn" Feb 19 15:24:54 crc kubenswrapper[4810]: I0219 15:24:54.291250 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg7nt\" (UniqueName: \"kubernetes.io/projected/f8f1642a-52bc-4509-848c-535f0c43fe54-kube-api-access-cg7nt\") pod \"community-operators-t5pmn\" (UID: \"f8f1642a-52bc-4509-848c-535f0c43fe54\") " pod="openshift-marketplace/community-operators-t5pmn" Feb 19 15:24:54 crc kubenswrapper[4810]: I0219 15:24:54.425776 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t5pmn" Feb 19 15:24:54 crc kubenswrapper[4810]: I0219 15:24:54.991996 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t5pmn"] Feb 19 15:24:55 crc kubenswrapper[4810]: I0219 15:24:55.389505 4810 generic.go:334] "Generic (PLEG): container finished" podID="f8f1642a-52bc-4509-848c-535f0c43fe54" containerID="0ed7b8321208c45e27b494e78f57e89db21fc20dcc81a8179837a2d07cef9bfe" exitCode=0 Feb 19 15:24:55 crc kubenswrapper[4810]: I0219 15:24:55.389586 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t5pmn" event={"ID":"f8f1642a-52bc-4509-848c-535f0c43fe54","Type":"ContainerDied","Data":"0ed7b8321208c45e27b494e78f57e89db21fc20dcc81a8179837a2d07cef9bfe"} Feb 19 15:24:55 crc kubenswrapper[4810]: I0219 15:24:55.389792 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t5pmn" event={"ID":"f8f1642a-52bc-4509-848c-535f0c43fe54","Type":"ContainerStarted","Data":"6a664415a78d036bc3a1f839d48dd33d629dec25c9501508f9c47654d086e230"} Feb 19 15:24:56 crc kubenswrapper[4810]: I0219 15:24:56.399304 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t5pmn" event={"ID":"f8f1642a-52bc-4509-848c-535f0c43fe54","Type":"ContainerStarted","Data":"0d14aab1918c80c70f97695d04823e0580412af35dfeacbd48568eb0ad351bd0"} Feb 19 15:24:57 crc kubenswrapper[4810]: I0219 15:24:57.411813 4810 generic.go:334] "Generic (PLEG): container finished" podID="f8f1642a-52bc-4509-848c-535f0c43fe54" containerID="0d14aab1918c80c70f97695d04823e0580412af35dfeacbd48568eb0ad351bd0" exitCode=0 Feb 19 15:24:57 crc kubenswrapper[4810]: I0219 15:24:57.412160 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t5pmn" event={"ID":"f8f1642a-52bc-4509-848c-535f0c43fe54","Type":"ContainerDied","Data":"0d14aab1918c80c70f97695d04823e0580412af35dfeacbd48568eb0ad351bd0"} Feb 19 15:24:58 crc kubenswrapper[4810]: I0219 15:24:58.420667 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t5pmn" event={"ID":"f8f1642a-52bc-4509-848c-535f0c43fe54","Type":"ContainerStarted","Data":"0b637e868543b6fea247931f0367f092f18093bc152aef733eaab5e8c6c8fce9"} Feb 19 15:24:58 crc kubenswrapper[4810]: I0219 15:24:58.444487 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-t5pmn" podStartSLOduration=2.033547001 podStartE2EDuration="4.444465787s" podCreationTimestamp="2026-02-19 15:24:54 +0000 UTC" firstStartedPulling="2026-02-19 15:24:55.391576956 +0000 UTC m=+924.873607100" lastFinishedPulling="2026-02-19 15:24:57.802495762 +0000 UTC m=+927.284525886" observedRunningTime="2026-02-19 15:24:58.440035209 +0000 UTC m=+927.922065323" watchObservedRunningTime="2026-02-19 15:24:58.444465787 +0000 UTC m=+927.926495911" Feb 19 15:24:59 crc kubenswrapper[4810]: I0219 15:24:59.651224 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-69cffcd4f6-27gzn"] Feb 19 15:24:59 crc kubenswrapper[4810]: I0219 15:24:59.652190 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-69cffcd4f6-27gzn"
Feb 19 15:24:59 crc kubenswrapper[4810]: I0219 15:24:59.653920 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-gj24b"
Feb 19 15:24:59 crc kubenswrapper[4810]: I0219 15:24:59.681047 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-69cffcd4f6-27gzn"]
Feb 19 15:24:59 crc kubenswrapper[4810]: I0219 15:24:59.744041 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg8tb\" (UniqueName: \"kubernetes.io/projected/e84ef702-2f13-42e9-ae2b-6f1465b67ff3-kube-api-access-cg8tb\") pod \"openstack-operator-controller-init-69cffcd4f6-27gzn\" (UID: \"e84ef702-2f13-42e9-ae2b-6f1465b67ff3\") " pod="openstack-operators/openstack-operator-controller-init-69cffcd4f6-27gzn"
Feb 19 15:24:59 crc kubenswrapper[4810]: I0219 15:24:59.845020 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg8tb\" (UniqueName: \"kubernetes.io/projected/e84ef702-2f13-42e9-ae2b-6f1465b67ff3-kube-api-access-cg8tb\") pod \"openstack-operator-controller-init-69cffcd4f6-27gzn\" (UID: \"e84ef702-2f13-42e9-ae2b-6f1465b67ff3\") " pod="openstack-operators/openstack-operator-controller-init-69cffcd4f6-27gzn"
Feb 19 15:24:59 crc kubenswrapper[4810]: I0219 15:24:59.871485 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg8tb\" (UniqueName: \"kubernetes.io/projected/e84ef702-2f13-42e9-ae2b-6f1465b67ff3-kube-api-access-cg8tb\") pod \"openstack-operator-controller-init-69cffcd4f6-27gzn\" (UID: \"e84ef702-2f13-42e9-ae2b-6f1465b67ff3\") " pod="openstack-operators/openstack-operator-controller-init-69cffcd4f6-27gzn"
Feb 19 15:24:59 crc kubenswrapper[4810]: I0219 15:24:59.968226 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-69cffcd4f6-27gzn"
Feb 19 15:25:00 crc kubenswrapper[4810]: I0219 15:25:00.185755 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-69cffcd4f6-27gzn"]
Feb 19 15:25:00 crc kubenswrapper[4810]: W0219 15:25:00.207478 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode84ef702_2f13_42e9_ae2b_6f1465b67ff3.slice/crio-8c78845001e58a7b5adde23d36fe56666961710f2b5b9ec9c1d0202ab328f250 WatchSource:0}: Error finding container 8c78845001e58a7b5adde23d36fe56666961710f2b5b9ec9c1d0202ab328f250: Status 404 returned error can't find the container with id 8c78845001e58a7b5adde23d36fe56666961710f2b5b9ec9c1d0202ab328f250
Feb 19 15:25:00 crc kubenswrapper[4810]: I0219 15:25:00.434808 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-69cffcd4f6-27gzn" event={"ID":"e84ef702-2f13-42e9-ae2b-6f1465b67ff3","Type":"ContainerStarted","Data":"8c78845001e58a7b5adde23d36fe56666961710f2b5b9ec9c1d0202ab328f250"}
Feb 19 15:25:04 crc kubenswrapper[4810]: I0219 15:25:04.426475 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t5pmn"
Feb 19 15:25:04 crc kubenswrapper[4810]: I0219 15:25:04.426957 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t5pmn"
Feb 19 15:25:04 crc kubenswrapper[4810]: I0219 15:25:04.470117 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-69cffcd4f6-27gzn" event={"ID":"e84ef702-2f13-42e9-ae2b-6f1465b67ff3","Type":"ContainerStarted","Data":"405b7168a897224c7eaa9f3a8924210364683d51893c54f80ba7221e170cde3b"}
Feb 19 15:25:04 crc kubenswrapper[4810]: I0219 15:25:04.470492 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-69cffcd4f6-27gzn"
Feb 19 15:25:04 crc kubenswrapper[4810]: I0219 15:25:04.474301 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t5pmn"
Feb 19 15:25:04 crc kubenswrapper[4810]: I0219 15:25:04.512060 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-69cffcd4f6-27gzn" podStartSLOduration=1.7884214520000001 podStartE2EDuration="5.512042603s" podCreationTimestamp="2026-02-19 15:24:59 +0000 UTC" firstStartedPulling="2026-02-19 15:25:00.211620353 +0000 UTC m=+929.693650477" lastFinishedPulling="2026-02-19 15:25:03.935241504 +0000 UTC m=+933.417271628" observedRunningTime="2026-02-19 15:25:04.509530861 +0000 UTC m=+933.991560985" watchObservedRunningTime="2026-02-19 15:25:04.512042603 +0000 UTC m=+933.994072727"
Feb 19 15:25:04 crc kubenswrapper[4810]: I0219 15:25:04.524380 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t5pmn"
Feb 19 15:25:06 crc kubenswrapper[4810]: I0219 15:25:06.467269 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t5pmn"]
Feb 19 15:25:06 crc kubenswrapper[4810]: I0219 15:25:06.483096 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t5pmn" podUID="f8f1642a-52bc-4509-848c-535f0c43fe54" containerName="registry-server" containerID="cri-o://0b637e868543b6fea247931f0367f092f18093bc152aef733eaab5e8c6c8fce9" gracePeriod=2
Feb 19 15:25:06 crc kubenswrapper[4810]: I0219 15:25:06.857274 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t5pmn"
Feb 19 15:25:06 crc kubenswrapper[4810]: I0219 15:25:06.956384 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8f1642a-52bc-4509-848c-535f0c43fe54-utilities\") pod \"f8f1642a-52bc-4509-848c-535f0c43fe54\" (UID: \"f8f1642a-52bc-4509-848c-535f0c43fe54\") "
Feb 19 15:25:06 crc kubenswrapper[4810]: I0219 15:25:06.956966 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg7nt\" (UniqueName: \"kubernetes.io/projected/f8f1642a-52bc-4509-848c-535f0c43fe54-kube-api-access-cg7nt\") pod \"f8f1642a-52bc-4509-848c-535f0c43fe54\" (UID: \"f8f1642a-52bc-4509-848c-535f0c43fe54\") "
Feb 19 15:25:06 crc kubenswrapper[4810]: I0219 15:25:06.957026 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8f1642a-52bc-4509-848c-535f0c43fe54-catalog-content\") pod \"f8f1642a-52bc-4509-848c-535f0c43fe54\" (UID: \"f8f1642a-52bc-4509-848c-535f0c43fe54\") "
Feb 19 15:25:06 crc kubenswrapper[4810]: I0219 15:25:06.958056 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8f1642a-52bc-4509-848c-535f0c43fe54-utilities" (OuterVolumeSpecName: "utilities") pod "f8f1642a-52bc-4509-848c-535f0c43fe54" (UID: "f8f1642a-52bc-4509-848c-535f0c43fe54"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 15:25:06 crc kubenswrapper[4810]: I0219 15:25:06.963020 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8f1642a-52bc-4509-848c-535f0c43fe54-kube-api-access-cg7nt" (OuterVolumeSpecName: "kube-api-access-cg7nt") pod "f8f1642a-52bc-4509-848c-535f0c43fe54" (UID: "f8f1642a-52bc-4509-848c-535f0c43fe54"). InnerVolumeSpecName "kube-api-access-cg7nt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:25:07 crc kubenswrapper[4810]: I0219 15:25:07.068063 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8f1642a-52bc-4509-848c-535f0c43fe54-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 15:25:07 crc kubenswrapper[4810]: I0219 15:25:07.068104 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg7nt\" (UniqueName: \"kubernetes.io/projected/f8f1642a-52bc-4509-848c-535f0c43fe54-kube-api-access-cg7nt\") on node \"crc\" DevicePath \"\""
Feb 19 15:25:07 crc kubenswrapper[4810]: I0219 15:25:07.097965 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8f1642a-52bc-4509-848c-535f0c43fe54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f8f1642a-52bc-4509-848c-535f0c43fe54" (UID: "f8f1642a-52bc-4509-848c-535f0c43fe54"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 15:25:07 crc kubenswrapper[4810]: I0219 15:25:07.169843 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8f1642a-52bc-4509-848c-535f0c43fe54-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 15:25:07 crc kubenswrapper[4810]: I0219 15:25:07.493227 4810 generic.go:334] "Generic (PLEG): container finished" podID="f8f1642a-52bc-4509-848c-535f0c43fe54" containerID="0b637e868543b6fea247931f0367f092f18093bc152aef733eaab5e8c6c8fce9" exitCode=0
Feb 19 15:25:07 crc kubenswrapper[4810]: I0219 15:25:07.493267 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t5pmn" event={"ID":"f8f1642a-52bc-4509-848c-535f0c43fe54","Type":"ContainerDied","Data":"0b637e868543b6fea247931f0367f092f18093bc152aef733eaab5e8c6c8fce9"}
Feb 19 15:25:07 crc kubenswrapper[4810]: I0219 15:25:07.493292 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t5pmn" event={"ID":"f8f1642a-52bc-4509-848c-535f0c43fe54","Type":"ContainerDied","Data":"6a664415a78d036bc3a1f839d48dd33d629dec25c9501508f9c47654d086e230"}
Feb 19 15:25:07 crc kubenswrapper[4810]: I0219 15:25:07.493309 4810 scope.go:117] "RemoveContainer" containerID="0b637e868543b6fea247931f0367f092f18093bc152aef733eaab5e8c6c8fce9"
Feb 19 15:25:07 crc kubenswrapper[4810]: I0219 15:25:07.493447 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t5pmn"
Feb 19 15:25:07 crc kubenswrapper[4810]: I0219 15:25:07.511887 4810 scope.go:117] "RemoveContainer" containerID="0d14aab1918c80c70f97695d04823e0580412af35dfeacbd48568eb0ad351bd0"
Feb 19 15:25:07 crc kubenswrapper[4810]: I0219 15:25:07.513133 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t5pmn"]
Feb 19 15:25:07 crc kubenswrapper[4810]: I0219 15:25:07.516830 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t5pmn"]
Feb 19 15:25:07 crc kubenswrapper[4810]: I0219 15:25:07.532489 4810 scope.go:117] "RemoveContainer" containerID="0ed7b8321208c45e27b494e78f57e89db21fc20dcc81a8179837a2d07cef9bfe"
Feb 19 15:25:07 crc kubenswrapper[4810]: I0219 15:25:07.549538 4810 scope.go:117] "RemoveContainer" containerID="0b637e868543b6fea247931f0367f092f18093bc152aef733eaab5e8c6c8fce9"
Feb 19 15:25:07 crc kubenswrapper[4810]: E0219 15:25:07.552911 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b637e868543b6fea247931f0367f092f18093bc152aef733eaab5e8c6c8fce9\": container with ID starting with 0b637e868543b6fea247931f0367f092f18093bc152aef733eaab5e8c6c8fce9 not found: ID does not exist" containerID="0b637e868543b6fea247931f0367f092f18093bc152aef733eaab5e8c6c8fce9"
Feb 19 15:25:07 crc kubenswrapper[4810]: I0219 15:25:07.552990 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b637e868543b6fea247931f0367f092f18093bc152aef733eaab5e8c6c8fce9"} err="failed to get container status \"0b637e868543b6fea247931f0367f092f18093bc152aef733eaab5e8c6c8fce9\": rpc error: code = NotFound desc = could not find container \"0b637e868543b6fea247931f0367f092f18093bc152aef733eaab5e8c6c8fce9\": container with ID starting with 0b637e868543b6fea247931f0367f092f18093bc152aef733eaab5e8c6c8fce9 not found: ID does not exist"
Feb 19 15:25:07 crc kubenswrapper[4810]: I0219 15:25:07.553024 4810 scope.go:117] "RemoveContainer" containerID="0d14aab1918c80c70f97695d04823e0580412af35dfeacbd48568eb0ad351bd0"
Feb 19 15:25:07 crc kubenswrapper[4810]: E0219 15:25:07.553497 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d14aab1918c80c70f97695d04823e0580412af35dfeacbd48568eb0ad351bd0\": container with ID starting with 0d14aab1918c80c70f97695d04823e0580412af35dfeacbd48568eb0ad351bd0 not found: ID does not exist" containerID="0d14aab1918c80c70f97695d04823e0580412af35dfeacbd48568eb0ad351bd0"
Feb 19 15:25:07 crc kubenswrapper[4810]: I0219 15:25:07.553538 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d14aab1918c80c70f97695d04823e0580412af35dfeacbd48568eb0ad351bd0"} err="failed to get container status \"0d14aab1918c80c70f97695d04823e0580412af35dfeacbd48568eb0ad351bd0\": rpc error: code = NotFound desc = could not find container \"0d14aab1918c80c70f97695d04823e0580412af35dfeacbd48568eb0ad351bd0\": container with ID starting with 0d14aab1918c80c70f97695d04823e0580412af35dfeacbd48568eb0ad351bd0 not found: ID does not exist"
Feb 19 15:25:07 crc kubenswrapper[4810]: I0219 15:25:07.553566 4810 scope.go:117] "RemoveContainer" containerID="0ed7b8321208c45e27b494e78f57e89db21fc20dcc81a8179837a2d07cef9bfe"
Feb 19 15:25:07 crc kubenswrapper[4810]: E0219 15:25:07.553913 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ed7b8321208c45e27b494e78f57e89db21fc20dcc81a8179837a2d07cef9bfe\": container with ID starting with 0ed7b8321208c45e27b494e78f57e89db21fc20dcc81a8179837a2d07cef9bfe not found: ID does not exist" containerID="0ed7b8321208c45e27b494e78f57e89db21fc20dcc81a8179837a2d07cef9bfe"
Feb 19 15:25:07 crc kubenswrapper[4810]: I0219 15:25:07.553949 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ed7b8321208c45e27b494e78f57e89db21fc20dcc81a8179837a2d07cef9bfe"} err="failed to get container status \"0ed7b8321208c45e27b494e78f57e89db21fc20dcc81a8179837a2d07cef9bfe\": rpc error: code = NotFound desc = could not find container \"0ed7b8321208c45e27b494e78f57e89db21fc20dcc81a8179837a2d07cef9bfe\": container with ID starting with 0ed7b8321208c45e27b494e78f57e89db21fc20dcc81a8179837a2d07cef9bfe not found: ID does not exist"
Feb 19 15:25:09 crc kubenswrapper[4810]: I0219 15:25:09.453228 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8f1642a-52bc-4509-848c-535f0c43fe54" path="/var/lib/kubelet/pods/f8f1642a-52bc-4509-848c-535f0c43fe54/volumes"
Feb 19 15:25:09 crc kubenswrapper[4810]: I0219 15:25:09.972526 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-69cffcd4f6-27gzn"
Feb 19 15:25:19 crc kubenswrapper[4810]: I0219 15:25:19.537909 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 15:25:19 crc kubenswrapper[4810]: I0219 15:25:19.538675 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 15:25:26 crc kubenswrapper[4810]: I0219 15:25:26.547999 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p4pb4"]
Feb 19 15:25:26 crc kubenswrapper[4810]: E0219 15:25:26.548950 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8f1642a-52bc-4509-848c-535f0c43fe54" containerName="extract-content"
Feb 19 15:25:26 crc kubenswrapper[4810]: I0219 15:25:26.548972 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8f1642a-52bc-4509-848c-535f0c43fe54" containerName="extract-content"
Feb 19 15:25:26 crc kubenswrapper[4810]: E0219 15:25:26.548998 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8f1642a-52bc-4509-848c-535f0c43fe54" containerName="registry-server"
Feb 19 15:25:26 crc kubenswrapper[4810]: I0219 15:25:26.549010 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8f1642a-52bc-4509-848c-535f0c43fe54" containerName="registry-server"
Feb 19 15:25:26 crc kubenswrapper[4810]: E0219 15:25:26.549026 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8f1642a-52bc-4509-848c-535f0c43fe54" containerName="extract-utilities"
Feb 19 15:25:26 crc kubenswrapper[4810]: I0219 15:25:26.549038 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8f1642a-52bc-4509-848c-535f0c43fe54" containerName="extract-utilities"
Feb 19 15:25:26 crc kubenswrapper[4810]: I0219 15:25:26.549190 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8f1642a-52bc-4509-848c-535f0c43fe54" containerName="registry-server"
Feb 19 15:25:26 crc kubenswrapper[4810]: I0219 15:25:26.550317 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4pb4"
Feb 19 15:25:26 crc kubenswrapper[4810]: I0219 15:25:26.568852 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p4pb4"]
Feb 19 15:25:26 crc kubenswrapper[4810]: I0219 15:25:26.660399 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e8a9167-c157-4d9a-830b-03a91af714f1-utilities\") pod \"certified-operators-p4pb4\" (UID: \"1e8a9167-c157-4d9a-830b-03a91af714f1\") " pod="openshift-marketplace/certified-operators-p4pb4"
Feb 19 15:25:26 crc kubenswrapper[4810]: I0219 15:25:26.660513 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e8a9167-c157-4d9a-830b-03a91af714f1-catalog-content\") pod \"certified-operators-p4pb4\" (UID: \"1e8a9167-c157-4d9a-830b-03a91af714f1\") " pod="openshift-marketplace/certified-operators-p4pb4"
Feb 19 15:25:26 crc kubenswrapper[4810]: I0219 15:25:26.660660 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7zvl\" (UniqueName: \"kubernetes.io/projected/1e8a9167-c157-4d9a-830b-03a91af714f1-kube-api-access-s7zvl\") pod \"certified-operators-p4pb4\" (UID: \"1e8a9167-c157-4d9a-830b-03a91af714f1\") " pod="openshift-marketplace/certified-operators-p4pb4"
Feb 19 15:25:26 crc kubenswrapper[4810]: I0219 15:25:26.762132 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7zvl\" (UniqueName: \"kubernetes.io/projected/1e8a9167-c157-4d9a-830b-03a91af714f1-kube-api-access-s7zvl\") pod \"certified-operators-p4pb4\" (UID: \"1e8a9167-c157-4d9a-830b-03a91af714f1\") " pod="openshift-marketplace/certified-operators-p4pb4"
Feb 19 15:25:26 crc kubenswrapper[4810]: I0219 15:25:26.762205 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e8a9167-c157-4d9a-830b-03a91af714f1-utilities\") pod \"certified-operators-p4pb4\" (UID: \"1e8a9167-c157-4d9a-830b-03a91af714f1\") " pod="openshift-marketplace/certified-operators-p4pb4"
Feb 19 15:25:26 crc kubenswrapper[4810]: I0219 15:25:26.762234 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e8a9167-c157-4d9a-830b-03a91af714f1-catalog-content\") pod \"certified-operators-p4pb4\" (UID: \"1e8a9167-c157-4d9a-830b-03a91af714f1\") " pod="openshift-marketplace/certified-operators-p4pb4"
Feb 19 15:25:26 crc kubenswrapper[4810]: I0219 15:25:26.762693 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e8a9167-c157-4d9a-830b-03a91af714f1-catalog-content\") pod \"certified-operators-p4pb4\" (UID: \"1e8a9167-c157-4d9a-830b-03a91af714f1\") " pod="openshift-marketplace/certified-operators-p4pb4"
Feb 19 15:25:26 crc kubenswrapper[4810]: I0219 15:25:26.762899 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e8a9167-c157-4d9a-830b-03a91af714f1-utilities\") pod \"certified-operators-p4pb4\" (UID: \"1e8a9167-c157-4d9a-830b-03a91af714f1\") " pod="openshift-marketplace/certified-operators-p4pb4"
Feb 19 15:25:26 crc kubenswrapper[4810]: I0219 15:25:26.794008 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7zvl\" (UniqueName: \"kubernetes.io/projected/1e8a9167-c157-4d9a-830b-03a91af714f1-kube-api-access-s7zvl\") pod \"certified-operators-p4pb4\" (UID: \"1e8a9167-c157-4d9a-830b-03a91af714f1\") " pod="openshift-marketplace/certified-operators-p4pb4"
Feb 19 15:25:26 crc kubenswrapper[4810]: I0219 15:25:26.888082 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4pb4"
Feb 19 15:25:27 crc kubenswrapper[4810]: I0219 15:25:27.381881 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p4pb4"]
Feb 19 15:25:27 crc kubenswrapper[4810]: W0219 15:25:27.395133 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e8a9167_c157_4d9a_830b_03a91af714f1.slice/crio-ae1a311e8a2e5f58cb4ca3edef21cfd17981ec0f6f89bf4d92c64bc3468bfa5e WatchSource:0}: Error finding container ae1a311e8a2e5f58cb4ca3edef21cfd17981ec0f6f89bf4d92c64bc3468bfa5e: Status 404 returned error can't find the container with id ae1a311e8a2e5f58cb4ca3edef21cfd17981ec0f6f89bf4d92c64bc3468bfa5e
Feb 19 15:25:27 crc kubenswrapper[4810]: I0219 15:25:27.673334 4810 generic.go:334] "Generic (PLEG): container finished" podID="1e8a9167-c157-4d9a-830b-03a91af714f1" containerID="48a49f38fa3ffbe1677b0b403f3aaf11cb5caa40978378460bf04941a8180ec9" exitCode=0
Feb 19 15:25:27 crc kubenswrapper[4810]: I0219 15:25:27.673545 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4pb4" event={"ID":"1e8a9167-c157-4d9a-830b-03a91af714f1","Type":"ContainerDied","Data":"48a49f38fa3ffbe1677b0b403f3aaf11cb5caa40978378460bf04941a8180ec9"}
Feb 19 15:25:27 crc kubenswrapper[4810]: I0219 15:25:27.673602 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4pb4" event={"ID":"1e8a9167-c157-4d9a-830b-03a91af714f1","Type":"ContainerStarted","Data":"ae1a311e8a2e5f58cb4ca3edef21cfd17981ec0f6f89bf4d92c64bc3468bfa5e"}
Feb 19 15:25:29 crc kubenswrapper[4810]: I0219 15:25:29.692385 4810 generic.go:334] "Generic (PLEG): container finished" podID="1e8a9167-c157-4d9a-830b-03a91af714f1" containerID="d9c156f23ea8a27c3e29c7dc448b51e5084a296bac6eb2789880f5b577811e2c" exitCode=0
Feb 19 15:25:29 crc kubenswrapper[4810]: I0219 15:25:29.692521 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4pb4" event={"ID":"1e8a9167-c157-4d9a-830b-03a91af714f1","Type":"ContainerDied","Data":"d9c156f23ea8a27c3e29c7dc448b51e5084a296bac6eb2789880f5b577811e2c"}
Feb 19 15:25:30 crc kubenswrapper[4810]: I0219 15:25:30.701119 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4pb4" event={"ID":"1e8a9167-c157-4d9a-830b-03a91af714f1","Type":"ContainerStarted","Data":"4f9ebb887d3b59f725d8e7b6ac586d52978a5345af6ed853a8bce7fd37049223"}
Feb 19 15:25:30 crc kubenswrapper[4810]: I0219 15:25:30.718200 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p4pb4" podStartSLOduration=2.091366798 podStartE2EDuration="4.718185405s" podCreationTimestamp="2026-02-19 15:25:26 +0000 UTC" firstStartedPulling="2026-02-19 15:25:27.674609713 +0000 UTC m=+957.156639827" lastFinishedPulling="2026-02-19 15:25:30.3014283 +0000 UTC m=+959.783458434" observedRunningTime="2026-02-19 15:25:30.714057914 +0000 UTC m=+960.196088038" watchObservedRunningTime="2026-02-19 15:25:30.718185405 +0000 UTC m=+960.200215529"
Feb 19 15:25:36 crc kubenswrapper[4810]: I0219 15:25:36.889171 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p4pb4"
Feb 19 15:25:36 crc kubenswrapper[4810]: I0219 15:25:36.889781 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p4pb4"
Feb 19 15:25:36 crc kubenswrapper[4810]: I0219 15:25:36.944788 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p4pb4"
Feb 19 15:25:37 crc kubenswrapper[4810]: I0219 15:25:37.072278 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p4pb4"
Feb 19 15:25:37 crc kubenswrapper[4810]: I0219 15:25:37.175688 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p4pb4"]
Feb 19 15:25:39 crc kubenswrapper[4810]: I0219 15:25:39.038234 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p4pb4" podUID="1e8a9167-c157-4d9a-830b-03a91af714f1" containerName="registry-server" containerID="cri-o://4f9ebb887d3b59f725d8e7b6ac586d52978a5345af6ed853a8bce7fd37049223" gracePeriod=2
Feb 19 15:25:39 crc kubenswrapper[4810]: I0219 15:25:39.586071 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gx7qf"]
Feb 19 15:25:39 crc kubenswrapper[4810]: I0219 15:25:39.588188 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gx7qf"
Feb 19 15:25:39 crc kubenswrapper[4810]: I0219 15:25:39.600042 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gx7qf"]
Feb 19 15:25:39 crc kubenswrapper[4810]: I0219 15:25:39.604682 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpjhx\" (UniqueName: \"kubernetes.io/projected/1715cc96-a86a-40d8-8f5a-1a4f35129bd1-kube-api-access-tpjhx\") pod \"redhat-marketplace-gx7qf\" (UID: \"1715cc96-a86a-40d8-8f5a-1a4f35129bd1\") " pod="openshift-marketplace/redhat-marketplace-gx7qf"
Feb 19 15:25:39 crc kubenswrapper[4810]: I0219 15:25:39.604751 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1715cc96-a86a-40d8-8f5a-1a4f35129bd1-catalog-content\") pod \"redhat-marketplace-gx7qf\" (UID: \"1715cc96-a86a-40d8-8f5a-1a4f35129bd1\") " pod="openshift-marketplace/redhat-marketplace-gx7qf"
Feb 19 15:25:39 crc kubenswrapper[4810]: I0219 15:25:39.604842 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1715cc96-a86a-40d8-8f5a-1a4f35129bd1-utilities\") pod \"redhat-marketplace-gx7qf\" (UID: \"1715cc96-a86a-40d8-8f5a-1a4f35129bd1\") " pod="openshift-marketplace/redhat-marketplace-gx7qf"
Feb 19 15:25:39 crc kubenswrapper[4810]: I0219 15:25:39.705821 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpjhx\" (UniqueName: \"kubernetes.io/projected/1715cc96-a86a-40d8-8f5a-1a4f35129bd1-kube-api-access-tpjhx\") pod \"redhat-marketplace-gx7qf\" (UID: \"1715cc96-a86a-40d8-8f5a-1a4f35129bd1\") " pod="openshift-marketplace/redhat-marketplace-gx7qf"
Feb 19 15:25:39 crc kubenswrapper[4810]: I0219 15:25:39.706176 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1715cc96-a86a-40d8-8f5a-1a4f35129bd1-catalog-content\") pod \"redhat-marketplace-gx7qf\" (UID: \"1715cc96-a86a-40d8-8f5a-1a4f35129bd1\") " pod="openshift-marketplace/redhat-marketplace-gx7qf"
Feb 19 15:25:39 crc kubenswrapper[4810]: I0219 15:25:39.706236 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1715cc96-a86a-40d8-8f5a-1a4f35129bd1-utilities\") pod \"redhat-marketplace-gx7qf\" (UID: \"1715cc96-a86a-40d8-8f5a-1a4f35129bd1\") " pod="openshift-marketplace/redhat-marketplace-gx7qf"
Feb 19 15:25:39 crc kubenswrapper[4810]: I0219 15:25:39.706881 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1715cc96-a86a-40d8-8f5a-1a4f35129bd1-utilities\") pod \"redhat-marketplace-gx7qf\" (UID: \"1715cc96-a86a-40d8-8f5a-1a4f35129bd1\") " pod="openshift-marketplace/redhat-marketplace-gx7qf"
Feb 19 15:25:39 crc kubenswrapper[4810]: I0219 15:25:39.706955 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1715cc96-a86a-40d8-8f5a-1a4f35129bd1-catalog-content\") pod \"redhat-marketplace-gx7qf\" (UID: \"1715cc96-a86a-40d8-8f5a-1a4f35129bd1\") " pod="openshift-marketplace/redhat-marketplace-gx7qf"
Feb 19 15:25:39 crc kubenswrapper[4810]: I0219 15:25:39.726829 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpjhx\" (UniqueName: \"kubernetes.io/projected/1715cc96-a86a-40d8-8f5a-1a4f35129bd1-kube-api-access-tpjhx\") pod \"redhat-marketplace-gx7qf\" (UID: \"1715cc96-a86a-40d8-8f5a-1a4f35129bd1\") " pod="openshift-marketplace/redhat-marketplace-gx7qf"
Feb 19 15:25:39 crc kubenswrapper[4810]: I0219 15:25:39.921090 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gx7qf"
Feb 19 15:25:39 crc kubenswrapper[4810]: I0219 15:25:39.999669 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4pb4"
Feb 19 15:25:40 crc kubenswrapper[4810]: I0219 15:25:40.052091 4810 generic.go:334] "Generic (PLEG): container finished" podID="1e8a9167-c157-4d9a-830b-03a91af714f1" containerID="4f9ebb887d3b59f725d8e7b6ac586d52978a5345af6ed853a8bce7fd37049223" exitCode=0
Feb 19 15:25:40 crc kubenswrapper[4810]: I0219 15:25:40.052434 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4pb4" event={"ID":"1e8a9167-c157-4d9a-830b-03a91af714f1","Type":"ContainerDied","Data":"4f9ebb887d3b59f725d8e7b6ac586d52978a5345af6ed853a8bce7fd37049223"}
Feb 19 15:25:40 crc kubenswrapper[4810]: I0219 15:25:40.052461 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4pb4" event={"ID":"1e8a9167-c157-4d9a-830b-03a91af714f1","Type":"ContainerDied","Data":"ae1a311e8a2e5f58cb4ca3edef21cfd17981ec0f6f89bf4d92c64bc3468bfa5e"}
Feb 19 15:25:40 crc kubenswrapper[4810]: I0219 15:25:40.052478 4810 scope.go:117] "RemoveContainer" containerID="4f9ebb887d3b59f725d8e7b6ac586d52978a5345af6ed853a8bce7fd37049223"
Feb 19 15:25:40 crc kubenswrapper[4810]: I0219 15:25:40.052588 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4pb4"
Feb 19 15:25:40 crc kubenswrapper[4810]: I0219 15:25:40.108649 4810 scope.go:117] "RemoveContainer" containerID="d9c156f23ea8a27c3e29c7dc448b51e5084a296bac6eb2789880f5b577811e2c"
Feb 19 15:25:40 crc kubenswrapper[4810]: I0219 15:25:40.111054 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e8a9167-c157-4d9a-830b-03a91af714f1-catalog-content\") pod \"1e8a9167-c157-4d9a-830b-03a91af714f1\" (UID: \"1e8a9167-c157-4d9a-830b-03a91af714f1\") "
Feb 19 15:25:40 crc kubenswrapper[4810]: I0219 15:25:40.111128 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7zvl\" (UniqueName: \"kubernetes.io/projected/1e8a9167-c157-4d9a-830b-03a91af714f1-kube-api-access-s7zvl\") pod \"1e8a9167-c157-4d9a-830b-03a91af714f1\" (UID: \"1e8a9167-c157-4d9a-830b-03a91af714f1\") "
Feb 19 15:25:40 crc kubenswrapper[4810]: I0219 15:25:40.111213 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e8a9167-c157-4d9a-830b-03a91af714f1-utilities\") pod \"1e8a9167-c157-4d9a-830b-03a91af714f1\" (UID: \"1e8a9167-c157-4d9a-830b-03a91af714f1\") "
Feb 19 15:25:40 crc kubenswrapper[4810]: I0219 15:25:40.112114 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e8a9167-c157-4d9a-830b-03a91af714f1-utilities" (OuterVolumeSpecName: "utilities") pod "1e8a9167-c157-4d9a-830b-03a91af714f1" (UID: "1e8a9167-c157-4d9a-830b-03a91af714f1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 15:25:40 crc kubenswrapper[4810]: I0219 15:25:40.122764 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e8a9167-c157-4d9a-830b-03a91af714f1-kube-api-access-s7zvl" (OuterVolumeSpecName: "kube-api-access-s7zvl") pod "1e8a9167-c157-4d9a-830b-03a91af714f1" (UID: "1e8a9167-c157-4d9a-830b-03a91af714f1"). InnerVolumeSpecName "kube-api-access-s7zvl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:25:40 crc kubenswrapper[4810]: I0219 15:25:40.151292 4810 scope.go:117] "RemoveContainer" containerID="48a49f38fa3ffbe1677b0b403f3aaf11cb5caa40978378460bf04941a8180ec9"
Feb 19 15:25:40 crc kubenswrapper[4810]: I0219 15:25:40.213562 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e8a9167-c157-4d9a-830b-03a91af714f1-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 15:25:40 crc kubenswrapper[4810]: I0219 15:25:40.213593 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7zvl\" (UniqueName: \"kubernetes.io/projected/1e8a9167-c157-4d9a-830b-03a91af714f1-kube-api-access-s7zvl\") on node \"crc\" DevicePath \"\""
Feb 19 15:25:40 crc kubenswrapper[4810]: I0219 15:25:40.229499 4810 scope.go:117] "RemoveContainer" containerID="4f9ebb887d3b59f725d8e7b6ac586d52978a5345af6ed853a8bce7fd37049223"
Feb 19 15:25:40 crc kubenswrapper[4810]: I0219 15:25:40.231489 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e8a9167-c157-4d9a-830b-03a91af714f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e8a9167-c157-4d9a-830b-03a91af714f1" (UID: "1e8a9167-c157-4d9a-830b-03a91af714f1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 15:25:40 crc kubenswrapper[4810]: E0219 15:25:40.233698 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f9ebb887d3b59f725d8e7b6ac586d52978a5345af6ed853a8bce7fd37049223\": container with ID starting with 4f9ebb887d3b59f725d8e7b6ac586d52978a5345af6ed853a8bce7fd37049223 not found: ID does not exist" containerID="4f9ebb887d3b59f725d8e7b6ac586d52978a5345af6ed853a8bce7fd37049223"
Feb 19 15:25:40 crc kubenswrapper[4810]: I0219 15:25:40.233741 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f9ebb887d3b59f725d8e7b6ac586d52978a5345af6ed853a8bce7fd37049223"} err="failed to get container status \"4f9ebb887d3b59f725d8e7b6ac586d52978a5345af6ed853a8bce7fd37049223\": rpc error: code = NotFound desc = could not find container \"4f9ebb887d3b59f725d8e7b6ac586d52978a5345af6ed853a8bce7fd37049223\": container with ID starting with 4f9ebb887d3b59f725d8e7b6ac586d52978a5345af6ed853a8bce7fd37049223 not found: ID does not exist"
Feb 19 15:25:40 crc kubenswrapper[4810]: I0219 15:25:40.233767 4810 scope.go:117] "RemoveContainer" containerID="d9c156f23ea8a27c3e29c7dc448b51e5084a296bac6eb2789880f5b577811e2c"
Feb 19 15:25:40 crc kubenswrapper[4810]: E0219 15:25:40.234860 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9c156f23ea8a27c3e29c7dc448b51e5084a296bac6eb2789880f5b577811e2c\": container with ID starting with d9c156f23ea8a27c3e29c7dc448b51e5084a296bac6eb2789880f5b577811e2c not found: ID does not exist" containerID="d9c156f23ea8a27c3e29c7dc448b51e5084a296bac6eb2789880f5b577811e2c"
Feb 19 15:25:40 crc kubenswrapper[4810]: I0219 15:25:40.234887 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9c156f23ea8a27c3e29c7dc448b51e5084a296bac6eb2789880f5b577811e2c"} err="failed to get container status \"d9c156f23ea8a27c3e29c7dc448b51e5084a296bac6eb2789880f5b577811e2c\": rpc error: code = NotFound desc = could not find container \"d9c156f23ea8a27c3e29c7dc448b51e5084a296bac6eb2789880f5b577811e2c\": container with ID starting with d9c156f23ea8a27c3e29c7dc448b51e5084a296bac6eb2789880f5b577811e2c not found: ID does not exist"
Feb 19 15:25:40 crc kubenswrapper[4810]: I0219 15:25:40.234906 4810 scope.go:117] "RemoveContainer" containerID="48a49f38fa3ffbe1677b0b403f3aaf11cb5caa40978378460bf04941a8180ec9"
Feb 19 15:25:40 crc kubenswrapper[4810]: E0219 15:25:40.237806 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48a49f38fa3ffbe1677b0b403f3aaf11cb5caa40978378460bf04941a8180ec9\": container with ID starting with 48a49f38fa3ffbe1677b0b403f3aaf11cb5caa40978378460bf04941a8180ec9 not found: ID does not exist" containerID="48a49f38fa3ffbe1677b0b403f3aaf11cb5caa40978378460bf04941a8180ec9"
Feb 19 15:25:40 crc kubenswrapper[4810]: I0219 15:25:40.237838 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48a49f38fa3ffbe1677b0b403f3aaf11cb5caa40978378460bf04941a8180ec9"} err="failed to get container status \"48a49f38fa3ffbe1677b0b403f3aaf11cb5caa40978378460bf04941a8180ec9\": rpc error: code = NotFound desc = could not find container \"48a49f38fa3ffbe1677b0b403f3aaf11cb5caa40978378460bf04941a8180ec9\": container with ID starting with 48a49f38fa3ffbe1677b0b403f3aaf11cb5caa40978378460bf04941a8180ec9 not found: ID does not exist"
Feb 19 15:25:40 crc kubenswrapper[4810]: I0219 15:25:40.315112 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e8a9167-c157-4d9a-830b-03a91af714f1-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 15:25:40 crc kubenswrapper[4810]: I0219 15:25:40.378215 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p4pb4"]
Feb 19 15:25:40 crc kubenswrapper[4810]: I0219 15:25:40.389018 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p4pb4"]
Feb 19 15:25:40 crc kubenswrapper[4810]: I0219 15:25:40.459920 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gx7qf"]
Feb 19 15:25:41 crc kubenswrapper[4810]: I0219 15:25:41.061772 4810 generic.go:334] "Generic (PLEG): container finished" podID="1715cc96-a86a-40d8-8f5a-1a4f35129bd1" containerID="1d5f88ecf4c81e410f0df90ab60b4889433c0df9a3a9c3d46a0ad0dad5a5c6f9" exitCode=0
Feb 19 15:25:41 crc kubenswrapper[4810]: I0219 15:25:41.061893 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gx7qf" event={"ID":"1715cc96-a86a-40d8-8f5a-1a4f35129bd1","Type":"ContainerDied","Data":"1d5f88ecf4c81e410f0df90ab60b4889433c0df9a3a9c3d46a0ad0dad5a5c6f9"}
Feb 19 15:25:41 crc kubenswrapper[4810]: I0219 15:25:41.062061 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gx7qf" event={"ID":"1715cc96-a86a-40d8-8f5a-1a4f35129bd1","Type":"ContainerStarted","Data":"c9c6c8774d7ca4a97936ace4183e794c37a4dbb46f2dcfbc1cbf73f1bab26b9b"}
Feb 19 15:25:41 crc kubenswrapper[4810]: I0219 15:25:41.446438 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e8a9167-c157-4d9a-830b-03a91af714f1" path="/var/lib/kubelet/pods/1e8a9167-c157-4d9a-830b-03a91af714f1/volumes"
Feb 19 15:25:42 crc kubenswrapper[4810]: I0219 15:25:42.068292 4810 generic.go:334] "Generic (PLEG): container finished" podID="1715cc96-a86a-40d8-8f5a-1a4f35129bd1" containerID="a21841d08c3392c62d7d0123a3228f5e73d4d173558fa34eefb59a0035628f39" exitCode=0
Feb 19 15:25:42 crc kubenswrapper[4810]: I0219 15:25:42.068449 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gx7qf" event={"ID":"1715cc96-a86a-40d8-8f5a-1a4f35129bd1","Type":"ContainerDied","Data":"a21841d08c3392c62d7d0123a3228f5e73d4d173558fa34eefb59a0035628f39"}
Feb 19 15:25:43 crc kubenswrapper[4810]: I0219 15:25:43.075949 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gx7qf" event={"ID":"1715cc96-a86a-40d8-8f5a-1a4f35129bd1","Type":"ContainerStarted","Data":"5cab9d89ebbb4715343c797d52130d009456c8d3d9eaf887c80836933b581c07"}
Feb 19 15:25:43 crc kubenswrapper[4810]: I0219 15:25:43.099467 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gx7qf" podStartSLOduration=2.728977246 podStartE2EDuration="4.099452367s" podCreationTimestamp="2026-02-19 15:25:39 +0000 UTC" firstStartedPulling="2026-02-19 15:25:41.063034852 +0000 UTC m=+970.545064976" lastFinishedPulling="2026-02-19 15:25:42.433509973 +0000 UTC m=+971.915540097" observedRunningTime="2026-02-19 15:25:43.096599547 +0000 UTC m=+972.578629671" watchObservedRunningTime="2026-02-19 15:25:43.099452367 +0000 UTC m=+972.581482491"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.039625 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-mzslt"]
Feb 19 15:25:48 crc kubenswrapper[4810]: E0219 15:25:48.040433 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e8a9167-c157-4d9a-830b-03a91af714f1" containerName="registry-server"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.040445 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e8a9167-c157-4d9a-830b-03a91af714f1" containerName="registry-server"
Feb 19 15:25:48 crc kubenswrapper[4810]: E0219 15:25:48.040461 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e8a9167-c157-4d9a-830b-03a91af714f1" containerName="extract-content"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.040502 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e8a9167-c157-4d9a-830b-03a91af714f1" containerName="extract-content"
Feb 19 15:25:48 crc kubenswrapper[4810]: E0219 15:25:48.040514 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e8a9167-c157-4d9a-830b-03a91af714f1" containerName="extract-utilities"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.040521 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e8a9167-c157-4d9a-830b-03a91af714f1" containerName="extract-utilities"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.040663 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e8a9167-c157-4d9a-830b-03a91af714f1" containerName="registry-server"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.041160 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-mzslt"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.043960 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-l2x68"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.057262 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-mzslt"]
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.094543 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-jxmt5"]
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.095719 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-jxmt5"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.099839 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-z5fb9"]
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.101841 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-xzqhm"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.110272 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-qz68t"]
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.111002 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-qz68t"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.111480 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-z5fb9"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.118087 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-jxmt5"]
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.118691 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-6p6vd"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.127744 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-j2fdc"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.147946 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-z5fb9"]
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.181400 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-qz68t"]
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.189850 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-ffm66"]
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.190682 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ffm66"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.198480 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-xkwtq"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.206925 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-gnmlp"]
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.208746 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-gnmlp"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.216023 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-7g47h"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.216318 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzxjv\" (UniqueName: \"kubernetes.io/projected/1217b757-0f1c-4c4e-9abe-55875992915d-kube-api-access-vzxjv\") pod \"cinder-operator-controller-manager-5d946d989d-jxmt5\" (UID: \"1217b757-0f1c-4c4e-9abe-55875992915d\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-jxmt5"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.216396 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk5rr\" (UniqueName: \"kubernetes.io/projected/2106e7b5-bb83-464a-a43f-943f22b55078-kube-api-access-dk5rr\") pod \"glance-operator-controller-manager-77987464f4-qz68t\" (UID: \"2106e7b5-bb83-464a-a43f-943f22b55078\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-qz68t"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.216424 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpks5\" (UniqueName: \"kubernetes.io/projected/52bb990c-eff0-4673-be27-d55d433bef0d-kube-api-access-kpks5\") pod \"designate-operator-controller-manager-6d8bf5c495-z5fb9\" (UID: \"52bb990c-eff0-4673-be27-d55d433bef0d\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-z5fb9"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.216440 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq5dr\" (UniqueName: \"kubernetes.io/projected/91002269-9fe0-44d2-9dbd-9e4cf58274bf-kube-api-access-rq5dr\") pod \"barbican-operator-controller-manager-868647ff47-mzslt\" (UID: \"91002269-9fe0-44d2-9dbd-9e4cf58274bf\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-mzslt"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.223939 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-ffm66"]
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.238525 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-gnmlp"]
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.266825 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-2kkhl"]
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.271881 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-nnps5"]
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.274202 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-nnps5"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.274865 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-2kkhl"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.281393 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-br7n2"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.295416 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.298596 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-n2jcn"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.319629 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-nnps5"]
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.319942 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fljv7\" (UniqueName: \"kubernetes.io/projected/602535d1-0abe-471e-8409-31319af7bd4b-kube-api-access-fljv7\") pod \"ironic-operator-controller-manager-554564d7fc-nnps5\" (UID: \"602535d1-0abe-471e-8409-31319af7bd4b\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-nnps5"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.320003 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk5rr\" (UniqueName: \"kubernetes.io/projected/2106e7b5-bb83-464a-a43f-943f22b55078-kube-api-access-dk5rr\") pod \"glance-operator-controller-manager-77987464f4-qz68t\" (UID: \"2106e7b5-bb83-464a-a43f-943f22b55078\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-qz68t"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.320029 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpks5\" (UniqueName: \"kubernetes.io/projected/52bb990c-eff0-4673-be27-d55d433bef0d-kube-api-access-kpks5\") pod \"designate-operator-controller-manager-6d8bf5c495-z5fb9\" (UID: \"52bb990c-eff0-4673-be27-d55d433bef0d\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-z5fb9"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.320046 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq5dr\" (UniqueName: \"kubernetes.io/projected/91002269-9fe0-44d2-9dbd-9e4cf58274bf-kube-api-access-rq5dr\") pod \"barbican-operator-controller-manager-868647ff47-mzslt\" (UID: \"91002269-9fe0-44d2-9dbd-9e4cf58274bf\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-mzslt"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.320087 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4898d4eb-d474-44bc-9a38-e36f300d132f-cert\") pod \"infra-operator-controller-manager-79d975b745-2kkhl\" (UID: \"4898d4eb-d474-44bc-9a38-e36f300d132f\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-2kkhl"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.320109 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdkd6\" (UniqueName: \"kubernetes.io/projected/f0ab3643-d267-4902-af1f-cbcbdd7e5e41-kube-api-access-mdkd6\") pod \"horizon-operator-controller-manager-5b9b8895d5-gnmlp\" (UID: \"f0ab3643-d267-4902-af1f-cbcbdd7e5e41\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-gnmlp"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.320130 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjqxl\" (UniqueName: \"kubernetes.io/projected/e2942952-ce19-4053-91da-05623c954167-kube-api-access-pjqxl\") pod \"heat-operator-controller-manager-69f49c598c-ffm66\" (UID: \"e2942952-ce19-4053-91da-05623c954167\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ffm66"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.320154 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzxjv\" (UniqueName: \"kubernetes.io/projected/1217b757-0f1c-4c4e-9abe-55875992915d-kube-api-access-vzxjv\") pod \"cinder-operator-controller-manager-5d946d989d-jxmt5\" (UID: \"1217b757-0f1c-4c4e-9abe-55875992915d\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-jxmt5"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.320171 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8smk\" (UniqueName: \"kubernetes.io/projected/4898d4eb-d474-44bc-9a38-e36f300d132f-kube-api-access-s8smk\") pod \"infra-operator-controller-manager-79d975b745-2kkhl\" (UID: \"4898d4eb-d474-44bc-9a38-e36f300d132f\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-2kkhl"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.326362 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-2kkhl"]
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.333731 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-mkfsc"]
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.334731 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mkfsc"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.337561 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-mkfsc"]
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.361307 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-vc7cw"]
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.362209 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-vc7cw"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.368367 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-rg8mp"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.381038 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-kz77r"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.389485 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzxjv\" (UniqueName: \"kubernetes.io/projected/1217b757-0f1c-4c4e-9abe-55875992915d-kube-api-access-vzxjv\") pod \"cinder-operator-controller-manager-5d946d989d-jxmt5\" (UID: \"1217b757-0f1c-4c4e-9abe-55875992915d\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-jxmt5"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.392415 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-vc7cw"]
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.392778 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq5dr\" (UniqueName: \"kubernetes.io/projected/91002269-9fe0-44d2-9dbd-9e4cf58274bf-kube-api-access-rq5dr\") pod \"barbican-operator-controller-manager-868647ff47-mzslt\" (UID: \"91002269-9fe0-44d2-9dbd-9e4cf58274bf\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-mzslt"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.396824 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk5rr\" (UniqueName: \"kubernetes.io/projected/2106e7b5-bb83-464a-a43f-943f22b55078-kube-api-access-dk5rr\") pod \"glance-operator-controller-manager-77987464f4-qz68t\" (UID: \"2106e7b5-bb83-464a-a43f-943f22b55078\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-qz68t"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.405004 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpks5\" (UniqueName: \"kubernetes.io/projected/52bb990c-eff0-4673-be27-d55d433bef0d-kube-api-access-kpks5\") pod \"designate-operator-controller-manager-6d8bf5c495-z5fb9\" (UID: \"52bb990c-eff0-4673-be27-d55d433bef0d\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-z5fb9"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.421490 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4898d4eb-d474-44bc-9a38-e36f300d132f-cert\") pod \"infra-operator-controller-manager-79d975b745-2kkhl\" (UID: \"4898d4eb-d474-44bc-9a38-e36f300d132f\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-2kkhl"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.421553 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdkd6\" (UniqueName: \"kubernetes.io/projected/f0ab3643-d267-4902-af1f-cbcbdd7e5e41-kube-api-access-mdkd6\") pod \"horizon-operator-controller-manager-5b9b8895d5-gnmlp\" (UID: \"f0ab3643-d267-4902-af1f-cbcbdd7e5e41\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-gnmlp"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.421596 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjqxl\" (UniqueName: \"kubernetes.io/projected/e2942952-ce19-4053-91da-05623c954167-kube-api-access-pjqxl\") pod \"heat-operator-controller-manager-69f49c598c-ffm66\" (UID: \"e2942952-ce19-4053-91da-05623c954167\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ffm66"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.421627 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8smk\" (UniqueName: \"kubernetes.io/projected/4898d4eb-d474-44bc-9a38-e36f300d132f-kube-api-access-s8smk\") pod \"infra-operator-controller-manager-79d975b745-2kkhl\" (UID: \"4898d4eb-d474-44bc-9a38-e36f300d132f\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-2kkhl"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.421653 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fljv7\" (UniqueName: \"kubernetes.io/projected/602535d1-0abe-471e-8409-31319af7bd4b-kube-api-access-fljv7\") pod \"ironic-operator-controller-manager-554564d7fc-nnps5\" (UID: \"602535d1-0abe-471e-8409-31319af7bd4b\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-nnps5"
Feb 19 15:25:48 crc kubenswrapper[4810]: E0219 15:25:48.422088 4810 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 19 15:25:48 crc kubenswrapper[4810]: E0219 15:25:48.422141 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4898d4eb-d474-44bc-9a38-e36f300d132f-cert podName:4898d4eb-d474-44bc-9a38-e36f300d132f nodeName:}" failed. No retries permitted until 2026-02-19 15:25:48.922124904 +0000 UTC m=+978.404155028 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4898d4eb-d474-44bc-9a38-e36f300d132f-cert") pod "infra-operator-controller-manager-79d975b745-2kkhl" (UID: "4898d4eb-d474-44bc-9a38-e36f300d132f") : secret "infra-operator-webhook-server-cert" not found
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.428599 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-jxmt5"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.450560 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-6fqmd"]
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.451378 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6fqmd"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.460376 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-l67cq"]
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.461273 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-l67cq"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.467914 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-z5fb9"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.474161 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-hbjzp"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.474402 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-tfd89"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.492913 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdkd6\" (UniqueName: \"kubernetes.io/projected/f0ab3643-d267-4902-af1f-cbcbdd7e5e41-kube-api-access-mdkd6\") pod \"horizon-operator-controller-manager-5b9b8895d5-gnmlp\" (UID: \"f0ab3643-d267-4902-af1f-cbcbdd7e5e41\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-gnmlp"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.493423 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8smk\" (UniqueName: \"kubernetes.io/projected/4898d4eb-d474-44bc-9a38-e36f300d132f-kube-api-access-s8smk\") pod \"infra-operator-controller-manager-79d975b745-2kkhl\" (UID: \"4898d4eb-d474-44bc-9a38-e36f300d132f\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-2kkhl"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.510080 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjqxl\" (UniqueName: \"kubernetes.io/projected/e2942952-ce19-4053-91da-05623c954167-kube-api-access-pjqxl\") pod \"heat-operator-controller-manager-69f49c598c-ffm66\" (UID: \"e2942952-ce19-4053-91da-05623c954167\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ffm66"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.510511 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-qz68t"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.514958 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fljv7\" (UniqueName: \"kubernetes.io/projected/602535d1-0abe-471e-8409-31319af7bd4b-kube-api-access-fljv7\") pod \"ironic-operator-controller-manager-554564d7fc-nnps5\" (UID: \"602535d1-0abe-471e-8409-31319af7bd4b\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-nnps5"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.515020 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-6fqmd"]
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.525640 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ffm66"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.526663 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qslf9\" (UniqueName: \"kubernetes.io/projected/2126b31b-0444-43e4-a250-837f37d476aa-kube-api-access-qslf9\") pod \"keystone-operator-controller-manager-b4d948c87-mkfsc\" (UID: \"2126b31b-0444-43e4-a250-837f37d476aa\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mkfsc"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.526704 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnncf\" (UniqueName: \"kubernetes.io/projected/942f40af-0498-4865-99da-bdcd068ef449-kube-api-access-jnncf\") pod \"manila-operator-controller-manager-54f6768c69-vc7cw\" (UID: \"942f40af-0498-4865-99da-bdcd068ef449\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-vc7cw"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.542620 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-gnmlp"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.598816 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-nnps5"
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.628215 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-l67cq"]
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.647982 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-jjqv2"]
Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.649625 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-jjqv2" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.661856 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-p2f6x" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.662855 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2j7z\" (UniqueName: \"kubernetes.io/projected/e4a54646-39cf-4e42-9367-487ea4f7d8a4-kube-api-access-j2j7z\") pod \"mariadb-operator-controller-manager-6994f66f48-6fqmd\" (UID: \"e4a54646-39cf-4e42-9367-487ea4f7d8a4\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6fqmd" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.662907 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj652\" (UniqueName: \"kubernetes.io/projected/fef4ad67-ccc8-4b32-bfb8-38dd6aa8e07e-kube-api-access-jj652\") pod \"nova-operator-controller-manager-567668f5cf-l67cq\" (UID: \"fef4ad67-ccc8-4b32-bfb8-38dd6aa8e07e\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-l67cq" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.663026 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qslf9\" (UniqueName: \"kubernetes.io/projected/2126b31b-0444-43e4-a250-837f37d476aa-kube-api-access-qslf9\") pod \"keystone-operator-controller-manager-b4d948c87-mkfsc\" (UID: \"2126b31b-0444-43e4-a250-837f37d476aa\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mkfsc" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.664368 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnncf\" (UniqueName: \"kubernetes.io/projected/942f40af-0498-4865-99da-bdcd068ef449-kube-api-access-jnncf\") pod \"manila-operator-controller-manager-54f6768c69-vc7cw\" (UID: \"942f40af-0498-4865-99da-bdcd068ef449\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-vc7cw" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.689919 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-vcbwg"] Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.720770 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-vcbwg" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.724833 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-6ft99" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.726778 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qslf9\" (UniqueName: \"kubernetes.io/projected/2126b31b-0444-43e4-a250-837f37d476aa-kube-api-access-qslf9\") pod \"keystone-operator-controller-manager-b4d948c87-mkfsc\" (UID: \"2126b31b-0444-43e4-a250-837f37d476aa\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mkfsc" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.727007 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-mzslt" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.729806 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnncf\" (UniqueName: \"kubernetes.io/projected/942f40af-0498-4865-99da-bdcd068ef449-kube-api-access-jnncf\") pod \"manila-operator-controller-manager-54f6768c69-vc7cw\" (UID: \"942f40af-0498-4865-99da-bdcd068ef449\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-vc7cw" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.751258 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-vc7cw" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.765600 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj652\" (UniqueName: \"kubernetes.io/projected/fef4ad67-ccc8-4b32-bfb8-38dd6aa8e07e-kube-api-access-jj652\") pod \"nova-operator-controller-manager-567668f5cf-l67cq\" (UID: \"fef4ad67-ccc8-4b32-bfb8-38dd6aa8e07e\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-l67cq" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.765664 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9lgb\" (UniqueName: \"kubernetes.io/projected/3cff7e0c-86c7-4029-aefe-a7f7e8e2d76d-kube-api-access-w9lgb\") pod \"octavia-operator-controller-manager-69f8888797-vcbwg\" (UID: \"3cff7e0c-86c7-4029-aefe-a7f7e8e2d76d\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-vcbwg" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.765690 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drb5r\" (UniqueName: \"kubernetes.io/projected/4aeb7cba-c1db-4dd6-92f7-dae7bd2e3f65-kube-api-access-drb5r\") pod \"neutron-operator-controller-manager-64ddbf8bb-jjqv2\" (UID: \"4aeb7cba-c1db-4dd6-92f7-dae7bd2e3f65\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-jjqv2" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.765769 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2j7z\" (UniqueName: \"kubernetes.io/projected/e4a54646-39cf-4e42-9367-487ea4f7d8a4-kube-api-access-j2j7z\") pod \"mariadb-operator-controller-manager-6994f66f48-6fqmd\" (UID: \"e4a54646-39cf-4e42-9367-487ea4f7d8a4\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6fqmd" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.772181 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-jjqv2"] Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.780897 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-5xnwd"] Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.781753 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-5xnwd" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.791490 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj652\" (UniqueName: \"kubernetes.io/projected/fef4ad67-ccc8-4b32-bfb8-38dd6aa8e07e-kube-api-access-jj652\") pod \"nova-operator-controller-manager-567668f5cf-l67cq\" (UID: \"fef4ad67-ccc8-4b32-bfb8-38dd6aa8e07e\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-l67cq" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.791607 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt"] Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.791923 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-6b8w8" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.792741 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.796889 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.797062 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-2whfc" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.807197 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-vcbwg"] Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.819202 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2j7z\" (UniqueName: \"kubernetes.io/projected/e4a54646-39cf-4e42-9367-487ea4f7d8a4-kube-api-access-j2j7z\") pod \"mariadb-operator-controller-manager-6994f66f48-6fqmd\" (UID: \"e4a54646-39cf-4e42-9367-487ea4f7d8a4\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6fqmd" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.819640 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-5xnwd"] Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.830037 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-7tzvr"] Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.830918 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-7tzvr"] Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.830994 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7tzvr" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.840870 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-xkq2q" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.841031 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt"] Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.863807 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-t44nb"] Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.864608 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-t44nb" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.867188 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c677bdd0-7248-4b02-9ab4-035c034a976a-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt\" (UID: \"c677bdd0-7248-4b02-9ab4-035c034a976a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.867231 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmsf2\" (UniqueName: \"kubernetes.io/projected/e163eac0-ea1f-4002-9469-844240d7a44c-kube-api-access-dmsf2\") pod \"ovn-operator-controller-manager-d44cf6b75-5xnwd\" (UID: \"e163eac0-ea1f-4002-9469-844240d7a44c\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-5xnwd" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.867270 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwc4k\" (UniqueName: \"kubernetes.io/projected/9f5779a5-4cda-40dc-831d-950f97eae317-kube-api-access-bwc4k\") pod \"placement-operator-controller-manager-8497b45c89-7tzvr\" (UID: \"9f5779a5-4cda-40dc-831d-950f97eae317\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7tzvr" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.867302 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9lgb\" (UniqueName: \"kubernetes.io/projected/3cff7e0c-86c7-4029-aefe-a7f7e8e2d76d-kube-api-access-w9lgb\") pod \"octavia-operator-controller-manager-69f8888797-vcbwg\" (UID: \"3cff7e0c-86c7-4029-aefe-a7f7e8e2d76d\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-vcbwg" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.867345 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drb5r\" (UniqueName: \"kubernetes.io/projected/4aeb7cba-c1db-4dd6-92f7-dae7bd2e3f65-kube-api-access-drb5r\") pod \"neutron-operator-controller-manager-64ddbf8bb-jjqv2\" (UID: \"4aeb7cba-c1db-4dd6-92f7-dae7bd2e3f65\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-jjqv2" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.867381 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vf8f\" (UniqueName: 
\"kubernetes.io/projected/c677bdd0-7248-4b02-9ab4-035c034a976a-kube-api-access-9vf8f\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt\" (UID: \"c677bdd0-7248-4b02-9ab4-035c034a976a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.872881 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-6m778" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.889308 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-t44nb"] Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.890916 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9lgb\" (UniqueName: \"kubernetes.io/projected/3cff7e0c-86c7-4029-aefe-a7f7e8e2d76d-kube-api-access-w9lgb\") pod \"octavia-operator-controller-manager-69f8888797-vcbwg\" (UID: \"3cff7e0c-86c7-4029-aefe-a7f7e8e2d76d\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-vcbwg" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.897270 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6fqmd" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.899217 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-px9zx"] Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.900377 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-px9zx" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.903556 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-shnl6" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.915680 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-l67cq" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.923154 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drb5r\" (UniqueName: \"kubernetes.io/projected/4aeb7cba-c1db-4dd6-92f7-dae7bd2e3f65-kube-api-access-drb5r\") pod \"neutron-operator-controller-manager-64ddbf8bb-jjqv2\" (UID: \"4aeb7cba-c1db-4dd6-92f7-dae7bd2e3f65\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-jjqv2" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.923215 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-px9zx"] Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.942531 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-pw9kt"] Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.943955 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-pw9kt" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.949352 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-jlkc4" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.952551 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-798847869b-dlmvg"] Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.953448 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-798847869b-dlmvg" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.955387 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mkfsc" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.959158 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-mvh4x" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.962957 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-pw9kt"] Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.969738 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c677bdd0-7248-4b02-9ab4-035c034a976a-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt\" (UID: \"c677bdd0-7248-4b02-9ab4-035c034a976a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.969794 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmsf2\" (UniqueName: \"kubernetes.io/projected/e163eac0-ea1f-4002-9469-844240d7a44c-kube-api-access-dmsf2\") pod \"ovn-operator-controller-manager-d44cf6b75-5xnwd\" (UID: \"e163eac0-ea1f-4002-9469-844240d7a44c\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-5xnwd" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.969833 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vbnk\" (UniqueName: \"kubernetes.io/projected/9c5af548-c722-4e6b-9309-1420838257e0-kube-api-access-9vbnk\") pod \"watcher-operator-controller-manager-798847869b-dlmvg\" (UID: \"9c5af548-c722-4e6b-9309-1420838257e0\") " pod="openstack-operators/watcher-operator-controller-manager-798847869b-dlmvg" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.969860 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jvv8\" (UniqueName: \"kubernetes.io/projected/69b7e96d-bce6-4653-998e-3bf5d159ae5a-kube-api-access-4jvv8\") pod \"test-operator-controller-manager-7866795846-pw9kt\" (UID: \"69b7e96d-bce6-4653-998e-3bf5d159ae5a\") " pod="openstack-operators/test-operator-controller-manager-7866795846-pw9kt" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.969878 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwc4k\" (UniqueName: \"kubernetes.io/projected/9f5779a5-4cda-40dc-831d-950f97eae317-kube-api-access-bwc4k\") pod \"placement-operator-controller-manager-8497b45c89-7tzvr\" (UID: 
\"9f5779a5-4cda-40dc-831d-950f97eae317\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7tzvr" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.969913 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8pz6\" (UniqueName: \"kubernetes.io/projected/eaed166e-39b5-45ca-8a65-a22710d5fe37-kube-api-access-m8pz6\") pod \"telemetry-operator-controller-manager-7f45b4ff68-px9zx\" (UID: \"eaed166e-39b5-45ca-8a65-a22710d5fe37\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-px9zx" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.969946 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vf8f\" (UniqueName: \"kubernetes.io/projected/c677bdd0-7248-4b02-9ab4-035c034a976a-kube-api-access-9vf8f\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt\" (UID: \"c677bdd0-7248-4b02-9ab4-035c034a976a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt" Feb 19 15:25:48 crc kubenswrapper[4810]: E0219 15:25:48.971101 4810 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 15:25:48 crc kubenswrapper[4810]: E0219 15:25:48.971178 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c677bdd0-7248-4b02-9ab4-035c034a976a-cert podName:c677bdd0-7248-4b02-9ab4-035c034a976a nodeName:}" failed. No retries permitted until 2026-02-19 15:25:49.471158101 +0000 UTC m=+978.953188225 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c677bdd0-7248-4b02-9ab4-035c034a976a-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt" (UID: "c677bdd0-7248-4b02-9ab4-035c034a976a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.971340 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4898d4eb-d474-44bc-9a38-e36f300d132f-cert\") pod \"infra-operator-controller-manager-79d975b745-2kkhl\" (UID: \"4898d4eb-d474-44bc-9a38-e36f300d132f\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-2kkhl" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.971383 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpnwv\" (UniqueName: \"kubernetes.io/projected/aa5063d7-2358-4149-a3b9-ef2ce138faf4-kube-api-access-bpnwv\") pod \"swift-operator-controller-manager-68f46476f-t44nb\" (UID: \"aa5063d7-2358-4149-a3b9-ef2ce138faf4\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-t44nb" Feb 19 15:25:48 crc kubenswrapper[4810]: E0219 15:25:48.971690 4810 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 15:25:48 crc kubenswrapper[4810]: E0219 15:25:48.971716 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4898d4eb-d474-44bc-9a38-e36f300d132f-cert podName:4898d4eb-d474-44bc-9a38-e36f300d132f nodeName:}" failed. No retries permitted until 2026-02-19 15:25:49.971707394 +0000 UTC m=+979.453737518 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4898d4eb-d474-44bc-9a38-e36f300d132f-cert") pod "infra-operator-controller-manager-79d975b745-2kkhl" (UID: "4898d4eb-d474-44bc-9a38-e36f300d132f") : secret "infra-operator-webhook-server-cert" not found Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.978604 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-798847869b-dlmvg"] Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.993813 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-jjqv2" Feb 19 15:25:48 crc kubenswrapper[4810]: I0219 15:25:48.999403 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc"] Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.000290 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.005135 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.005556 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-rwc2r" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.005675 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.012025 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwc4k\" (UniqueName: \"kubernetes.io/projected/9f5779a5-4cda-40dc-831d-950f97eae317-kube-api-access-bwc4k\") pod \"placement-operator-controller-manager-8497b45c89-7tzvr\" (UID: \"9f5779a5-4cda-40dc-831d-950f97eae317\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7tzvr" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.014561 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc"] Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.022568 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmsf2\" (UniqueName: \"kubernetes.io/projected/e163eac0-ea1f-4002-9469-844240d7a44c-kube-api-access-dmsf2\") pod \"ovn-operator-controller-manager-d44cf6b75-5xnwd\" (UID: \"e163eac0-ea1f-4002-9469-844240d7a44c\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-5xnwd" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.025531 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vf8f\" (UniqueName: \"kubernetes.io/projected/c677bdd0-7248-4b02-9ab4-035c034a976a-kube-api-access-9vf8f\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt\" (UID: \"c677bdd0-7248-4b02-9ab4-035c034a976a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.039377 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k98c8"] Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.040405 4810 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k98c8" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.042735 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-62cjp" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.044214 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k98c8"] Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.058978 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-vcbwg" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.075789 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vbnk\" (UniqueName: \"kubernetes.io/projected/9c5af548-c722-4e6b-9309-1420838257e0-kube-api-access-9vbnk\") pod \"watcher-operator-controller-manager-798847869b-dlmvg\" (UID: \"9c5af548-c722-4e6b-9309-1420838257e0\") " pod="openstack-operators/watcher-operator-controller-manager-798847869b-dlmvg" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.075834 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-webhook-certs\") pod \"openstack-operator-controller-manager-6d464797d7-lrlqc\" (UID: \"a6f83f3c-26f4-472f-9fcd-ae8049f1819a\") " pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.075875 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jvv8\" (UniqueName: \"kubernetes.io/projected/69b7e96d-bce6-4653-998e-3bf5d159ae5a-kube-api-access-4jvv8\") pod \"test-operator-controller-manager-7866795846-pw9kt\" (UID: \"69b7e96d-bce6-4653-998e-3bf5d159ae5a\") " pod="openstack-operators/test-operator-controller-manager-7866795846-pw9kt" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.075901 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-metrics-certs\") pod \"openstack-operator-controller-manager-6d464797d7-lrlqc\" (UID: \"a6f83f3c-26f4-472f-9fcd-ae8049f1819a\") " pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.075992 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8pz6\" (UniqueName: \"kubernetes.io/projected/eaed166e-39b5-45ca-8a65-a22710d5fe37-kube-api-access-m8pz6\") pod \"telemetry-operator-controller-manager-7f45b4ff68-px9zx\" (UID: \"eaed166e-39b5-45ca-8a65-a22710d5fe37\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-px9zx" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.076066 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5xfs\" (UniqueName: \"kubernetes.io/projected/64ed590e-59b6-44c8-baee-324162d099b8-kube-api-access-b5xfs\") pod \"rabbitmq-cluster-operator-manager-668c99d594-k98c8\" (UID: \"64ed590e-59b6-44c8-baee-324162d099b8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k98c8" Feb 19 15:25:49 crc 
kubenswrapper[4810]: I0219 15:25:49.076137 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpnwv\" (UniqueName: \"kubernetes.io/projected/aa5063d7-2358-4149-a3b9-ef2ce138faf4-kube-api-access-bpnwv\") pod \"swift-operator-controller-manager-68f46476f-t44nb\" (UID: \"aa5063d7-2358-4149-a3b9-ef2ce138faf4\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-t44nb" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.076160 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbcd5\" (UniqueName: \"kubernetes.io/projected/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-kube-api-access-rbcd5\") pod \"openstack-operator-controller-manager-6d464797d7-lrlqc\" (UID: \"a6f83f3c-26f4-472f-9fcd-ae8049f1819a\") " pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.109733 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpnwv\" (UniqueName: \"kubernetes.io/projected/aa5063d7-2358-4149-a3b9-ef2ce138faf4-kube-api-access-bpnwv\") pod \"swift-operator-controller-manager-68f46476f-t44nb\" (UID: \"aa5063d7-2358-4149-a3b9-ef2ce138faf4\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-t44nb" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.112424 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vbnk\" (UniqueName: \"kubernetes.io/projected/9c5af548-c722-4e6b-9309-1420838257e0-kube-api-access-9vbnk\") pod \"watcher-operator-controller-manager-798847869b-dlmvg\" (UID: \"9c5af548-c722-4e6b-9309-1420838257e0\") " pod="openstack-operators/watcher-operator-controller-manager-798847869b-dlmvg" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.114133 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jvv8\" (UniqueName: \"kubernetes.io/projected/69b7e96d-bce6-4653-998e-3bf5d159ae5a-kube-api-access-4jvv8\") pod \"test-operator-controller-manager-7866795846-pw9kt\" (UID: \"69b7e96d-bce6-4653-998e-3bf5d159ae5a\") " pod="openstack-operators/test-operator-controller-manager-7866795846-pw9kt" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.114482 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8pz6\" (UniqueName: \"kubernetes.io/projected/eaed166e-39b5-45ca-8a65-a22710d5fe37-kube-api-access-m8pz6\") pod \"telemetry-operator-controller-manager-7f45b4ff68-px9zx\" (UID: \"eaed166e-39b5-45ca-8a65-a22710d5fe37\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-px9zx" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.131412 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-5xnwd" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.171792 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-z5fb9"] Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.181258 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbcd5\" (UniqueName: \"kubernetes.io/projected/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-kube-api-access-rbcd5\") pod \"openstack-operator-controller-manager-6d464797d7-lrlqc\" (UID: \"a6f83f3c-26f4-472f-9fcd-ae8049f1819a\") " pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.181378 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-webhook-certs\") pod \"openstack-operator-controller-manager-6d464797d7-lrlqc\" (UID: \"a6f83f3c-26f4-472f-9fcd-ae8049f1819a\") " pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.181419 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-metrics-certs\") pod \"openstack-operator-controller-manager-6d464797d7-lrlqc\" (UID: \"a6f83f3c-26f4-472f-9fcd-ae8049f1819a\") " pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.181473 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5xfs\" (UniqueName: \"kubernetes.io/projected/64ed590e-59b6-44c8-baee-324162d099b8-kube-api-access-b5xfs\") pod \"rabbitmq-cluster-operator-manager-668c99d594-k98c8\" (UID: \"64ed590e-59b6-44c8-baee-324162d099b8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k98c8" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.181976 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7tzvr" Feb 19 15:25:49 crc kubenswrapper[4810]: E0219 15:25:49.182637 4810 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 15:25:49 crc kubenswrapper[4810]: E0219 15:25:49.182681 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-webhook-certs podName:a6f83f3c-26f4-472f-9fcd-ae8049f1819a nodeName:}" failed. No retries permitted until 2026-02-19 15:25:49.682663475 +0000 UTC m=+979.164693599 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-webhook-certs") pod "openstack-operator-controller-manager-6d464797d7-lrlqc" (UID: "a6f83f3c-26f4-472f-9fcd-ae8049f1819a") : secret "webhook-server-cert" not found Feb 19 15:25:49 crc kubenswrapper[4810]: E0219 15:25:49.182855 4810 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 15:25:49 crc kubenswrapper[4810]: E0219 15:25:49.182882 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-metrics-certs podName:a6f83f3c-26f4-472f-9fcd-ae8049f1819a nodeName:}" failed. No retries permitted until 2026-02-19 15:25:49.6828738 +0000 UTC m=+979.164903924 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-metrics-certs") pod "openstack-operator-controller-manager-6d464797d7-lrlqc" (UID: "a6f83f3c-26f4-472f-9fcd-ae8049f1819a") : secret "metrics-server-cert" not found Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.207186 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbcd5\" (UniqueName: \"kubernetes.io/projected/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-kube-api-access-rbcd5\") pod \"openstack-operator-controller-manager-6d464797d7-lrlqc\" (UID: \"a6f83f3c-26f4-472f-9fcd-ae8049f1819a\") " pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.207515 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5xfs\" (UniqueName: \"kubernetes.io/projected/64ed590e-59b6-44c8-baee-324162d099b8-kube-api-access-b5xfs\") pod \"rabbitmq-cluster-operator-manager-668c99d594-k98c8\" (UID: \"64ed590e-59b6-44c8-baee-324162d099b8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k98c8" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.210078 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-t44nb" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.243700 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-px9zx" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.325211 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.346452 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-pw9kt" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.367242 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-798847869b-dlmvg" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.472639 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k98c8" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.503291 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c677bdd0-7248-4b02-9ab4-035c034a976a-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt\" (UID: \"c677bdd0-7248-4b02-9ab4-035c034a976a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt" Feb 19 15:25:49 crc kubenswrapper[4810]: E0219 15:25:49.503562 4810 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 15:25:49 crc kubenswrapper[4810]: E0219 15:25:49.503633 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c677bdd0-7248-4b02-9ab4-035c034a976a-cert podName:c677bdd0-7248-4b02-9ab4-035c034a976a nodeName:}" failed. No retries permitted until 2026-02-19 15:25:50.503614792 +0000 UTC m=+979.985644916 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c677bdd0-7248-4b02-9ab4-035c034a976a-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt" (UID: "c677bdd0-7248-4b02-9ab4-035c034a976a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.537488 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.537551 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.537601 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.538218 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6c88e0127771a4aa28c6261d9a83da29a3f930023146271a9d942e738f8152ff"} pod="openshift-machine-config-operator/machine-config-daemon-t499d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.538287 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" containerID="cri-o://6c88e0127771a4aa28c6261d9a83da29a3f930023146271a9d942e738f8152ff" gracePeriod=600 Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.709073 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-webhook-certs\") pod 
\"openstack-operator-controller-manager-6d464797d7-lrlqc\" (UID: \"a6f83f3c-26f4-472f-9fcd-ae8049f1819a\") " pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.709158 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-metrics-certs\") pod \"openstack-operator-controller-manager-6d464797d7-lrlqc\" (UID: \"a6f83f3c-26f4-472f-9fcd-ae8049f1819a\") " pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" Feb 19 15:25:49 crc kubenswrapper[4810]: E0219 15:25:49.709275 4810 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 15:25:49 crc kubenswrapper[4810]: E0219 15:25:49.709342 4810 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 15:25:49 crc kubenswrapper[4810]: E0219 15:25:49.709356 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-webhook-certs podName:a6f83f3c-26f4-472f-9fcd-ae8049f1819a nodeName:}" failed. No retries permitted until 2026-02-19 15:25:50.709336325 +0000 UTC m=+980.191366439 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-webhook-certs") pod "openstack-operator-controller-manager-6d464797d7-lrlqc" (UID: "a6f83f3c-26f4-472f-9fcd-ae8049f1819a") : secret "webhook-server-cert" not found Feb 19 15:25:49 crc kubenswrapper[4810]: E0219 15:25:49.709400 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-metrics-certs podName:a6f83f3c-26f4-472f-9fcd-ae8049f1819a nodeName:}" failed. No retries permitted until 2026-02-19 15:25:50.709382916 +0000 UTC m=+980.191413160 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-metrics-certs") pod "openstack-operator-controller-manager-6d464797d7-lrlqc" (UID: "a6f83f3c-26f4-472f-9fcd-ae8049f1819a") : secret "metrics-server-cert" not found Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.921417 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-gnmlp"] Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.921902 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gx7qf" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.921917 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gx7qf" Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.940042 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-jxmt5"] Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.969794 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-qz68t"] Feb 19 15:25:49 crc kubenswrapper[4810]: I0219 15:25:49.974871 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-ffm66"] Feb 19 15:25:49 crc kubenswrapper[4810]: W0219 15:25:49.983602 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1217b757_0f1c_4c4e_9abe_55875992915d.slice/crio-e927dcfa354e3551ae1e703ad4c42cb53f27b54cd823cc4b65323cbfec627b59 WatchSource:0}: Error finding container e927dcfa354e3551ae1e703ad4c42cb53f27b54cd823cc4b65323cbfec627b59: Status 404 returned error can't find the container with id e927dcfa354e3551ae1e703ad4c42cb53f27b54cd823cc4b65323cbfec627b59 Feb 19 15:25:50 crc kubenswrapper[4810]: W0219 15:25:50.002408 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2106e7b5_bb83_464a_a43f_943f22b55078.slice/crio-8142761ba1796805ba5943624af056a3f22412cc0742809da1890e37bd53b992 WatchSource:0}: Error finding container 8142761ba1796805ba5943624af056a3f22412cc0742809da1890e37bd53b992: Status 404 returned error can't find the container with id 8142761ba1796805ba5943624af056a3f22412cc0742809da1890e37bd53b992 Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.004729 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gx7qf" Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.019019 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4898d4eb-d474-44bc-9a38-e36f300d132f-cert\") pod \"infra-operator-controller-manager-79d975b745-2kkhl\" (UID: \"4898d4eb-d474-44bc-9a38-e36f300d132f\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-2kkhl" Feb 19 15:25:50 crc kubenswrapper[4810]: E0219 15:25:50.019852 4810 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 15:25:50 crc kubenswrapper[4810]: E0219 15:25:50.019895 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4898d4eb-d474-44bc-9a38-e36f300d132f-cert 
podName:4898d4eb-d474-44bc-9a38-e36f300d132f nodeName:}" failed. No retries permitted until 2026-02-19 15:25:52.019881257 +0000 UTC m=+981.501911381 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4898d4eb-d474-44bc-9a38-e36f300d132f-cert") pod "infra-operator-controller-manager-79d975b745-2kkhl" (UID: "4898d4eb-d474-44bc-9a38-e36f300d132f") : secret "infra-operator-webhook-server-cert" not found Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.161692 4810 generic.go:334] "Generic (PLEG): container finished" podID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerID="6c88e0127771a4aa28c6261d9a83da29a3f930023146271a9d942e738f8152ff" exitCode=0 Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.161769 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerDied","Data":"6c88e0127771a4aa28c6261d9a83da29a3f930023146271a9d942e738f8152ff"} Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.161797 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerStarted","Data":"37fe95e370faa9fca4a69499713730a8ba7e7939f57cd237ea9a505f9b09a6bf"} Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.161813 4810 scope.go:117] "RemoveContainer" containerID="946b41284ed03248aebd830c7fb80426be59078e4ea2a93cd09930514fedec98" Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.165252 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ffm66" event={"ID":"e2942952-ce19-4053-91da-05623c954167","Type":"ContainerStarted","Data":"4b5dde9b2f2c3ebd501b53c61f111850cace1531e9983fd3cf0b82595113eb42"} Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.167538 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-gnmlp" event={"ID":"f0ab3643-d267-4902-af1f-cbcbdd7e5e41","Type":"ContainerStarted","Data":"32d8aeaba9ee2ad0c0ef0e695b47b2c106e399b015e2a40387b717f698edf72c"} Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.169050 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-qz68t" event={"ID":"2106e7b5-bb83-464a-a43f-943f22b55078","Type":"ContainerStarted","Data":"8142761ba1796805ba5943624af056a3f22412cc0742809da1890e37bd53b992"} Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.170171 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-jxmt5" event={"ID":"1217b757-0f1c-4c4e-9abe-55875992915d","Type":"ContainerStarted","Data":"e927dcfa354e3551ae1e703ad4c42cb53f27b54cd823cc4b65323cbfec627b59"} Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.172924 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-z5fb9" event={"ID":"52bb990c-eff0-4673-be27-d55d433bef0d","Type":"ContainerStarted","Data":"0440702e1248976a58eed4017e3d4c545485a092c7440f0bde7e0efe97d4256a"} Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.173463 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-vc7cw"] Feb 19 15:25:50 crc kubenswrapper[4810]: 
I0219 15:25:50.181247 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-mzslt"] Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.189724 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-nnps5"] Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.257667 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gx7qf" Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.328906 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-jjqv2"] Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.362508 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gx7qf"] Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.498478 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-px9zx"] Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.509306 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-5xnwd"] Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.517915 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-l67cq"] Feb 19 15:25:50 crc kubenswrapper[4810]: W0219 15:25:50.518085 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaed166e_39b5_45ca_8a65_a22710d5fe37.slice/crio-8c2e63627629d8e721a1e357e303b313dc5f495a90ce248f16c8662b394c1381 WatchSource:0}: Error finding container 8c2e63627629d8e721a1e357e303b313dc5f495a90ce248f16c8662b394c1381: Status 404 returned error can't find the container with id 8c2e63627629d8e721a1e357e303b313dc5f495a90ce248f16c8662b394c1381 Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.529519 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c677bdd0-7248-4b02-9ab4-035c034a976a-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt\" (UID: \"c677bdd0-7248-4b02-9ab4-035c034a976a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt" Feb 19 15:25:50 crc kubenswrapper[4810]: E0219 15:25:50.529648 4810 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 15:25:50 crc kubenswrapper[4810]: E0219 15:25:50.529693 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c677bdd0-7248-4b02-9ab4-035c034a976a-cert podName:c677bdd0-7248-4b02-9ab4-035c034a976a nodeName:}" failed. No retries permitted until 2026-02-19 15:25:52.529678483 +0000 UTC m=+982.011708607 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c677bdd0-7248-4b02-9ab4-035c034a976a-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt" (UID: "c677bdd0-7248-4b02-9ab4-035c034a976a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.535003 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-mkfsc"] Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.557239 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-6fqmd"] Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.581376 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-vcbwg"] Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.586716 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-t44nb"] Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.595627 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-pw9kt"] Feb 19 15:25:50 crc kubenswrapper[4810]: E0219 15:25:50.599626 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qslf9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b4d948c87-mkfsc_openstack-operators(2126b31b-0444-43e4-a250-837f37d476aa): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 15:25:50 crc kubenswrapper[4810]: E0219 15:25:50.599863 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j2j7z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6994f66f48-6fqmd_openstack-operators(e4a54646-39cf-4e42-9367-487ea4f7d8a4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 15:25:50 crc 
kubenswrapper[4810]: E0219 15:25:50.600832 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mkfsc" podUID="2126b31b-0444-43e4-a250-837f37d476aa" Feb 19 15:25:50 crc kubenswrapper[4810]: E0219 15:25:50.600923 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6fqmd" podUID="e4a54646-39cf-4e42-9367-487ea4f7d8a4" Feb 19 15:25:50 crc kubenswrapper[4810]: E0219 15:25:50.612857 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4jvv8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7866795846-pw9kt_openstack-operators(69b7e96d-bce6-4653-998e-3bf5d159ae5a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 15:25:50 crc kubenswrapper[4810]: E0219 15:25:50.614477 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7866795846-pw9kt" podUID="69b7e96d-bce6-4653-998e-3bf5d159ae5a" Feb 19 15:25:50 crc 
kubenswrapper[4810]: I0219 15:25:50.632403 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k98c8"] Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.638429 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-798847869b-dlmvg"] Feb 19 15:25:50 crc kubenswrapper[4810]: E0219 15:25:50.644870 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b5xfs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-k98c8_openstack-operators(64ed590e-59b6-44c8-baee-324162d099b8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.644904 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-7tzvr"] Feb 19 15:25:50 crc kubenswrapper[4810]: E0219 15:25:50.644990 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: 
{{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bwc4k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8497b45c89-7tzvr_openstack-operators(9f5779a5-4cda-40dc-831d-950f97eae317): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 15:25:50 crc kubenswrapper[4810]: E0219 15:25:50.651527 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7tzvr" podUID="9f5779a5-4cda-40dc-831d-950f97eae317" Feb 19 15:25:50 crc kubenswrapper[4810]: E0219 15:25:50.651940 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k98c8" podUID="64ed590e-59b6-44c8-baee-324162d099b8" Feb 19 15:25:50 crc kubenswrapper[4810]: E0219 15:25:50.654089 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.159:5001/openstack-k8s-operators/watcher-operator:eaf82eeed7c641cca4b0e467ff9bfd7468ff8986,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9vbnk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-798847869b-dlmvg_openstack-operators(9c5af548-c722-4e6b-9309-1420838257e0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 15:25:50 crc kubenswrapper[4810]: E0219 15:25:50.664682 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-798847869b-dlmvg" podUID="9c5af548-c722-4e6b-9309-1420838257e0" Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.738304 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-webhook-certs\") pod \"openstack-operator-controller-manager-6d464797d7-lrlqc\" (UID: \"a6f83f3c-26f4-472f-9fcd-ae8049f1819a\") " pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" Feb 19 15:25:50 crc kubenswrapper[4810]: I0219 15:25:50.738630 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-metrics-certs\") pod \"openstack-operator-controller-manager-6d464797d7-lrlqc\" (UID: \"a6f83f3c-26f4-472f-9fcd-ae8049f1819a\") " pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" Feb 19 15:25:50 crc kubenswrapper[4810]: E0219 15:25:50.738484 4810 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 15:25:50 crc kubenswrapper[4810]: E0219 15:25:50.738931 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-webhook-certs podName:a6f83f3c-26f4-472f-9fcd-ae8049f1819a nodeName:}" failed. No retries permitted until 2026-02-19 15:25:52.738897391 +0000 UTC m=+982.220927515 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-webhook-certs") pod "openstack-operator-controller-manager-6d464797d7-lrlqc" (UID: "a6f83f3c-26f4-472f-9fcd-ae8049f1819a") : secret "webhook-server-cert" not found Feb 19 15:25:50 crc kubenswrapper[4810]: E0219 15:25:50.738738 4810 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 15:25:50 crc kubenswrapper[4810]: E0219 15:25:50.739034 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-metrics-certs podName:a6f83f3c-26f4-472f-9fcd-ae8049f1819a nodeName:}" failed. No retries permitted until 2026-02-19 15:25:52.739003224 +0000 UTC m=+982.221033348 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-metrics-certs") pod "openstack-operator-controller-manager-6d464797d7-lrlqc" (UID: "a6f83f3c-26f4-472f-9fcd-ae8049f1819a") : secret "metrics-server-cert" not found Feb 19 15:25:51 crc kubenswrapper[4810]: I0219 15:25:51.182890 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-vc7cw" event={"ID":"942f40af-0498-4865-99da-bdcd068ef449","Type":"ContainerStarted","Data":"3f797ad07e482ce11b7c40c6e488a42e9ea2aefe5141155ec3794b7d4d2a5a8c"} Feb 19 15:25:51 crc kubenswrapper[4810]: I0219 15:25:51.186371 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-798847869b-dlmvg" event={"ID":"9c5af548-c722-4e6b-9309-1420838257e0","Type":"ContainerStarted","Data":"0e77b12bf32788aa02d3e12d5d6283c778d943ced5271ce7d019bc9c50adb4a8"} Feb 19 15:25:51 crc kubenswrapper[4810]: E0219 15:25:51.188481 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.159:5001/openstack-k8s-operators/watcher-operator:eaf82eeed7c641cca4b0e467ff9bfd7468ff8986\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-798847869b-dlmvg" podUID="9c5af548-c722-4e6b-9309-1420838257e0" Feb 19 15:25:51 crc kubenswrapper[4810]: I0219 15:25:51.189438 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-nnps5" event={"ID":"602535d1-0abe-471e-8409-31319af7bd4b","Type":"ContainerStarted","Data":"881a33f534ffe0348a7a3429879983d7cc53f438504535993993d815d773a667"} Feb 19 15:25:51 crc kubenswrapper[4810]: I0219 15:25:51.191290 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-5xnwd" event={"ID":"e163eac0-ea1f-4002-9469-844240d7a44c","Type":"ContainerStarted","Data":"3ffbe6cbb668d6a79a8bfff03336b81154cc1684b69a73b1785f43fe8919f5be"} Feb 19 15:25:51 crc kubenswrapper[4810]: I0219 15:25:51.192811 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-mzslt" event={"ID":"91002269-9fe0-44d2-9dbd-9e4cf58274bf","Type":"ContainerStarted","Data":"092e3ea171bf1a38e58c0bd2fa8536b28b188efc4ad52027f6615bb2b32a399a"} Feb 19 15:25:51 crc kubenswrapper[4810]: I0219 15:25:51.193942 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-l67cq" 
event={"ID":"fef4ad67-ccc8-4b32-bfb8-38dd6aa8e07e","Type":"ContainerStarted","Data":"ef92e7f14f62a52fd19b2c102fce3b203f36bb9c4eee9a1536a90b2df20fa0ea"} Feb 19 15:25:51 crc kubenswrapper[4810]: I0219 15:25:51.194820 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k98c8" event={"ID":"64ed590e-59b6-44c8-baee-324162d099b8","Type":"ContainerStarted","Data":"5551bf3b1a44fbda301ac3a339c869610be99e0a1b318b669ea8c1f969221dc6"} Feb 19 15:25:51 crc kubenswrapper[4810]: E0219 15:25:51.195912 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k98c8" podUID="64ed590e-59b6-44c8-baee-324162d099b8" Feb 19 15:25:51 crc kubenswrapper[4810]: I0219 15:25:51.196394 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-t44nb" event={"ID":"aa5063d7-2358-4149-a3b9-ef2ce138faf4","Type":"ContainerStarted","Data":"3d1135f0c722109ff280553923d9cbc96e6a38aa71e2d08dafb98985306203ff"} Feb 19 15:25:51 crc kubenswrapper[4810]: I0219 15:25:51.205532 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-vcbwg" event={"ID":"3cff7e0c-86c7-4029-aefe-a7f7e8e2d76d","Type":"ContainerStarted","Data":"5e33d80ac78489e01a110a17e7af94af4091b7ba46319e949ba8900d9738fc32"} Feb 19 15:25:51 crc kubenswrapper[4810]: I0219 15:25:51.208443 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7tzvr" event={"ID":"9f5779a5-4cda-40dc-831d-950f97eae317","Type":"ContainerStarted","Data":"c147950560919e5e5a88c8ec7a2c488fe3860e63e024634b661022f3822f5035"} Feb 19 15:25:51 crc kubenswrapper[4810]: E0219 15:25:51.214427 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7tzvr" podUID="9f5779a5-4cda-40dc-831d-950f97eae317" Feb 19 15:25:51 crc kubenswrapper[4810]: I0219 15:25:51.219444 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-px9zx" event={"ID":"eaed166e-39b5-45ca-8a65-a22710d5fe37","Type":"ContainerStarted","Data":"8c2e63627629d8e721a1e357e303b313dc5f495a90ce248f16c8662b394c1381"} Feb 19 15:25:51 crc kubenswrapper[4810]: I0219 15:25:51.225264 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-jjqv2" event={"ID":"4aeb7cba-c1db-4dd6-92f7-dae7bd2e3f65","Type":"ContainerStarted","Data":"d84dc9bc93636cd29ea80b1f2ad3891e4865821023bb19dd507c583b13389a5c"} Feb 19 15:25:51 crc kubenswrapper[4810]: I0219 15:25:51.237302 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mkfsc" event={"ID":"2126b31b-0444-43e4-a250-837f37d476aa","Type":"ContainerStarted","Data":"466e59eac13e04b3cf318adee9db4576b2c7c1eb6e6a6f7837da7664ae9ff9cf"} 
Feb 19 15:25:51 crc kubenswrapper[4810]: E0219 15:25:51.240345 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mkfsc" podUID="2126b31b-0444-43e4-a250-837f37d476aa" Feb 19 15:25:51 crc kubenswrapper[4810]: I0219 15:25:51.241944 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6fqmd" event={"ID":"e4a54646-39cf-4e42-9367-487ea4f7d8a4","Type":"ContainerStarted","Data":"b89b1c622c6e37cace90f517d6acafdefbc800bc06d9777043b717525f34ebd6"} Feb 19 15:25:51 crc kubenswrapper[4810]: E0219 15:25:51.243512 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6fqmd" podUID="e4a54646-39cf-4e42-9367-487ea4f7d8a4" Feb 19 15:25:51 crc kubenswrapper[4810]: I0219 15:25:51.245520 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-pw9kt" event={"ID":"69b7e96d-bce6-4653-998e-3bf5d159ae5a","Type":"ContainerStarted","Data":"be03b5fdf3aec4df166c21467084dad324575f6f427865e00b905c6b2b049090"} Feb 19 15:25:51 crc kubenswrapper[4810]: E0219 15:25:51.255878 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-pw9kt" podUID="69b7e96d-bce6-4653-998e-3bf5d159ae5a" Feb 19 15:25:52 crc kubenswrapper[4810]: I0219 15:25:52.061127 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4898d4eb-d474-44bc-9a38-e36f300d132f-cert\") pod \"infra-operator-controller-manager-79d975b745-2kkhl\" (UID: \"4898d4eb-d474-44bc-9a38-e36f300d132f\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-2kkhl" Feb 19 15:25:52 crc kubenswrapper[4810]: E0219 15:25:52.061400 4810 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 15:25:52 crc kubenswrapper[4810]: E0219 15:25:52.061664 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4898d4eb-d474-44bc-9a38-e36f300d132f-cert podName:4898d4eb-d474-44bc-9a38-e36f300d132f nodeName:}" failed. No retries permitted until 2026-02-19 15:25:56.061525481 +0000 UTC m=+985.543555605 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4898d4eb-d474-44bc-9a38-e36f300d132f-cert") pod "infra-operator-controller-manager-79d975b745-2kkhl" (UID: "4898d4eb-d474-44bc-9a38-e36f300d132f") : secret "infra-operator-webhook-server-cert" not found Feb 19 15:25:52 crc kubenswrapper[4810]: I0219 15:25:52.261735 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gx7qf" podUID="1715cc96-a86a-40d8-8f5a-1a4f35129bd1" containerName="registry-server" containerID="cri-o://5cab9d89ebbb4715343c797d52130d009456c8d3d9eaf887c80836933b581c07" gracePeriod=2 Feb 19 15:25:52 crc kubenswrapper[4810]: E0219 15:25:52.296070 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6fqmd" podUID="e4a54646-39cf-4e42-9367-487ea4f7d8a4" Feb 19 15:25:52 crc kubenswrapper[4810]: E0219 15:25:52.296118 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mkfsc" podUID="2126b31b-0444-43e4-a250-837f37d476aa" Feb 19 15:25:52 crc kubenswrapper[4810]: E0219 15:25:52.296228 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.159:5001/openstack-k8s-operators/watcher-operator:eaf82eeed7c641cca4b0e467ff9bfd7468ff8986\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-798847869b-dlmvg" podUID="9c5af548-c722-4e6b-9309-1420838257e0" Feb 19 15:25:52 crc kubenswrapper[4810]: E0219 15:25:52.296278 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k98c8" podUID="64ed590e-59b6-44c8-baee-324162d099b8" Feb 19 15:25:52 crc kubenswrapper[4810]: E0219 15:25:52.296314 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-pw9kt" podUID="69b7e96d-bce6-4653-998e-3bf5d159ae5a" Feb 19 15:25:52 crc kubenswrapper[4810]: E0219 15:25:52.296366 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7tzvr" podUID="9f5779a5-4cda-40dc-831d-950f97eae317" Feb 19 15:25:52 crc kubenswrapper[4810]: I0219 15:25:52.569849 
4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c677bdd0-7248-4b02-9ab4-035c034a976a-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt\" (UID: \"c677bdd0-7248-4b02-9ab4-035c034a976a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt" Feb 19 15:25:52 crc kubenswrapper[4810]: E0219 15:25:52.570035 4810 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 15:25:52 crc kubenswrapper[4810]: E0219 15:25:52.570080 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c677bdd0-7248-4b02-9ab4-035c034a976a-cert podName:c677bdd0-7248-4b02-9ab4-035c034a976a nodeName:}" failed. No retries permitted until 2026-02-19 15:25:56.570067136 +0000 UTC m=+986.052097260 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c677bdd0-7248-4b02-9ab4-035c034a976a-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt" (UID: "c677bdd0-7248-4b02-9ab4-035c034a976a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 15:25:52 crc kubenswrapper[4810]: I0219 15:25:52.776156 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-webhook-certs\") pod \"openstack-operator-controller-manager-6d464797d7-lrlqc\" (UID: \"a6f83f3c-26f4-472f-9fcd-ae8049f1819a\") " pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" Feb 19 15:25:52 crc kubenswrapper[4810]: I0219 15:25:52.776220 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-metrics-certs\") pod \"openstack-operator-controller-manager-6d464797d7-lrlqc\" (UID: \"a6f83f3c-26f4-472f-9fcd-ae8049f1819a\") " pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" Feb 19 15:25:52 crc kubenswrapper[4810]: E0219 15:25:52.776266 4810 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 15:25:52 crc kubenswrapper[4810]: E0219 15:25:52.776315 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-webhook-certs podName:a6f83f3c-26f4-472f-9fcd-ae8049f1819a nodeName:}" failed. No retries permitted until 2026-02-19 15:25:56.776300891 +0000 UTC m=+986.258331005 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-webhook-certs") pod "openstack-operator-controller-manager-6d464797d7-lrlqc" (UID: "a6f83f3c-26f4-472f-9fcd-ae8049f1819a") : secret "webhook-server-cert" not found Feb 19 15:25:52 crc kubenswrapper[4810]: E0219 15:25:52.776477 4810 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 15:25:52 crc kubenswrapper[4810]: E0219 15:25:52.776549 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-metrics-certs podName:a6f83f3c-26f4-472f-9fcd-ae8049f1819a nodeName:}" failed. 
No retries permitted until 2026-02-19 15:25:56.776531007 +0000 UTC m=+986.258561131 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-metrics-certs") pod "openstack-operator-controller-manager-6d464797d7-lrlqc" (UID: "a6f83f3c-26f4-472f-9fcd-ae8049f1819a") : secret "metrics-server-cert" not found Feb 19 15:25:53 crc kubenswrapper[4810]: I0219 15:25:53.285823 4810 generic.go:334] "Generic (PLEG): container finished" podID="1715cc96-a86a-40d8-8f5a-1a4f35129bd1" containerID="5cab9d89ebbb4715343c797d52130d009456c8d3d9eaf887c80836933b581c07" exitCode=0 Feb 19 15:25:53 crc kubenswrapper[4810]: I0219 15:25:53.286104 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gx7qf" event={"ID":"1715cc96-a86a-40d8-8f5a-1a4f35129bd1","Type":"ContainerDied","Data":"5cab9d89ebbb4715343c797d52130d009456c8d3d9eaf887c80836933b581c07"} Feb 19 15:25:55 crc kubenswrapper[4810]: I0219 15:25:55.303369 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gx7qf" event={"ID":"1715cc96-a86a-40d8-8f5a-1a4f35129bd1","Type":"ContainerDied","Data":"c9c6c8774d7ca4a97936ace4183e794c37a4dbb46f2dcfbc1cbf73f1bab26b9b"} Feb 19 15:25:55 crc kubenswrapper[4810]: I0219 15:25:55.304023 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9c6c8774d7ca4a97936ace4183e794c37a4dbb46f2dcfbc1cbf73f1bab26b9b" Feb 19 15:25:55 crc kubenswrapper[4810]: I0219 15:25:55.323788 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gx7qf" Feb 19 15:25:55 crc kubenswrapper[4810]: I0219 15:25:55.413093 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpjhx\" (UniqueName: \"kubernetes.io/projected/1715cc96-a86a-40d8-8f5a-1a4f35129bd1-kube-api-access-tpjhx\") pod \"1715cc96-a86a-40d8-8f5a-1a4f35129bd1\" (UID: \"1715cc96-a86a-40d8-8f5a-1a4f35129bd1\") " Feb 19 15:25:55 crc kubenswrapper[4810]: I0219 15:25:55.413179 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1715cc96-a86a-40d8-8f5a-1a4f35129bd1-utilities\") pod \"1715cc96-a86a-40d8-8f5a-1a4f35129bd1\" (UID: \"1715cc96-a86a-40d8-8f5a-1a4f35129bd1\") " Feb 19 15:25:55 crc kubenswrapper[4810]: I0219 15:25:55.413270 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1715cc96-a86a-40d8-8f5a-1a4f35129bd1-catalog-content\") pod \"1715cc96-a86a-40d8-8f5a-1a4f35129bd1\" (UID: \"1715cc96-a86a-40d8-8f5a-1a4f35129bd1\") " Feb 19 15:25:55 crc kubenswrapper[4810]: I0219 15:25:55.414297 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1715cc96-a86a-40d8-8f5a-1a4f35129bd1-utilities" (OuterVolumeSpecName: "utilities") pod "1715cc96-a86a-40d8-8f5a-1a4f35129bd1" (UID: "1715cc96-a86a-40d8-8f5a-1a4f35129bd1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:25:55 crc kubenswrapper[4810]: I0219 15:25:55.430250 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1715cc96-a86a-40d8-8f5a-1a4f35129bd1-kube-api-access-tpjhx" (OuterVolumeSpecName: "kube-api-access-tpjhx") pod "1715cc96-a86a-40d8-8f5a-1a4f35129bd1" (UID: "1715cc96-a86a-40d8-8f5a-1a4f35129bd1"). InnerVolumeSpecName "kube-api-access-tpjhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:25:55 crc kubenswrapper[4810]: I0219 15:25:55.442013 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1715cc96-a86a-40d8-8f5a-1a4f35129bd1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1715cc96-a86a-40d8-8f5a-1a4f35129bd1" (UID: "1715cc96-a86a-40d8-8f5a-1a4f35129bd1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:25:55 crc kubenswrapper[4810]: I0219 15:25:55.514777 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpjhx\" (UniqueName: \"kubernetes.io/projected/1715cc96-a86a-40d8-8f5a-1a4f35129bd1-kube-api-access-tpjhx\") on node \"crc\" DevicePath \"\"" Feb 19 15:25:55 crc kubenswrapper[4810]: I0219 15:25:55.514805 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1715cc96-a86a-40d8-8f5a-1a4f35129bd1-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 15:25:55 crc kubenswrapper[4810]: I0219 15:25:55.514814 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1715cc96-a86a-40d8-8f5a-1a4f35129bd1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 15:25:56 crc kubenswrapper[4810]: I0219 15:25:56.122900 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4898d4eb-d474-44bc-9a38-e36f300d132f-cert\") pod \"infra-operator-controller-manager-79d975b745-2kkhl\" (UID: \"4898d4eb-d474-44bc-9a38-e36f300d132f\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-2kkhl" Feb 19 15:25:56 crc kubenswrapper[4810]: E0219 15:25:56.123183 4810 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 15:25:56 crc kubenswrapper[4810]: E0219 15:25:56.123525 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4898d4eb-d474-44bc-9a38-e36f300d132f-cert podName:4898d4eb-d474-44bc-9a38-e36f300d132f nodeName:}" failed. No retries permitted until 2026-02-19 15:26:04.123488836 +0000 UTC m=+993.605519150 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4898d4eb-d474-44bc-9a38-e36f300d132f-cert") pod "infra-operator-controller-manager-79d975b745-2kkhl" (UID: "4898d4eb-d474-44bc-9a38-e36f300d132f") : secret "infra-operator-webhook-server-cert" not found Feb 19 15:25:56 crc kubenswrapper[4810]: I0219 15:25:56.309779 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gx7qf" Feb 19 15:25:56 crc kubenswrapper[4810]: I0219 15:25:56.336887 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gx7qf"] Feb 19 15:25:56 crc kubenswrapper[4810]: I0219 15:25:56.350259 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gx7qf"] Feb 19 15:25:56 crc kubenswrapper[4810]: I0219 15:25:56.631061 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c677bdd0-7248-4b02-9ab4-035c034a976a-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt\" (UID: \"c677bdd0-7248-4b02-9ab4-035c034a976a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt" Feb 19 15:25:56 crc kubenswrapper[4810]: E0219 15:25:56.631258 4810 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 15:25:56 crc kubenswrapper[4810]: E0219 15:25:56.631353 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c677bdd0-7248-4b02-9ab4-035c034a976a-cert podName:c677bdd0-7248-4b02-9ab4-035c034a976a nodeName:}" failed. No retries permitted until 2026-02-19 15:26:04.631336414 +0000 UTC m=+994.113366538 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c677bdd0-7248-4b02-9ab4-035c034a976a-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt" (UID: "c677bdd0-7248-4b02-9ab4-035c034a976a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 15:25:56 crc kubenswrapper[4810]: I0219 15:25:56.833717 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-metrics-certs\") pod \"openstack-operator-controller-manager-6d464797d7-lrlqc\" (UID: \"a6f83f3c-26f4-472f-9fcd-ae8049f1819a\") " pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" Feb 19 15:25:56 crc kubenswrapper[4810]: I0219 15:25:56.833906 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-webhook-certs\") pod \"openstack-operator-controller-manager-6d464797d7-lrlqc\" (UID: \"a6f83f3c-26f4-472f-9fcd-ae8049f1819a\") " pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" Feb 19 15:25:56 crc kubenswrapper[4810]: E0219 15:25:56.834035 4810 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 15:25:56 crc kubenswrapper[4810]: E0219 15:25:56.834094 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-webhook-certs podName:a6f83f3c-26f4-472f-9fcd-ae8049f1819a nodeName:}" failed. No retries permitted until 2026-02-19 15:26:04.834077183 +0000 UTC m=+994.316107317 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-webhook-certs") pod "openstack-operator-controller-manager-6d464797d7-lrlqc" (UID: "a6f83f3c-26f4-472f-9fcd-ae8049f1819a") : secret "webhook-server-cert" not found Feb 19 15:25:56 crc kubenswrapper[4810]: E0219 15:25:56.834621 4810 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 15:25:56 crc kubenswrapper[4810]: E0219 15:25:56.834747 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-metrics-certs podName:a6f83f3c-26f4-472f-9fcd-ae8049f1819a nodeName:}" failed. No retries permitted until 2026-02-19 15:26:04.834719419 +0000 UTC m=+994.316749583 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-metrics-certs") pod "openstack-operator-controller-manager-6d464797d7-lrlqc" (UID: "a6f83f3c-26f4-472f-9fcd-ae8049f1819a") : secret "metrics-server-cert" not found Feb 19 15:25:57 crc kubenswrapper[4810]: I0219 15:25:57.448108 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1715cc96-a86a-40d8-8f5a-1a4f35129bd1" path="/var/lib/kubelet/pods/1715cc96-a86a-40d8-8f5a-1a4f35129bd1/volumes" Feb 19 15:26:03 crc kubenswrapper[4810]: E0219 15:26:03.404790 4810 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759" Feb 19 15:26:03 crc kubenswrapper[4810]: E0219 15:26:03.405730 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dmsf2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-d44cf6b75-5xnwd_openstack-operators(e163eac0-ea1f-4002-9469-844240d7a44c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 15:26:03 crc kubenswrapper[4810]: E0219 15:26:03.406864 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-5xnwd" podUID="e163eac0-ea1f-4002-9469-844240d7a44c" Feb 19 15:26:03 crc kubenswrapper[4810]: E0219 15:26:03.982654 4810 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf" Feb 19 15:26:03 crc kubenswrapper[4810]: E0219 15:26:03.982861 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-drb5r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-64ddbf8bb-jjqv2_openstack-operators(4aeb7cba-c1db-4dd6-92f7-dae7bd2e3f65): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 15:26:03 crc kubenswrapper[4810]: E0219 15:26:03.984162 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-jjqv2" podUID="4aeb7cba-c1db-4dd6-92f7-dae7bd2e3f65" Feb 19 15:26:04 crc kubenswrapper[4810]: I0219 15:26:04.151409 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4898d4eb-d474-44bc-9a38-e36f300d132f-cert\") pod \"infra-operator-controller-manager-79d975b745-2kkhl\" (UID: \"4898d4eb-d474-44bc-9a38-e36f300d132f\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-2kkhl" Feb 19 15:26:04 crc kubenswrapper[4810]: E0219 15:26:04.151573 4810 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 15:26:04 crc kubenswrapper[4810]: E0219 15:26:04.151770 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4898d4eb-d474-44bc-9a38-e36f300d132f-cert podName:4898d4eb-d474-44bc-9a38-e36f300d132f nodeName:}" failed. No retries permitted until 2026-02-19 15:26:20.151750481 +0000 UTC m=+1009.633780605 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4898d4eb-d474-44bc-9a38-e36f300d132f-cert") pod "infra-operator-controller-manager-79d975b745-2kkhl" (UID: "4898d4eb-d474-44bc-9a38-e36f300d132f") : secret "infra-operator-webhook-server-cert" not found Feb 19 15:26:04 crc kubenswrapper[4810]: E0219 15:26:04.381160 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-5xnwd" podUID="e163eac0-ea1f-4002-9469-844240d7a44c" Feb 19 15:26:04 crc kubenswrapper[4810]: E0219 15:26:04.381450 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-jjqv2" podUID="4aeb7cba-c1db-4dd6-92f7-dae7bd2e3f65" Feb 19 15:26:04 crc kubenswrapper[4810]: I0219 15:26:04.659259 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c677bdd0-7248-4b02-9ab4-035c034a976a-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt\" (UID: \"c677bdd0-7248-4b02-9ab4-035c034a976a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt" Feb 19 15:26:04 crc kubenswrapper[4810]: I0219 15:26:04.665899 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c677bdd0-7248-4b02-9ab4-035c034a976a-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt\" (UID: \"c677bdd0-7248-4b02-9ab4-035c034a976a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt" Feb 19 15:26:04 crc kubenswrapper[4810]: I0219 15:26:04.758368 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt" Feb 19 15:26:04 crc kubenswrapper[4810]: I0219 15:26:04.861983 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-webhook-certs\") pod \"openstack-operator-controller-manager-6d464797d7-lrlqc\" (UID: \"a6f83f3c-26f4-472f-9fcd-ae8049f1819a\") " pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" Feb 19 15:26:04 crc kubenswrapper[4810]: I0219 15:26:04.862233 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-metrics-certs\") pod \"openstack-operator-controller-manager-6d464797d7-lrlqc\" (UID: \"a6f83f3c-26f4-472f-9fcd-ae8049f1819a\") " pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" Feb 19 15:26:04 crc kubenswrapper[4810]: E0219 15:26:04.862427 4810 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 15:26:04 crc kubenswrapper[4810]: E0219 15:26:04.862445 4810 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 15:26:04 crc kubenswrapper[4810]: E0219 15:26:04.862521 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-metrics-certs podName:a6f83f3c-26f4-472f-9fcd-ae8049f1819a nodeName:}" failed. No retries permitted until 2026-02-19 15:26:20.862500373 +0000 UTC m=+1010.344530497 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-metrics-certs") pod "openstack-operator-controller-manager-6d464797d7-lrlqc" (UID: "a6f83f3c-26f4-472f-9fcd-ae8049f1819a") : secret "metrics-server-cert" not found Feb 19 15:26:04 crc kubenswrapper[4810]: E0219 15:26:04.862542 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-webhook-certs podName:a6f83f3c-26f4-472f-9fcd-ae8049f1819a nodeName:}" failed. No retries permitted until 2026-02-19 15:26:20.862535454 +0000 UTC m=+1010.344565578 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-webhook-certs") pod "openstack-operator-controller-manager-6d464797d7-lrlqc" (UID: "a6f83f3c-26f4-472f-9fcd-ae8049f1819a") : secret "webhook-server-cert" not found Feb 19 15:26:06 crc kubenswrapper[4810]: E0219 15:26:06.777086 4810 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838" Feb 19 15:26:06 crc kubenswrapper[4810]: E0219 15:26:06.777645 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jj652,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-l67cq_openstack-operators(fef4ad67-ccc8-4b32-bfb8-38dd6aa8e07e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 15:26:06 crc kubenswrapper[4810]: E0219 15:26:06.778843 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-l67cq" 
podUID="fef4ad67-ccc8-4b32-bfb8-38dd6aa8e07e" Feb 19 15:26:07 crc kubenswrapper[4810]: E0219 15:26:07.404633 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-l67cq" podUID="fef4ad67-ccc8-4b32-bfb8-38dd6aa8e07e" Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.491279 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt"] Feb 19 15:26:10 crc kubenswrapper[4810]: W0219 15:26:10.532394 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc677bdd0_7248_4b02_9ab4_035c034a976a.slice/crio-9bd911d4191f9996bb14d48fcb4c64180a01369afca29b1071edfda5c752d520 WatchSource:0}: Error finding container 9bd911d4191f9996bb14d48fcb4c64180a01369afca29b1071edfda5c752d520: Status 404 returned error can't find the container with id 9bd911d4191f9996bb14d48fcb4c64180a01369afca29b1071edfda5c752d520 Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.636677 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-t44nb" event={"ID":"aa5063d7-2358-4149-a3b9-ef2ce138faf4","Type":"ContainerStarted","Data":"77b98229ac7d278eb7f5b1bf02ecc266acdc3a05ff1fc7a15da7839f41af2323"} Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.637422 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-t44nb" Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.644093 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt" event={"ID":"c677bdd0-7248-4b02-9ab4-035c034a976a","Type":"ContainerStarted","Data":"9bd911d4191f9996bb14d48fcb4c64180a01369afca29b1071edfda5c752d520"} Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.652638 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-px9zx" event={"ID":"eaed166e-39b5-45ca-8a65-a22710d5fe37","Type":"ContainerStarted","Data":"2133399ff4508201c9762a969ef6cdc5dcdaf21e1b3d417e2b4091e39f44dd8a"} Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.653214 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-px9zx" Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.655870 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-nnps5" event={"ID":"602535d1-0abe-471e-8409-31319af7bd4b","Type":"ContainerStarted","Data":"5b98fc5826828b6e5d6d1564bd1ce17f85406817dd488f3a9439a484196dca50"} Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.655934 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-nnps5" Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.661315 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-z5fb9" 
event={"ID":"52bb990c-eff0-4673-be27-d55d433bef0d","Type":"ContainerStarted","Data":"a6166d3f7b76c1da4fec83b36c9ae28da0845b52cb4080a2f04785b2cf1903aa"} Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.661609 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-z5fb9" Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.663872 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-t44nb" podStartSLOduration=5.965970663 podStartE2EDuration="22.663858884s" podCreationTimestamp="2026-02-19 15:25:48 +0000 UTC" firstStartedPulling="2026-02-19 15:25:50.589590562 +0000 UTC m=+980.071620676" lastFinishedPulling="2026-02-19 15:26:07.287478773 +0000 UTC m=+996.769508897" observedRunningTime="2026-02-19 15:26:10.661369023 +0000 UTC m=+1000.143399147" watchObservedRunningTime="2026-02-19 15:26:10.663858884 +0000 UTC m=+1000.145889008" Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.668617 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-vc7cw" event={"ID":"942f40af-0498-4865-99da-bdcd068ef449","Type":"ContainerStarted","Data":"8690214eca9f1721e0c93fceded3db6194005ed818aa1a86353f6496525afdf6"} Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.669231 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-vc7cw" Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.671960 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-gnmlp" event={"ID":"f0ab3643-d267-4902-af1f-cbcbdd7e5e41","Type":"ContainerStarted","Data":"3de5d7267245117e1c54bdc12c8f34834af0724c0d5220646ecbb63186b0eab0"} Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.672246 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-gnmlp" Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.674089 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-qz68t" event={"ID":"2106e7b5-bb83-464a-a43f-943f22b55078","Type":"ContainerStarted","Data":"6155112dba7f89415f14e7a11f5d302419cd73cf8867aa21b8ddf088aa99bd22"} Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.674500 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987464f4-qz68t" Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.675796 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-mzslt" event={"ID":"91002269-9fe0-44d2-9dbd-9e4cf58274bf","Type":"ContainerStarted","Data":"55c700b041bb863e91e04aba3e853e32c1bfb2457f6dbc001a8a7c7344d0d13a"} Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.676179 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-mzslt" Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.691061 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-nnps5" podStartSLOduration=5.697753038 podStartE2EDuration="22.69104145s" 
podCreationTimestamp="2026-02-19 15:25:48 +0000 UTC" firstStartedPulling="2026-02-19 15:25:50.294240892 +0000 UTC m=+979.776271016" lastFinishedPulling="2026-02-19 15:26:07.287529304 +0000 UTC m=+996.769559428" observedRunningTime="2026-02-19 15:26:10.676555515 +0000 UTC m=+1000.158585639" watchObservedRunningTime="2026-02-19 15:26:10.69104145 +0000 UTC m=+1000.173071574" Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.693395 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-jxmt5" Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.712129 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-px9zx" podStartSLOduration=5.985632864 podStartE2EDuration="22.712111766s" podCreationTimestamp="2026-02-19 15:25:48 +0000 UTC" firstStartedPulling="2026-02-19 15:25:50.561005651 +0000 UTC m=+980.043035775" lastFinishedPulling="2026-02-19 15:26:07.287484553 +0000 UTC m=+996.769514677" observedRunningTime="2026-02-19 15:26:10.706687013 +0000 UTC m=+1000.188717137" watchObservedRunningTime="2026-02-19 15:26:10.712111766 +0000 UTC m=+1000.194141890" Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.769697 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-qz68t" podStartSLOduration=5.491500272 podStartE2EDuration="22.769677397s" podCreationTimestamp="2026-02-19 15:25:48 +0000 UTC" firstStartedPulling="2026-02-19 15:25:50.008114369 +0000 UTC m=+979.490144493" lastFinishedPulling="2026-02-19 15:26:07.286291484 +0000 UTC m=+996.768321618" observedRunningTime="2026-02-19 15:26:10.765862504 +0000 UTC m=+1000.247892638" watchObservedRunningTime="2026-02-19 15:26:10.769677397 +0000 UTC m=+1000.251707521" Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.771355 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-mzslt" podStartSLOduration=5.777916303 podStartE2EDuration="22.771348168s" podCreationTimestamp="2026-02-19 15:25:48 +0000 UTC" firstStartedPulling="2026-02-19 15:25:50.293817522 +0000 UTC m=+979.775847646" lastFinishedPulling="2026-02-19 15:26:07.287249387 +0000 UTC m=+996.769279511" observedRunningTime="2026-02-19 15:26:10.734722051 +0000 UTC m=+1000.216752165" watchObservedRunningTime="2026-02-19 15:26:10.771348168 +0000 UTC m=+1000.253378292" Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.798293 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-jxmt5" podStartSLOduration=5.505426933 podStartE2EDuration="22.798273478s" podCreationTimestamp="2026-02-19 15:25:48 +0000 UTC" firstStartedPulling="2026-02-19 15:25:49.997379125 +0000 UTC m=+979.479409249" lastFinishedPulling="2026-02-19 15:26:07.29022567 +0000 UTC m=+996.772255794" observedRunningTime="2026-02-19 15:26:10.797439048 +0000 UTC m=+1000.279469172" watchObservedRunningTime="2026-02-19 15:26:10.798273478 +0000 UTC m=+1000.280303602" Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.820486 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-vc7cw" podStartSLOduration=6.873071676 podStartE2EDuration="22.820467382s" podCreationTimestamp="2026-02-19 15:25:48 +0000 
UTC" firstStartedPulling="2026-02-19 15:25:50.223860267 +0000 UTC m=+979.705890391" lastFinishedPulling="2026-02-19 15:26:06.171255973 +0000 UTC m=+995.653286097" observedRunningTime="2026-02-19 15:26:10.81957226 +0000 UTC m=+1000.301602384" watchObservedRunningTime="2026-02-19 15:26:10.820467382 +0000 UTC m=+1000.302497506" Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.850547 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-gnmlp" podStartSLOduration=6.64435365 podStartE2EDuration="22.850519509s" podCreationTimestamp="2026-02-19 15:25:48 +0000 UTC" firstStartedPulling="2026-02-19 15:25:49.965122995 +0000 UTC m=+979.447153109" lastFinishedPulling="2026-02-19 15:26:06.171288854 +0000 UTC m=+995.653318968" observedRunningTime="2026-02-19 15:26:10.841919608 +0000 UTC m=+1000.323949732" watchObservedRunningTime="2026-02-19 15:26:10.850519509 +0000 UTC m=+1000.332549633" Feb 19 15:26:10 crc kubenswrapper[4810]: I0219 15:26:10.870936 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-z5fb9" podStartSLOduration=4.908254105 podStartE2EDuration="22.870920699s" podCreationTimestamp="2026-02-19 15:25:48 +0000 UTC" firstStartedPulling="2026-02-19 15:25:49.324968723 +0000 UTC m=+978.806998847" lastFinishedPulling="2026-02-19 15:26:07.287635317 +0000 UTC m=+996.769665441" observedRunningTime="2026-02-19 15:26:10.870068068 +0000 UTC m=+1000.352098192" watchObservedRunningTime="2026-02-19 15:26:10.870920699 +0000 UTC m=+1000.352950813" Feb 19 15:26:11 crc kubenswrapper[4810]: I0219 15:26:11.702466 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6fqmd" event={"ID":"e4a54646-39cf-4e42-9367-487ea4f7d8a4","Type":"ContainerStarted","Data":"a7f4502d6116a297e20819226e83675436d5f60415c41d5b045a650e129aa64e"} Feb 19 15:26:11 crc kubenswrapper[4810]: I0219 15:26:11.702951 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6fqmd" Feb 19 15:26:11 crc kubenswrapper[4810]: I0219 15:26:11.711536 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-vcbwg" event={"ID":"3cff7e0c-86c7-4029-aefe-a7f7e8e2d76d","Type":"ContainerStarted","Data":"cc0439d25c9995993839b0b44bd658156506ccca4d0f014da53ea8ee3103f32c"} Feb 19 15:26:11 crc kubenswrapper[4810]: I0219 15:26:11.711637 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-vcbwg" Feb 19 15:26:11 crc kubenswrapper[4810]: I0219 15:26:11.720293 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-798847869b-dlmvg" event={"ID":"9c5af548-c722-4e6b-9309-1420838257e0","Type":"ContainerStarted","Data":"fbbd153a52771d3602f93b40766d5c0ad9da72d9fd0e23c48ce28db0bada5a16"} Feb 19 15:26:11 crc kubenswrapper[4810]: I0219 15:26:11.720537 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-798847869b-dlmvg" Feb 19 15:26:11 crc kubenswrapper[4810]: I0219 15:26:11.724532 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6fqmd" 
podStartSLOduration=4.2696109 podStartE2EDuration="23.724516342s" podCreationTimestamp="2026-02-19 15:25:48 +0000 UTC" firstStartedPulling="2026-02-19 15:25:50.59973235 +0000 UTC m=+980.081762474" lastFinishedPulling="2026-02-19 15:26:10.054637792 +0000 UTC m=+999.536667916" observedRunningTime="2026-02-19 15:26:11.720197506 +0000 UTC m=+1001.202227630" watchObservedRunningTime="2026-02-19 15:26:11.724516342 +0000 UTC m=+1001.206546466" Feb 19 15:26:11 crc kubenswrapper[4810]: I0219 15:26:11.736174 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k98c8" event={"ID":"64ed590e-59b6-44c8-baee-324162d099b8","Type":"ContainerStarted","Data":"d7e363cde92e18a78213fab06f57688d9039d3aad2d3af3c9dc21850d34bf557"} Feb 19 15:26:11 crc kubenswrapper[4810]: I0219 15:26:11.738416 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-vcbwg" podStartSLOduration=7.018776479 podStartE2EDuration="23.738403343s" podCreationTimestamp="2026-02-19 15:25:48 +0000 UTC" firstStartedPulling="2026-02-19 15:25:50.567721646 +0000 UTC m=+980.049751770" lastFinishedPulling="2026-02-19 15:26:07.28734851 +0000 UTC m=+996.769378634" observedRunningTime="2026-02-19 15:26:11.735544313 +0000 UTC m=+1001.217574437" watchObservedRunningTime="2026-02-19 15:26:11.738403343 +0000 UTC m=+1001.220433467" Feb 19 15:26:11 crc kubenswrapper[4810]: I0219 15:26:11.748482 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-jxmt5" event={"ID":"1217b757-0f1c-4c4e-9abe-55875992915d","Type":"ContainerStarted","Data":"071ca87e7010f5b69a9cae2d7f3af2f9b9a955fcc849110d9ce02549a2eb9920"} Feb 19 15:26:11 crc kubenswrapper[4810]: I0219 15:26:11.756432 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ffm66" event={"ID":"e2942952-ce19-4053-91da-05623c954167","Type":"ContainerStarted","Data":"7a94575fd061fa30dda9fc0e5ed2f7e123fa39aef8efd06a40c09fd2e63cd95a"} Feb 19 15:26:11 crc kubenswrapper[4810]: I0219 15:26:11.756963 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ffm66" Feb 19 15:26:11 crc kubenswrapper[4810]: I0219 15:26:11.758873 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mkfsc" event={"ID":"2126b31b-0444-43e4-a250-837f37d476aa","Type":"ContainerStarted","Data":"c09d1663a8f2d2892250866e96e5d5843f2e5980c3c02f51cfcdaa723c017a6f"} Feb 19 15:26:11 crc kubenswrapper[4810]: I0219 15:26:11.759305 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mkfsc" Feb 19 15:26:11 crc kubenswrapper[4810]: I0219 15:26:11.775810 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-798847869b-dlmvg" podStartSLOduration=4.182789024 podStartE2EDuration="23.775792499s" podCreationTimestamp="2026-02-19 15:25:48 +0000 UTC" firstStartedPulling="2026-02-19 15:25:50.65398032 +0000 UTC m=+980.136010444" lastFinishedPulling="2026-02-19 15:26:10.246983795 +0000 UTC m=+999.729013919" observedRunningTime="2026-02-19 15:26:11.774846046 +0000 UTC m=+1001.256876170" watchObservedRunningTime="2026-02-19 15:26:11.775792499 +0000 UTC 
m=+1001.257822623" Feb 19 15:26:11 crc kubenswrapper[4810]: I0219 15:26:11.813986 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k98c8" podStartSLOduration=4.276725097 podStartE2EDuration="23.813967755s" podCreationTimestamp="2026-02-19 15:25:48 +0000 UTC" firstStartedPulling="2026-02-19 15:25:50.644743934 +0000 UTC m=+980.126774048" lastFinishedPulling="2026-02-19 15:26:10.181986582 +0000 UTC m=+999.664016706" observedRunningTime="2026-02-19 15:26:11.812374686 +0000 UTC m=+1001.294404830" watchObservedRunningTime="2026-02-19 15:26:11.813967755 +0000 UTC m=+1001.295997879" Feb 19 15:26:11 crc kubenswrapper[4810]: I0219 15:26:11.834970 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mkfsc" podStartSLOduration=4.377792452 podStartE2EDuration="23.834955889s" podCreationTimestamp="2026-02-19 15:25:48 +0000 UTC" firstStartedPulling="2026-02-19 15:25:50.599372011 +0000 UTC m=+980.081402135" lastFinishedPulling="2026-02-19 15:26:10.056535448 +0000 UTC m=+999.538565572" observedRunningTime="2026-02-19 15:26:11.832580371 +0000 UTC m=+1001.314610485" watchObservedRunningTime="2026-02-19 15:26:11.834955889 +0000 UTC m=+1001.316986013" Feb 19 15:26:11 crc kubenswrapper[4810]: I0219 15:26:11.852490 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ffm66" podStartSLOduration=6.566986155 podStartE2EDuration="23.852475089s" podCreationTimestamp="2026-02-19 15:25:48 +0000 UTC" firstStartedPulling="2026-02-19 15:25:50.002089561 +0000 UTC m=+979.484119685" lastFinishedPulling="2026-02-19 15:26:07.287578465 +0000 UTC m=+996.769608619" observedRunningTime="2026-02-19 15:26:11.84842882 +0000 UTC m=+1001.330458944" watchObservedRunningTime="2026-02-19 15:26:11.852475089 +0000 UTC m=+1001.334505213" Feb 19 15:26:15 crc kubenswrapper[4810]: I0219 15:26:15.789662 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt" event={"ID":"c677bdd0-7248-4b02-9ab4-035c034a976a","Type":"ContainerStarted","Data":"462bac6d4d85d2f1f193db02df486b4786815b87c40f6d1ac63a7e3d15f3e30b"} Feb 19 15:26:15 crc kubenswrapper[4810]: I0219 15:26:15.790184 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt" Feb 19 15:26:15 crc kubenswrapper[4810]: I0219 15:26:15.792407 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-pw9kt" event={"ID":"69b7e96d-bce6-4653-998e-3bf5d159ae5a","Type":"ContainerStarted","Data":"fd39e0f422569c85b060c3784b9b78c986335f90a333f906a31498c40b3c9346"} Feb 19 15:26:15 crc kubenswrapper[4810]: I0219 15:26:15.793133 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-pw9kt" Feb 19 15:26:15 crc kubenswrapper[4810]: I0219 15:26:15.795689 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7tzvr" event={"ID":"9f5779a5-4cda-40dc-831d-950f97eae317","Type":"ContainerStarted","Data":"619416ce7877401d8fcb6b251da7fdd31bd32f494e99ecbcb555e3ea0b38c198"} Feb 19 15:26:15 crc kubenswrapper[4810]: I0219 15:26:15.795928 4810 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7tzvr" Feb 19 15:26:15 crc kubenswrapper[4810]: I0219 15:26:15.819871 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt" podStartSLOduration=23.671060802 podStartE2EDuration="27.819853685s" podCreationTimestamp="2026-02-19 15:25:48 +0000 UTC" firstStartedPulling="2026-02-19 15:26:10.535103498 +0000 UTC m=+1000.017133622" lastFinishedPulling="2026-02-19 15:26:14.683896381 +0000 UTC m=+1004.165926505" observedRunningTime="2026-02-19 15:26:15.815245302 +0000 UTC m=+1005.297275426" watchObservedRunningTime="2026-02-19 15:26:15.819853685 +0000 UTC m=+1005.301883809" Feb 19 15:26:15 crc kubenswrapper[4810]: I0219 15:26:15.836704 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7tzvr" podStartSLOduration=3.816759192 podStartE2EDuration="27.836687658s" podCreationTimestamp="2026-02-19 15:25:48 +0000 UTC" firstStartedPulling="2026-02-19 15:25:50.644928008 +0000 UTC m=+980.126958132" lastFinishedPulling="2026-02-19 15:26:14.664856454 +0000 UTC m=+1004.146886598" observedRunningTime="2026-02-19 15:26:15.834646938 +0000 UTC m=+1005.316677082" watchObservedRunningTime="2026-02-19 15:26:15.836687658 +0000 UTC m=+1005.318717802" Feb 19 15:26:16 crc kubenswrapper[4810]: I0219 15:26:16.805313 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-jjqv2" event={"ID":"4aeb7cba-c1db-4dd6-92f7-dae7bd2e3f65","Type":"ContainerStarted","Data":"b13e36b05d5ad7005a8d9facb291e614631ad1e73b410622cd4df8e50f40d87c"} Feb 19 15:26:16 crc kubenswrapper[4810]: I0219 15:26:16.805923 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-jjqv2" Feb 19 15:26:16 crc kubenswrapper[4810]: I0219 15:26:16.825155 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-pw9kt" podStartSLOduration=4.769583177 podStartE2EDuration="28.825130436s" podCreationTimestamp="2026-02-19 15:25:48 +0000 UTC" firstStartedPulling="2026-02-19 15:25:50.612719709 +0000 UTC m=+980.094749833" lastFinishedPulling="2026-02-19 15:26:14.668266978 +0000 UTC m=+1004.150297092" observedRunningTime="2026-02-19 15:26:15.853418018 +0000 UTC m=+1005.335448142" watchObservedRunningTime="2026-02-19 15:26:16.825130436 +0000 UTC m=+1006.307160590" Feb 19 15:26:16 crc kubenswrapper[4810]: I0219 15:26:16.833096 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-jjqv2" podStartSLOduration=3.263895929 podStartE2EDuration="28.833083531s" podCreationTimestamp="2026-02-19 15:25:48 +0000 UTC" firstStartedPulling="2026-02-19 15:25:50.401796638 +0000 UTC m=+979.883826762" lastFinishedPulling="2026-02-19 15:26:15.97098423 +0000 UTC m=+1005.453014364" observedRunningTime="2026-02-19 15:26:16.823375733 +0000 UTC m=+1006.305405897" watchObservedRunningTime="2026-02-19 15:26:16.833083531 +0000 UTC m=+1006.315113695" Feb 19 15:26:18 crc kubenswrapper[4810]: I0219 15:26:18.432795 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-jxmt5" Feb 19 15:26:18 crc kubenswrapper[4810]: I0219 15:26:18.471787 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-z5fb9" Feb 19 15:26:18 crc kubenswrapper[4810]: I0219 15:26:18.513751 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-qz68t" Feb 19 15:26:18 crc kubenswrapper[4810]: I0219 15:26:18.529795 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ffm66" Feb 19 15:26:18 crc kubenswrapper[4810]: I0219 15:26:18.546856 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-gnmlp" Feb 19 15:26:18 crc kubenswrapper[4810]: I0219 15:26:18.604522 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-nnps5" Feb 19 15:26:18 crc kubenswrapper[4810]: I0219 15:26:18.668611 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-mzslt" Feb 19 15:26:18 crc kubenswrapper[4810]: I0219 15:26:18.753366 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-vc7cw" Feb 19 15:26:18 crc kubenswrapper[4810]: I0219 15:26:18.825051 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-5xnwd" event={"ID":"e163eac0-ea1f-4002-9469-844240d7a44c","Type":"ContainerStarted","Data":"20225a486423e9c7cd508a3be1662a87ec4639c953f782c9ba3643ae52196769"} Feb 19 15:26:18 crc kubenswrapper[4810]: I0219 15:26:18.826169 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-5xnwd" Feb 19 15:26:18 crc kubenswrapper[4810]: I0219 15:26:18.843470 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-5xnwd" podStartSLOduration=3.257752499 podStartE2EDuration="30.843452527s" podCreationTimestamp="2026-02-19 15:25:48 +0000 UTC" firstStartedPulling="2026-02-19 15:25:50.551958769 +0000 UTC m=+980.033988893" lastFinishedPulling="2026-02-19 15:26:18.137658767 +0000 UTC m=+1007.619688921" observedRunningTime="2026-02-19 15:26:18.837786488 +0000 UTC m=+1008.319816612" watchObservedRunningTime="2026-02-19 15:26:18.843452527 +0000 UTC m=+1008.325482651" Feb 19 15:26:18 crc kubenswrapper[4810]: I0219 15:26:18.904236 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6fqmd" Feb 19 15:26:18 crc kubenswrapper[4810]: I0219 15:26:18.961369 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mkfsc" Feb 19 15:26:19 crc kubenswrapper[4810]: I0219 15:26:19.061771 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-vcbwg" Feb 19 15:26:19 crc kubenswrapper[4810]: I0219 15:26:19.185857 4810 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7tzvr" Feb 19 15:26:19 crc kubenswrapper[4810]: I0219 15:26:19.215342 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-t44nb" Feb 19 15:26:19 crc kubenswrapper[4810]: I0219 15:26:19.248707 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-px9zx" Feb 19 15:26:19 crc kubenswrapper[4810]: I0219 15:26:19.350448 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-pw9kt" Feb 19 15:26:19 crc kubenswrapper[4810]: I0219 15:26:19.376626 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-798847869b-dlmvg" Feb 19 15:26:20 crc kubenswrapper[4810]: I0219 15:26:20.201212 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4898d4eb-d474-44bc-9a38-e36f300d132f-cert\") pod \"infra-operator-controller-manager-79d975b745-2kkhl\" (UID: \"4898d4eb-d474-44bc-9a38-e36f300d132f\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-2kkhl" Feb 19 15:26:20 crc kubenswrapper[4810]: I0219 15:26:20.208352 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4898d4eb-d474-44bc-9a38-e36f300d132f-cert\") pod \"infra-operator-controller-manager-79d975b745-2kkhl\" (UID: \"4898d4eb-d474-44bc-9a38-e36f300d132f\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-2kkhl" Feb 19 15:26:20 crc kubenswrapper[4810]: I0219 15:26:20.413354 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-2kkhl" Feb 19 15:26:20 crc kubenswrapper[4810]: I0219 15:26:20.686115 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-2kkhl"] Feb 19 15:26:20 crc kubenswrapper[4810]: W0219 15:26:20.693657 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4898d4eb_d474_44bc_9a38_e36f300d132f.slice/crio-619828b583afcb77dfbcc5aebfc67debe1d29496770cc320fdde59e0df1247e1 WatchSource:0}: Error finding container 619828b583afcb77dfbcc5aebfc67debe1d29496770cc320fdde59e0df1247e1: Status 404 returned error can't find the container with id 619828b583afcb77dfbcc5aebfc67debe1d29496770cc320fdde59e0df1247e1 Feb 19 15:26:20 crc kubenswrapper[4810]: I0219 15:26:20.853307 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-2kkhl" event={"ID":"4898d4eb-d474-44bc-9a38-e36f300d132f","Type":"ContainerStarted","Data":"619828b583afcb77dfbcc5aebfc67debe1d29496770cc320fdde59e0df1247e1"} Feb 19 15:26:20 crc kubenswrapper[4810]: I0219 15:26:20.919302 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-webhook-certs\") pod \"openstack-operator-controller-manager-6d464797d7-lrlqc\" (UID: \"a6f83f3c-26f4-472f-9fcd-ae8049f1819a\") " pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" Feb 19 15:26:20 crc kubenswrapper[4810]: I0219 15:26:20.919385 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-metrics-certs\") pod \"openstack-operator-controller-manager-6d464797d7-lrlqc\" (UID: \"a6f83f3c-26f4-472f-9fcd-ae8049f1819a\") " pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" Feb 19 15:26:20 crc kubenswrapper[4810]: I0219 15:26:20.925439 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-metrics-certs\") pod \"openstack-operator-controller-manager-6d464797d7-lrlqc\" (UID: \"a6f83f3c-26f4-472f-9fcd-ae8049f1819a\") " pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" Feb 19 15:26:20 crc kubenswrapper[4810]: I0219 15:26:20.926250 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a6f83f3c-26f4-472f-9fcd-ae8049f1819a-webhook-certs\") pod \"openstack-operator-controller-manager-6d464797d7-lrlqc\" (UID: \"a6f83f3c-26f4-472f-9fcd-ae8049f1819a\") " pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" Feb 19 15:26:21 crc kubenswrapper[4810]: I0219 15:26:21.210005 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" Feb 19 15:26:21 crc kubenswrapper[4810]: I0219 15:26:21.404180 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc"] Feb 19 15:26:21 crc kubenswrapper[4810]: W0219 15:26:21.408251 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6f83f3c_26f4_472f_9fcd_ae8049f1819a.slice/crio-d48fa01d3b3b7599b20727651737df2e9b8508e2512f70dccccb80c680c2754c WatchSource:0}: Error finding container d48fa01d3b3b7599b20727651737df2e9b8508e2512f70dccccb80c680c2754c: Status 404 returned error can't find the container with id d48fa01d3b3b7599b20727651737df2e9b8508e2512f70dccccb80c680c2754c Feb 19 15:26:21 crc kubenswrapper[4810]: I0219 15:26:21.860229 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" event={"ID":"a6f83f3c-26f4-472f-9fcd-ae8049f1819a","Type":"ContainerStarted","Data":"d48fa01d3b3b7599b20727651737df2e9b8508e2512f70dccccb80c680c2754c"} Feb 19 15:26:24 crc kubenswrapper[4810]: I0219 15:26:24.767041 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt" Feb 19 15:26:27 crc kubenswrapper[4810]: I0219 15:26:27.920685 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" event={"ID":"a6f83f3c-26f4-472f-9fcd-ae8049f1819a","Type":"ContainerStarted","Data":"ef7cc2bef6be2daef684ae24d45cdedb1b1d9ec3d0d46d71c9f10f442ee78855"} Feb 19 15:26:28 crc kubenswrapper[4810]: I0219 15:26:28.929257 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" Feb 19 15:26:28 crc kubenswrapper[4810]: I0219 15:26:28.963281 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" podStartSLOduration=40.963263707 podStartE2EDuration="40.963263707s" podCreationTimestamp="2026-02-19 15:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:26:28.955346833 +0000 UTC m=+1018.437376957" watchObservedRunningTime="2026-02-19 15:26:28.963263707 +0000 UTC m=+1018.445293831" Feb 19 15:26:28 crc kubenswrapper[4810]: I0219 15:26:28.996974 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-jjqv2" Feb 19 15:26:29 crc kubenswrapper[4810]: I0219 15:26:29.136035 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-5xnwd" Feb 19 15:26:32 crc kubenswrapper[4810]: I0219 15:26:32.964995 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-2kkhl" event={"ID":"4898d4eb-d474-44bc-9a38-e36f300d132f","Type":"ContainerStarted","Data":"15670e664aa26d3cb152e3c860abaa75c754a1cacb9836235b8938dc924401ff"} Feb 19 15:26:32 crc kubenswrapper[4810]: I0219 15:26:32.966420 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/infra-operator-controller-manager-79d975b745-2kkhl" Feb 19 15:26:32 crc kubenswrapper[4810]: I0219 15:26:32.968091 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-l67cq" event={"ID":"fef4ad67-ccc8-4b32-bfb8-38dd6aa8e07e","Type":"ContainerStarted","Data":"99da32fe6da3ac5f0ba83fa4c1f0667d7d2c0bfd3372978835cde4e3dea6b9d0"} Feb 19 15:26:32 crc kubenswrapper[4810]: I0219 15:26:32.968884 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-l67cq" Feb 19 15:26:33 crc kubenswrapper[4810]: I0219 15:26:33.012670 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-l67cq" podStartSLOduration=3.738657908 podStartE2EDuration="45.012645324s" podCreationTimestamp="2026-02-19 15:25:48 +0000 UTC" firstStartedPulling="2026-02-19 15:25:50.552016631 +0000 UTC m=+980.034046755" lastFinishedPulling="2026-02-19 15:26:31.826004007 +0000 UTC m=+1021.308034171" observedRunningTime="2026-02-19 15:26:33.005920299 +0000 UTC m=+1022.487950453" watchObservedRunningTime="2026-02-19 15:26:33.012645324 +0000 UTC m=+1022.494675468" Feb 19 15:26:33 crc kubenswrapper[4810]: I0219 15:26:33.014270 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-2kkhl" podStartSLOduration=33.804521898 podStartE2EDuration="45.014258053s" podCreationTimestamp="2026-02-19 15:25:48 +0000 UTC" firstStartedPulling="2026-02-19 15:26:20.696309644 +0000 UTC m=+1010.178339788" lastFinishedPulling="2026-02-19 15:26:31.906045819 +0000 UTC m=+1021.388075943" observedRunningTime="2026-02-19 15:26:32.986921453 +0000 UTC m=+1022.468951617" watchObservedRunningTime="2026-02-19 15:26:33.014258053 +0000 UTC m=+1022.496288197" Feb 19 15:26:38 crc kubenswrapper[4810]: I0219 15:26:38.917908 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-l67cq" Feb 19 15:26:40 crc kubenswrapper[4810]: I0219 15:26:40.418985 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-2kkhl" Feb 19 15:26:41 crc kubenswrapper[4810]: I0219 15:26:41.217869 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6d464797d7-lrlqc" Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.640280 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cf7b9b6b9-4h67z"] Feb 19 15:27:00 crc kubenswrapper[4810]: E0219 15:27:00.641079 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1715cc96-a86a-40d8-8f5a-1a4f35129bd1" containerName="extract-content" Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.641092 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1715cc96-a86a-40d8-8f5a-1a4f35129bd1" containerName="extract-content" Feb 19 15:27:00 crc kubenswrapper[4810]: E0219 15:27:00.641112 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1715cc96-a86a-40d8-8f5a-1a4f35129bd1" containerName="extract-utilities" Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.641118 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1715cc96-a86a-40d8-8f5a-1a4f35129bd1" containerName="extract-utilities" 
Feb 19 15:27:00 crc kubenswrapper[4810]: E0219 15:27:00.641127 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1715cc96-a86a-40d8-8f5a-1a4f35129bd1" containerName="registry-server" Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.641133 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1715cc96-a86a-40d8-8f5a-1a4f35129bd1" containerName="registry-server" Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.641274 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="1715cc96-a86a-40d8-8f5a-1a4f35129bd1" containerName="registry-server" Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.643289 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cf7b9b6b9-4h67z" Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.647465 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.647595 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-zxgt6" Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.647668 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.647795 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.661483 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cf7b9b6b9-4h67z"] Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.721819 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f48d6b889-hmnjw"] Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.723289 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f48d6b889-hmnjw" Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.726066 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.734019 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f48d6b889-hmnjw"] Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.734443 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gkxm\" (UniqueName: \"kubernetes.io/projected/de8cf274-42f1-4c36-bca7-1d622bf61898-kube-api-access-5gkxm\") pod \"dnsmasq-dns-6cf7b9b6b9-4h67z\" (UID: \"de8cf274-42f1-4c36-bca7-1d622bf61898\") " pod="openstack/dnsmasq-dns-6cf7b9b6b9-4h67z" Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.734494 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de8cf274-42f1-4c36-bca7-1d622bf61898-config\") pod \"dnsmasq-dns-6cf7b9b6b9-4h67z\" (UID: \"de8cf274-42f1-4c36-bca7-1d622bf61898\") " pod="openstack/dnsmasq-dns-6cf7b9b6b9-4h67z" Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.835631 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de8cf274-42f1-4c36-bca7-1d622bf61898-config\") pod \"dnsmasq-dns-6cf7b9b6b9-4h67z\" (UID: \"de8cf274-42f1-4c36-bca7-1d622bf61898\") " pod="openstack/dnsmasq-dns-6cf7b9b6b9-4h67z" Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.835696 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/731d3bd2-70ab-4ec0-b574-a00042d0b3b2-config\") pod \"dnsmasq-dns-5f48d6b889-hmnjw\" (UID: \"731d3bd2-70ab-4ec0-b574-a00042d0b3b2\") " pod="openstack/dnsmasq-dns-5f48d6b889-hmnjw" Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.835740 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/731d3bd2-70ab-4ec0-b574-a00042d0b3b2-dns-svc\") pod \"dnsmasq-dns-5f48d6b889-hmnjw\" (UID: \"731d3bd2-70ab-4ec0-b574-a00042d0b3b2\") " pod="openstack/dnsmasq-dns-5f48d6b889-hmnjw" Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.835759 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slwdg\" (UniqueName: \"kubernetes.io/projected/731d3bd2-70ab-4ec0-b574-a00042d0b3b2-kube-api-access-slwdg\") pod \"dnsmasq-dns-5f48d6b889-hmnjw\" (UID: \"731d3bd2-70ab-4ec0-b574-a00042d0b3b2\") " pod="openstack/dnsmasq-dns-5f48d6b889-hmnjw" Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.835799 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gkxm\" (UniqueName: \"kubernetes.io/projected/de8cf274-42f1-4c36-bca7-1d622bf61898-kube-api-access-5gkxm\") pod \"dnsmasq-dns-6cf7b9b6b9-4h67z\" (UID: \"de8cf274-42f1-4c36-bca7-1d622bf61898\") " pod="openstack/dnsmasq-dns-6cf7b9b6b9-4h67z" Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.836890 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de8cf274-42f1-4c36-bca7-1d622bf61898-config\") pod \"dnsmasq-dns-6cf7b9b6b9-4h67z\" (UID: \"de8cf274-42f1-4c36-bca7-1d622bf61898\") " pod="openstack/dnsmasq-dns-6cf7b9b6b9-4h67z" 
Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.859457 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gkxm\" (UniqueName: \"kubernetes.io/projected/de8cf274-42f1-4c36-bca7-1d622bf61898-kube-api-access-5gkxm\") pod \"dnsmasq-dns-6cf7b9b6b9-4h67z\" (UID: \"de8cf274-42f1-4c36-bca7-1d622bf61898\") " pod="openstack/dnsmasq-dns-6cf7b9b6b9-4h67z" Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.936808 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/731d3bd2-70ab-4ec0-b574-a00042d0b3b2-config\") pod \"dnsmasq-dns-5f48d6b889-hmnjw\" (UID: \"731d3bd2-70ab-4ec0-b574-a00042d0b3b2\") " pod="openstack/dnsmasq-dns-5f48d6b889-hmnjw" Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.936886 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/731d3bd2-70ab-4ec0-b574-a00042d0b3b2-dns-svc\") pod \"dnsmasq-dns-5f48d6b889-hmnjw\" (UID: \"731d3bd2-70ab-4ec0-b574-a00042d0b3b2\") " pod="openstack/dnsmasq-dns-5f48d6b889-hmnjw" Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.936904 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slwdg\" (UniqueName: \"kubernetes.io/projected/731d3bd2-70ab-4ec0-b574-a00042d0b3b2-kube-api-access-slwdg\") pod \"dnsmasq-dns-5f48d6b889-hmnjw\" (UID: \"731d3bd2-70ab-4ec0-b574-a00042d0b3b2\") " pod="openstack/dnsmasq-dns-5f48d6b889-hmnjw" Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.937767 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/731d3bd2-70ab-4ec0-b574-a00042d0b3b2-config\") pod \"dnsmasq-dns-5f48d6b889-hmnjw\" (UID: \"731d3bd2-70ab-4ec0-b574-a00042d0b3b2\") " pod="openstack/dnsmasq-dns-5f48d6b889-hmnjw" Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.937779 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/731d3bd2-70ab-4ec0-b574-a00042d0b3b2-dns-svc\") pod \"dnsmasq-dns-5f48d6b889-hmnjw\" (UID: \"731d3bd2-70ab-4ec0-b574-a00042d0b3b2\") " pod="openstack/dnsmasq-dns-5f48d6b889-hmnjw" Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.959941 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slwdg\" (UniqueName: \"kubernetes.io/projected/731d3bd2-70ab-4ec0-b574-a00042d0b3b2-kube-api-access-slwdg\") pod \"dnsmasq-dns-5f48d6b889-hmnjw\" (UID: \"731d3bd2-70ab-4ec0-b574-a00042d0b3b2\") " pod="openstack/dnsmasq-dns-5f48d6b889-hmnjw" Feb 19 15:27:00 crc kubenswrapper[4810]: I0219 15:27:00.962612 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cf7b9b6b9-4h67z" Feb 19 15:27:01 crc kubenswrapper[4810]: I0219 15:27:01.037820 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f48d6b889-hmnjw"
Feb 19 15:27:01 crc kubenswrapper[4810]: I0219 15:27:01.192830 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cf7b9b6b9-4h67z"]
Feb 19 15:27:01 crc kubenswrapper[4810]: I0219 15:27:01.292376 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f48d6b889-hmnjw"]
Feb 19 15:27:01 crc kubenswrapper[4810]: W0219 15:27:01.311011 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod731d3bd2_70ab_4ec0_b574_a00042d0b3b2.slice/crio-4b12e6f7653c91c7a79e0ac9f496ebbaccb1093a59dc2281bd92a962e34bac03 WatchSource:0}: Error finding container 4b12e6f7653c91c7a79e0ac9f496ebbaccb1093a59dc2281bd92a962e34bac03: Status 404 returned error can't find the container with id 4b12e6f7653c91c7a79e0ac9f496ebbaccb1093a59dc2281bd92a962e34bac03
Feb 19 15:27:02 crc kubenswrapper[4810]: I0219 15:27:02.215736 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cf7b9b6b9-4h67z" event={"ID":"de8cf274-42f1-4c36-bca7-1d622bf61898","Type":"ContainerStarted","Data":"14a7ad043bd3a14329d8f25870e9bf1129047ecea007a33d63f7b5431ba94745"}
Feb 19 15:27:02 crc kubenswrapper[4810]: I0219 15:27:02.216768 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f48d6b889-hmnjw" event={"ID":"731d3bd2-70ab-4ec0-b574-a00042d0b3b2","Type":"ContainerStarted","Data":"4b12e6f7653c91c7a79e0ac9f496ebbaccb1093a59dc2281bd92a962e34bac03"}
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.377845 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f48d6b889-hmnjw"]
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.394897 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c99647bb5-xkrgb"]
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.397218 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c99647bb5-xkrgb"
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.407226 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c99647bb5-xkrgb"]
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.508030 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eec01803-2cfd-4e97-a1c0-216c2622e913-dns-svc\") pod \"dnsmasq-dns-c99647bb5-xkrgb\" (UID: \"eec01803-2cfd-4e97-a1c0-216c2622e913\") " pod="openstack/dnsmasq-dns-c99647bb5-xkrgb"
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.508452 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eec01803-2cfd-4e97-a1c0-216c2622e913-config\") pod \"dnsmasq-dns-c99647bb5-xkrgb\" (UID: \"eec01803-2cfd-4e97-a1c0-216c2622e913\") " pod="openstack/dnsmasq-dns-c99647bb5-xkrgb"
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.508527 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx72g\" (UniqueName: \"kubernetes.io/projected/eec01803-2cfd-4e97-a1c0-216c2622e913-kube-api-access-xx72g\") pod \"dnsmasq-dns-c99647bb5-xkrgb\" (UID: \"eec01803-2cfd-4e97-a1c0-216c2622e913\") " pod="openstack/dnsmasq-dns-c99647bb5-xkrgb"
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.610467 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eec01803-2cfd-4e97-a1c0-216c2622e913-dns-svc\") pod \"dnsmasq-dns-c99647bb5-xkrgb\" (UID: \"eec01803-2cfd-4e97-a1c0-216c2622e913\") " pod="openstack/dnsmasq-dns-c99647bb5-xkrgb"
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.610600 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eec01803-2cfd-4e97-a1c0-216c2622e913-config\") pod \"dnsmasq-dns-c99647bb5-xkrgb\" (UID: \"eec01803-2cfd-4e97-a1c0-216c2622e913\") " pod="openstack/dnsmasq-dns-c99647bb5-xkrgb"
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.610687 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx72g\" (UniqueName: \"kubernetes.io/projected/eec01803-2cfd-4e97-a1c0-216c2622e913-kube-api-access-xx72g\") pod \"dnsmasq-dns-c99647bb5-xkrgb\" (UID: \"eec01803-2cfd-4e97-a1c0-216c2622e913\") " pod="openstack/dnsmasq-dns-c99647bb5-xkrgb"
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.611897 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eec01803-2cfd-4e97-a1c0-216c2622e913-dns-svc\") pod \"dnsmasq-dns-c99647bb5-xkrgb\" (UID: \"eec01803-2cfd-4e97-a1c0-216c2622e913\") " pod="openstack/dnsmasq-dns-c99647bb5-xkrgb"
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.612609 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eec01803-2cfd-4e97-a1c0-216c2622e913-config\") pod \"dnsmasq-dns-c99647bb5-xkrgb\" (UID: \"eec01803-2cfd-4e97-a1c0-216c2622e913\") " pod="openstack/dnsmasq-dns-c99647bb5-xkrgb"
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.638704 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx72g\" (UniqueName: \"kubernetes.io/projected/eec01803-2cfd-4e97-a1c0-216c2622e913-kube-api-access-xx72g\") pod \"dnsmasq-dns-c99647bb5-xkrgb\" (UID: \"eec01803-2cfd-4e97-a1c0-216c2622e913\") " pod="openstack/dnsmasq-dns-c99647bb5-xkrgb"
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.644350 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cf7b9b6b9-4h67z"]
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.680102 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6598d9876f-l2z6b"]
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.681218 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6598d9876f-l2z6b"
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.694866 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6598d9876f-l2z6b"]
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.715140 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c99647bb5-xkrgb"
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.820780 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv7dz\" (UniqueName: \"kubernetes.io/projected/46eb09d6-3ebe-4cb9-ac84-21cc0f203c89-kube-api-access-qv7dz\") pod \"dnsmasq-dns-6598d9876f-l2z6b\" (UID: \"46eb09d6-3ebe-4cb9-ac84-21cc0f203c89\") " pod="openstack/dnsmasq-dns-6598d9876f-l2z6b"
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.820928 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46eb09d6-3ebe-4cb9-ac84-21cc0f203c89-config\") pod \"dnsmasq-dns-6598d9876f-l2z6b\" (UID: \"46eb09d6-3ebe-4cb9-ac84-21cc0f203c89\") " pod="openstack/dnsmasq-dns-6598d9876f-l2z6b"
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.820956 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46eb09d6-3ebe-4cb9-ac84-21cc0f203c89-dns-svc\") pod \"dnsmasq-dns-6598d9876f-l2z6b\" (UID: \"46eb09d6-3ebe-4cb9-ac84-21cc0f203c89\") " pod="openstack/dnsmasq-dns-6598d9876f-l2z6b"
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.922621 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46eb09d6-3ebe-4cb9-ac84-21cc0f203c89-config\") pod \"dnsmasq-dns-6598d9876f-l2z6b\" (UID: \"46eb09d6-3ebe-4cb9-ac84-21cc0f203c89\") " pod="openstack/dnsmasq-dns-6598d9876f-l2z6b"
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.922992 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46eb09d6-3ebe-4cb9-ac84-21cc0f203c89-dns-svc\") pod \"dnsmasq-dns-6598d9876f-l2z6b\" (UID: \"46eb09d6-3ebe-4cb9-ac84-21cc0f203c89\") " pod="openstack/dnsmasq-dns-6598d9876f-l2z6b"
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.923218 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv7dz\" (UniqueName: \"kubernetes.io/projected/46eb09d6-3ebe-4cb9-ac84-21cc0f203c89-kube-api-access-qv7dz\") pod \"dnsmasq-dns-6598d9876f-l2z6b\" (UID: \"46eb09d6-3ebe-4cb9-ac84-21cc0f203c89\") " pod="openstack/dnsmasq-dns-6598d9876f-l2z6b"
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.923636 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46eb09d6-3ebe-4cb9-ac84-21cc0f203c89-config\") pod \"dnsmasq-dns-6598d9876f-l2z6b\" (UID: \"46eb09d6-3ebe-4cb9-ac84-21cc0f203c89\") " pod="openstack/dnsmasq-dns-6598d9876f-l2z6b"
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.923670 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46eb09d6-3ebe-4cb9-ac84-21cc0f203c89-dns-svc\") pod \"dnsmasq-dns-6598d9876f-l2z6b\" (UID: \"46eb09d6-3ebe-4cb9-ac84-21cc0f203c89\") " pod="openstack/dnsmasq-dns-6598d9876f-l2z6b"
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.934557 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6598d9876f-l2z6b"]
Feb 19 15:27:04 crc kubenswrapper[4810]: E0219 15:27:04.935224 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-qv7dz], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-6598d9876f-l2z6b" podUID="46eb09d6-3ebe-4cb9-ac84-21cc0f203c89"
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.945765 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv7dz\" (UniqueName: \"kubernetes.io/projected/46eb09d6-3ebe-4cb9-ac84-21cc0f203c89-kube-api-access-qv7dz\") pod \"dnsmasq-dns-6598d9876f-l2z6b\" (UID: \"46eb09d6-3ebe-4cb9-ac84-21cc0f203c89\") " pod="openstack/dnsmasq-dns-6598d9876f-l2z6b"
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.967464 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fdbdbc8cc-mddqt"]
Feb 19 15:27:04 crc kubenswrapper[4810]: I0219 15:27:04.969124 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fdbdbc8cc-mddqt"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.005633 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fdbdbc8cc-mddqt"]
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.032297 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64a59633-cb6f-4631-a980-566894f0ce35-config\") pod \"dnsmasq-dns-5fdbdbc8cc-mddqt\" (UID: \"64a59633-cb6f-4631-a980-566894f0ce35\") " pod="openstack/dnsmasq-dns-5fdbdbc8cc-mddqt"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.032374 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64a59633-cb6f-4631-a980-566894f0ce35-dns-svc\") pod \"dnsmasq-dns-5fdbdbc8cc-mddqt\" (UID: \"64a59633-cb6f-4631-a980-566894f0ce35\") " pod="openstack/dnsmasq-dns-5fdbdbc8cc-mddqt"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.032438 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dbn6\" (UniqueName: \"kubernetes.io/projected/64a59633-cb6f-4631-a980-566894f0ce35-kube-api-access-5dbn6\") pod \"dnsmasq-dns-5fdbdbc8cc-mddqt\" (UID: \"64a59633-cb6f-4631-a980-566894f0ce35\") " pod="openstack/dnsmasq-dns-5fdbdbc8cc-mddqt"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.133869 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64a59633-cb6f-4631-a980-566894f0ce35-config\") pod \"dnsmasq-dns-5fdbdbc8cc-mddqt\" (UID: \"64a59633-cb6f-4631-a980-566894f0ce35\") " pod="openstack/dnsmasq-dns-5fdbdbc8cc-mddqt"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.133967 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64a59633-cb6f-4631-a980-566894f0ce35-dns-svc\") pod \"dnsmasq-dns-5fdbdbc8cc-mddqt\" (UID: \"64a59633-cb6f-4631-a980-566894f0ce35\") " pod="openstack/dnsmasq-dns-5fdbdbc8cc-mddqt"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.134263 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dbn6\" (UniqueName: \"kubernetes.io/projected/64a59633-cb6f-4631-a980-566894f0ce35-kube-api-access-5dbn6\") pod \"dnsmasq-dns-5fdbdbc8cc-mddqt\" (UID: \"64a59633-cb6f-4631-a980-566894f0ce35\") " pod="openstack/dnsmasq-dns-5fdbdbc8cc-mddqt"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.134926 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64a59633-cb6f-4631-a980-566894f0ce35-dns-svc\") pod \"dnsmasq-dns-5fdbdbc8cc-mddqt\" (UID: \"64a59633-cb6f-4631-a980-566894f0ce35\") " pod="openstack/dnsmasq-dns-5fdbdbc8cc-mddqt"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.135142 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64a59633-cb6f-4631-a980-566894f0ce35-config\") pod \"dnsmasq-dns-5fdbdbc8cc-mddqt\" (UID: \"64a59633-cb6f-4631-a980-566894f0ce35\") " pod="openstack/dnsmasq-dns-5fdbdbc8cc-mddqt"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.163161 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dbn6\" (UniqueName: \"kubernetes.io/projected/64a59633-cb6f-4631-a980-566894f0ce35-kube-api-access-5dbn6\") pod \"dnsmasq-dns-5fdbdbc8cc-mddqt\" (UID: \"64a59633-cb6f-4631-a980-566894f0ce35\") " pod="openstack/dnsmasq-dns-5fdbdbc8cc-mddqt"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.234533 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6598d9876f-l2z6b"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.248446 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6598d9876f-l2z6b"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.305041 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fdbdbc8cc-mddqt"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.336654 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv7dz\" (UniqueName: \"kubernetes.io/projected/46eb09d6-3ebe-4cb9-ac84-21cc0f203c89-kube-api-access-qv7dz\") pod \"46eb09d6-3ebe-4cb9-ac84-21cc0f203c89\" (UID: \"46eb09d6-3ebe-4cb9-ac84-21cc0f203c89\") "
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.336749 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46eb09d6-3ebe-4cb9-ac84-21cc0f203c89-config\") pod \"46eb09d6-3ebe-4cb9-ac84-21cc0f203c89\" (UID: \"46eb09d6-3ebe-4cb9-ac84-21cc0f203c89\") "
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.336839 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46eb09d6-3ebe-4cb9-ac84-21cc0f203c89-dns-svc\") pod \"46eb09d6-3ebe-4cb9-ac84-21cc0f203c89\" (UID: \"46eb09d6-3ebe-4cb9-ac84-21cc0f203c89\") "
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.337837 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46eb09d6-3ebe-4cb9-ac84-21cc0f203c89-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "46eb09d6-3ebe-4cb9-ac84-21cc0f203c89" (UID: "46eb09d6-3ebe-4cb9-ac84-21cc0f203c89"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.338423 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46eb09d6-3ebe-4cb9-ac84-21cc0f203c89-config" (OuterVolumeSpecName: "config") pod "46eb09d6-3ebe-4cb9-ac84-21cc0f203c89" (UID: "46eb09d6-3ebe-4cb9-ac84-21cc0f203c89"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.353672 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46eb09d6-3ebe-4cb9-ac84-21cc0f203c89-kube-api-access-qv7dz" (OuterVolumeSpecName: "kube-api-access-qv7dz") pod "46eb09d6-3ebe-4cb9-ac84-21cc0f203c89" (UID: "46eb09d6-3ebe-4cb9-ac84-21cc0f203c89"). InnerVolumeSpecName "kube-api-access-qv7dz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.439336 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv7dz\" (UniqueName: \"kubernetes.io/projected/46eb09d6-3ebe-4cb9-ac84-21cc0f203c89-kube-api-access-qv7dz\") on node \"crc\" DevicePath \"\""
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.439657 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46eb09d6-3ebe-4cb9-ac84-21cc0f203c89-config\") on node \"crc\" DevicePath \"\""
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.439669 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46eb09d6-3ebe-4cb9-ac84-21cc0f203c89-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.555552 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/notifications-rabbitmq-server-0"]
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.556812 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.560060 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"notifications-rabbitmq-default-user"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.560407 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"notifications-rabbitmq-config-data"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.560579 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"notifications-rabbitmq-server-conf"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.560722 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-notifications-rabbitmq-svc"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.560925 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"notifications-rabbitmq-server-dockercfg-jflkp"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.561060 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"notifications-rabbitmq-plugins-conf"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.562543 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"notifications-rabbitmq-erlang-cookie"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.563653 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/notifications-rabbitmq-server-0"]
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.742939 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-pod-info\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.743009 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-config-data\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.743062 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-rabbitmq-tls\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.743079 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-rabbitmq-plugins\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.743100 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.743149 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56shj\" (UniqueName: \"kubernetes.io/projected/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-kube-api-access-56shj\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.743171 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-plugins-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.743215 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-erlang-cookie-secret\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.743249 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-rabbitmq-confd\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.743291 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-server-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.743367 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-rabbitmq-erlang-cookie\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.805089 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.807122 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.809509 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.809654 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.809794 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-9r5f7"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.810022 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.810058 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.810031 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.810161 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.823068 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.845652 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-plugins-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.845707 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-erlang-cookie-secret\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.845763 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-rabbitmq-confd\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.845792 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-server-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.845819 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-rabbitmq-erlang-cookie\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.845890 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-pod-info\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.845935 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-config-data\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.845984 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-rabbitmq-tls\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.846009 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-rabbitmq-plugins\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.846046 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.846099 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56shj\" (UniqueName: \"kubernetes.io/projected/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-kube-api-access-56shj\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.846882 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-rabbitmq-plugins\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.846997 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-plugins-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.847231 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.850901 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-rabbitmq-tls\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.851401 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-rabbitmq-confd\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.852093 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-server-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.854986 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-rabbitmq-erlang-cookie\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.860449 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-config-data\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.867610 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-pod-info\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.877169 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.877175 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-erlang-cookie-secret\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.881408 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56shj\" (UniqueName: \"kubernetes.io/projected/4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c-kube-api-access-56shj\") pod \"notifications-rabbitmq-server-0\" (UID: \"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c\") " pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.883236 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/notifications-rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.947130 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/00bcfb03-4357-4343-99a5-30dc7f25abe9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.947174 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/00bcfb03-4357-4343-99a5-30dc7f25abe9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.947203 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/00bcfb03-4357-4343-99a5-30dc7f25abe9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.947307 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.947377 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/00bcfb03-4357-4343-99a5-30dc7f25abe9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.947451 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/00bcfb03-4357-4343-99a5-30dc7f25abe9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.947588 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74vq8\" (UniqueName: \"kubernetes.io/projected/00bcfb03-4357-4343-99a5-30dc7f25abe9-kube-api-access-74vq8\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.947643 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/00bcfb03-4357-4343-99a5-30dc7f25abe9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.947662 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/00bcfb03-4357-4343-99a5-30dc7f25abe9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.947691 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00bcfb03-4357-4343-99a5-30dc7f25abe9-config-data\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:05 crc kubenswrapper[4810]: I0219 15:27:05.947722 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/00bcfb03-4357-4343-99a5-30dc7f25abe9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.051097 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74vq8\" (UniqueName: \"kubernetes.io/projected/00bcfb03-4357-4343-99a5-30dc7f25abe9-kube-api-access-74vq8\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.051229 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/00bcfb03-4357-4343-99a5-30dc7f25abe9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.051257 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/00bcfb03-4357-4343-99a5-30dc7f25abe9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.051747 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/00bcfb03-4357-4343-99a5-30dc7f25abe9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.052122 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00bcfb03-4357-4343-99a5-30dc7f25abe9-config-data\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.052238 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/00bcfb03-4357-4343-99a5-30dc7f25abe9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.052435 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/00bcfb03-4357-4343-99a5-30dc7f25abe9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.052492 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/00bcfb03-4357-4343-99a5-30dc7f25abe9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.052554 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/00bcfb03-4357-4343-99a5-30dc7f25abe9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.052594 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.052611 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/00bcfb03-4357-4343-99a5-30dc7f25abe9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.052662 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/00bcfb03-4357-4343-99a5-30dc7f25abe9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.052834 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.052976 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00bcfb03-4357-4343-99a5-30dc7f25abe9-config-data\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.053305 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/00bcfb03-4357-4343-99a5-30dc7f25abe9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.054141 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/00bcfb03-4357-4343-99a5-30dc7f25abe9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.055213 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/00bcfb03-4357-4343-99a5-30dc7f25abe9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.072948 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.077297 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/00bcfb03-4357-4343-99a5-30dc7f25abe9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.077882 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/00bcfb03-4357-4343-99a5-30dc7f25abe9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.090155 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/00bcfb03-4357-4343-99a5-30dc7f25abe9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.090185 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/00bcfb03-4357-4343-99a5-30dc7f25abe9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.092978 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74vq8\" (UniqueName: \"kubernetes.io/projected/00bcfb03-4357-4343-99a5-30dc7f25abe9-kube-api-access-74vq8\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.106403 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.108362 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.110388 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.110920 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.111057 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.111083 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.111240 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.115978 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.116160 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.116249 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-qxvfg"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.130654 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.240586 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6598d9876f-l2z6b"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.256031 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtmb8\" (UniqueName: \"kubernetes.io/projected/2a3676ed-f06f-4dea-82a1-959716331113-kube-api-access-wtmb8\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.256078 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a3676ed-f06f-4dea-82a1-959716331113-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.256127 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.256164 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2a3676ed-f06f-4dea-82a1-959716331113-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.256183 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2a3676ed-f06f-4dea-82a1-959716331113-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.256199 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2a3676ed-f06f-4dea-82a1-959716331113-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.256216 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2a3676ed-f06f-4dea-82a1-959716331113-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.256235 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2a3676ed-f06f-4dea-82a1-959716331113-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.256249 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2a3676ed-f06f-4dea-82a1-959716331113-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.256275 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2a3676ed-f06f-4dea-82a1-959716331113-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.256303 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2a3676ed-f06f-4dea-82a1-959716331113-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.278583 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6598d9876f-l2z6b"]
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.284548 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6598d9876f-l2z6b"]
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.357575 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2a3676ed-f06f-4dea-82a1-959716331113-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.357625 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2a3676ed-f06f-4dea-82a1-959716331113-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.357646 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2a3676ed-f06f-4dea-82a1-959716331113-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.357673 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2a3676ed-f06f-4dea-82a1-959716331113-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.357711 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2a3676ed-f06f-4dea-82a1-959716331113-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.357767 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2a3676ed-f06f-4dea-82a1-959716331113-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.357804 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2a3676ed-f06f-4dea-82a1-959716331113-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.357861 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtmb8\" (UniqueName: \"kubernetes.io/projected/2a3676ed-f06f-4dea-82a1-959716331113-kube-api-access-wtmb8\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.357891 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a3676ed-f06f-4dea-82a1-959716331113-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.357967 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.358041 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2a3676ed-f06f-4dea-82a1-959716331113-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.358154 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2a3676ed-f06f-4dea-82a1-959716331113-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.358485 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2a3676ed-f06f-4dea-82a1-959716331113-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.358744 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.359366 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a3676ed-f06f-4dea-82a1-959716331113-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.359991 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2a3676ed-f06f-4dea-82a1-959716331113-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.360949 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2a3676ed-f06f-4dea-82a1-959716331113-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.361597 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2a3676ed-f06f-4dea-82a1-959716331113-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.362086 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2a3676ed-f06f-4dea-82a1-959716331113-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.362518 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2a3676ed-f06f-4dea-82a1-959716331113-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.363715 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2a3676ed-f06f-4dea-82a1-959716331113-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.384874 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.393178 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtmb8\" (UniqueName: \"kubernetes.io/projected/2a3676ed-f06f-4dea-82a1-959716331113-kube-api-access-wtmb8\") pod \"rabbitmq-cell1-server-0\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:06 crc kubenswrapper[4810]: I0219 15:27:06.441633 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.457936 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46eb09d6-3ebe-4cb9-ac84-21cc0f203c89" path="/var/lib/kubelet/pods/46eb09d6-3ebe-4cb9-ac84-21cc0f203c89/volumes"
Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.513438 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.514674 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.520227 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.520467 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.520653 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.520702 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-22plv"
Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.526485 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.528089 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.676152 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"c0ffb8ce-a356-4416-b96c-49db30ff1947\") " pod="openstack/openstack-galera-0"
Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.676216 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c0ffb8ce-a356-4416-b96c-49db30ff1947-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c0ffb8ce-a356-4416-b96c-49db30ff1947\") " pod="openstack/openstack-galera-0"
Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.676305 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrkxq\" (UniqueName: \"kubernetes.io/projected/c0ffb8ce-a356-4416-b96c-49db30ff1947-kube-api-access-qrkxq\") pod \"openstack-galera-0\" (UID: \"c0ffb8ce-a356-4416-b96c-49db30ff1947\") " pod="openstack/openstack-galera-0"
Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.676376 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0ffb8ce-a356-4416-b96c-49db30ff1947-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c0ffb8ce-a356-4416-b96c-49db30ff1947\") " pod="openstack/openstack-galera-0"
Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.676399 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c0ffb8ce-a356-4416-b96c-49db30ff1947-kolla-config\") pod \"openstack-galera-0\" (UID: \"c0ffb8ce-a356-4416-b96c-49db30ff1947\") " pod="openstack/openstack-galera-0"
Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.677474 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c0ffb8ce-a356-4416-b96c-49db30ff1947-config-data-default\") pod \"openstack-galera-0\" (UID: \"c0ffb8ce-a356-4416-b96c-49db30ff1947\") " pod="openstack/openstack-galera-0"
Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.677522 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0ffb8ce-a356-4416-b96c-49db30ff1947-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c0ffb8ce-a356-4416-b96c-49db30ff1947\") " pod="openstack/openstack-galera-0"
Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.677550 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0ffb8ce-a356-4416-b96c-49db30ff1947-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c0ffb8ce-a356-4416-b96c-49db30ff1947\") " pod="openstack/openstack-galera-0"
Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.778981 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0ffb8ce-a356-4416-b96c-49db30ff1947-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c0ffb8ce-a356-4416-b96c-49db30ff1947\") " pod="openstack/openstack-galera-0"
Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.779069 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0ffb8ce-a356-4416-b96c-49db30ff1947-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c0ffb8ce-a356-4416-b96c-49db30ff1947\") " pod="openstack/openstack-galera-0"
Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.779149 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"c0ffb8ce-a356-4416-b96c-49db30ff1947\") " pod="openstack/openstack-galera-0"
Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.779197 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c0ffb8ce-a356-4416-b96c-49db30ff1947-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c0ffb8ce-a356-4416-b96c-49db30ff1947\") " pod="openstack/openstack-galera-0"
Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.779305 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrkxq\" (UniqueName: \"kubernetes.io/projected/c0ffb8ce-a356-4416-b96c-49db30ff1947-kube-api-access-qrkxq\") pod \"openstack-galera-0\" (UID: \"c0ffb8ce-a356-4416-b96c-49db30ff1947\") " pod="openstack/openstack-galera-0"
Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.779376 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0ffb8ce-a356-4416-b96c-49db30ff1947-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c0ffb8ce-a356-4416-b96c-49db30ff1947\") " pod="openstack/openstack-galera-0"
Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.779401 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c0ffb8ce-a356-4416-b96c-49db30ff1947-kolla-config\") pod \"openstack-galera-0\" (UID: \"c0ffb8ce-a356-4416-b96c-49db30ff1947\") " pod="openstack/openstack-galera-0"
Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.779475 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c0ffb8ce-a356-4416-b96c-49db30ff1947-config-data-default\") pod \"openstack-galera-0\" (UID: 
\"c0ffb8ce-a356-4416-b96c-49db30ff1947\") " pod="openstack/openstack-galera-0" Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.781201 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c0ffb8ce-a356-4416-b96c-49db30ff1947-config-data-default\") pod \"openstack-galera-0\" (UID: \"c0ffb8ce-a356-4416-b96c-49db30ff1947\") " pod="openstack/openstack-galera-0" Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.782625 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0ffb8ce-a356-4416-b96c-49db30ff1947-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c0ffb8ce-a356-4416-b96c-49db30ff1947\") " pod="openstack/openstack-galera-0" Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.784614 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"c0ffb8ce-a356-4416-b96c-49db30ff1947\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.784990 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c0ffb8ce-a356-4416-b96c-49db30ff1947-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c0ffb8ce-a356-4416-b96c-49db30ff1947\") " pod="openstack/openstack-galera-0" Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.785852 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c0ffb8ce-a356-4416-b96c-49db30ff1947-kolla-config\") pod \"openstack-galera-0\" (UID: \"c0ffb8ce-a356-4416-b96c-49db30ff1947\") " pod="openstack/openstack-galera-0" Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.791981 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0ffb8ce-a356-4416-b96c-49db30ff1947-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c0ffb8ce-a356-4416-b96c-49db30ff1947\") " pod="openstack/openstack-galera-0" Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.793111 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0ffb8ce-a356-4416-b96c-49db30ff1947-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c0ffb8ce-a356-4416-b96c-49db30ff1947\") " pod="openstack/openstack-galera-0" Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.822434 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrkxq\" (UniqueName: \"kubernetes.io/projected/c0ffb8ce-a356-4416-b96c-49db30ff1947-kube-api-access-qrkxq\") pod \"openstack-galera-0\" (UID: \"c0ffb8ce-a356-4416-b96c-49db30ff1947\") " pod="openstack/openstack-galera-0" Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.823276 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"c0ffb8ce-a356-4416-b96c-49db30ff1947\") " pod="openstack/openstack-galera-0" Feb 19 15:27:07 crc kubenswrapper[4810]: I0219 15:27:07.838823 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.028747 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.032179 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.036294 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.036575 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-h5nm7" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.036857 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.037780 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.038758 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.106406 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcjf6\" (UniqueName: \"kubernetes.io/projected/30d11a24-9722-4e7a-9be5-f2bd00128167-kube-api-access-fcjf6\") pod \"openstack-cell1-galera-0\" (UID: \"30d11a24-9722-4e7a-9be5-f2bd00128167\") " pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.106472 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/30d11a24-9722-4e7a-9be5-f2bd00128167-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"30d11a24-9722-4e7a-9be5-f2bd00128167\") " pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.106510 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30d11a24-9722-4e7a-9be5-f2bd00128167-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"30d11a24-9722-4e7a-9be5-f2bd00128167\") " pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.106545 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"30d11a24-9722-4e7a-9be5-f2bd00128167\") " pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.106585 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/30d11a24-9722-4e7a-9be5-f2bd00128167-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"30d11a24-9722-4e7a-9be5-f2bd00128167\") " pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.106648 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/30d11a24-9722-4e7a-9be5-f2bd00128167-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"30d11a24-9722-4e7a-9be5-f2bd00128167\") " pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.106691 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/30d11a24-9722-4e7a-9be5-f2bd00128167-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"30d11a24-9722-4e7a-9be5-f2bd00128167\") " pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.106726 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30d11a24-9722-4e7a-9be5-f2bd00128167-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"30d11a24-9722-4e7a-9be5-f2bd00128167\") " pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.207522 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/30d11a24-9722-4e7a-9be5-f2bd00128167-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"30d11a24-9722-4e7a-9be5-f2bd00128167\") " pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.207615 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/30d11a24-9722-4e7a-9be5-f2bd00128167-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"30d11a24-9722-4e7a-9be5-f2bd00128167\") " pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.207660 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/30d11a24-9722-4e7a-9be5-f2bd00128167-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"30d11a24-9722-4e7a-9be5-f2bd00128167\") " pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.207693 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30d11a24-9722-4e7a-9be5-f2bd00128167-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"30d11a24-9722-4e7a-9be5-f2bd00128167\") " pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.207731 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcjf6\" (UniqueName: \"kubernetes.io/projected/30d11a24-9722-4e7a-9be5-f2bd00128167-kube-api-access-fcjf6\") pod \"openstack-cell1-galera-0\" (UID: \"30d11a24-9722-4e7a-9be5-f2bd00128167\") " pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.207758 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/30d11a24-9722-4e7a-9be5-f2bd00128167-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"30d11a24-9722-4e7a-9be5-f2bd00128167\") " pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.207788 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/30d11a24-9722-4e7a-9be5-f2bd00128167-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"30d11a24-9722-4e7a-9be5-f2bd00128167\") " pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.207820 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"30d11a24-9722-4e7a-9be5-f2bd00128167\") " pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.208053 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"30d11a24-9722-4e7a-9be5-f2bd00128167\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.216960 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/30d11a24-9722-4e7a-9be5-f2bd00128167-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"30d11a24-9722-4e7a-9be5-f2bd00128167\") " pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.217620 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/30d11a24-9722-4e7a-9be5-f2bd00128167-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"30d11a24-9722-4e7a-9be5-f2bd00128167\") " pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.218321 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/30d11a24-9722-4e7a-9be5-f2bd00128167-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"30d11a24-9722-4e7a-9be5-f2bd00128167\") " pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.218713 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30d11a24-9722-4e7a-9be5-f2bd00128167-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"30d11a24-9722-4e7a-9be5-f2bd00128167\") " pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.223858 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/30d11a24-9722-4e7a-9be5-f2bd00128167-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"30d11a24-9722-4e7a-9be5-f2bd00128167\") " pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.225951 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30d11a24-9722-4e7a-9be5-f2bd00128167-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"30d11a24-9722-4e7a-9be5-f2bd00128167\") " pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.249950 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcjf6\" (UniqueName: \"kubernetes.io/projected/30d11a24-9722-4e7a-9be5-f2bd00128167-kube-api-access-fcjf6\") pod \"openstack-cell1-galera-0\" (UID: 
\"30d11a24-9722-4e7a-9be5-f2bd00128167\") " pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.288563 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"30d11a24-9722-4e7a-9be5-f2bd00128167\") " pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.357967 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.385636 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.386668 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.389078 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-s95tz" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.389157 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.389633 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.396062 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.512128 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/eb773d46-7b9f-4ca4-b33c-9b800b9eafd7-kolla-config\") pod \"memcached-0\" (UID: \"eb773d46-7b9f-4ca4-b33c-9b800b9eafd7\") " pod="openstack/memcached-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.512243 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eb773d46-7b9f-4ca4-b33c-9b800b9eafd7-config-data\") pod \"memcached-0\" (UID: \"eb773d46-7b9f-4ca4-b33c-9b800b9eafd7\") " pod="openstack/memcached-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.512270 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb773d46-7b9f-4ca4-b33c-9b800b9eafd7-memcached-tls-certs\") pod \"memcached-0\" (UID: \"eb773d46-7b9f-4ca4-b33c-9b800b9eafd7\") " pod="openstack/memcached-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.512296 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzznx\" (UniqueName: \"kubernetes.io/projected/eb773d46-7b9f-4ca4-b33c-9b800b9eafd7-kube-api-access-hzznx\") pod \"memcached-0\" (UID: \"eb773d46-7b9f-4ca4-b33c-9b800b9eafd7\") " pod="openstack/memcached-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.512517 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb773d46-7b9f-4ca4-b33c-9b800b9eafd7-combined-ca-bundle\") pod \"memcached-0\" (UID: \"eb773d46-7b9f-4ca4-b33c-9b800b9eafd7\") " pod="openstack/memcached-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.614355 4810 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb773d46-7b9f-4ca4-b33c-9b800b9eafd7-combined-ca-bundle\") pod \"memcached-0\" (UID: \"eb773d46-7b9f-4ca4-b33c-9b800b9eafd7\") " pod="openstack/memcached-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.614458 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/eb773d46-7b9f-4ca4-b33c-9b800b9eafd7-kolla-config\") pod \"memcached-0\" (UID: \"eb773d46-7b9f-4ca4-b33c-9b800b9eafd7\") " pod="openstack/memcached-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.614518 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eb773d46-7b9f-4ca4-b33c-9b800b9eafd7-config-data\") pod \"memcached-0\" (UID: \"eb773d46-7b9f-4ca4-b33c-9b800b9eafd7\") " pod="openstack/memcached-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.614538 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb773d46-7b9f-4ca4-b33c-9b800b9eafd7-memcached-tls-certs\") pod \"memcached-0\" (UID: \"eb773d46-7b9f-4ca4-b33c-9b800b9eafd7\") " pod="openstack/memcached-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.614565 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzznx\" (UniqueName: \"kubernetes.io/projected/eb773d46-7b9f-4ca4-b33c-9b800b9eafd7-kube-api-access-hzznx\") pod \"memcached-0\" (UID: \"eb773d46-7b9f-4ca4-b33c-9b800b9eafd7\") " pod="openstack/memcached-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.615710 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/eb773d46-7b9f-4ca4-b33c-9b800b9eafd7-kolla-config\") pod \"memcached-0\" (UID: \"eb773d46-7b9f-4ca4-b33c-9b800b9eafd7\") " pod="openstack/memcached-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.616837 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eb773d46-7b9f-4ca4-b33c-9b800b9eafd7-config-data\") pod \"memcached-0\" (UID: \"eb773d46-7b9f-4ca4-b33c-9b800b9eafd7\") " pod="openstack/memcached-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.621406 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb773d46-7b9f-4ca4-b33c-9b800b9eafd7-memcached-tls-certs\") pod \"memcached-0\" (UID: \"eb773d46-7b9f-4ca4-b33c-9b800b9eafd7\") " pod="openstack/memcached-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.624011 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb773d46-7b9f-4ca4-b33c-9b800b9eafd7-combined-ca-bundle\") pod \"memcached-0\" (UID: \"eb773d46-7b9f-4ca4-b33c-9b800b9eafd7\") " pod="openstack/memcached-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.648889 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzznx\" (UniqueName: \"kubernetes.io/projected/eb773d46-7b9f-4ca4-b33c-9b800b9eafd7-kube-api-access-hzznx\") pod \"memcached-0\" (UID: \"eb773d46-7b9f-4ca4-b33c-9b800b9eafd7\") " pod="openstack/memcached-0" Feb 19 15:27:09 crc kubenswrapper[4810]: I0219 15:27:09.701908 4810 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 19 15:27:11 crc kubenswrapper[4810]: I0219 15:27:11.635526 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 15:27:11 crc kubenswrapper[4810]: I0219 15:27:11.636705 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 15:27:11 crc kubenswrapper[4810]: I0219 15:27:11.644052 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-sdx2j" Feb 19 15:27:11 crc kubenswrapper[4810]: I0219 15:27:11.648963 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 15:27:11 crc kubenswrapper[4810]: I0219 15:27:11.744740 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdrqf\" (UniqueName: \"kubernetes.io/projected/097bc4d1-5648-4607-9c49-286e4bbbe553-kube-api-access-mdrqf\") pod \"kube-state-metrics-0\" (UID: \"097bc4d1-5648-4607-9c49-286e4bbbe553\") " pod="openstack/kube-state-metrics-0" Feb 19 15:27:11 crc kubenswrapper[4810]: I0219 15:27:11.846493 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdrqf\" (UniqueName: \"kubernetes.io/projected/097bc4d1-5648-4607-9c49-286e4bbbe553-kube-api-access-mdrqf\") pod \"kube-state-metrics-0\" (UID: \"097bc4d1-5648-4607-9c49-286e4bbbe553\") " pod="openstack/kube-state-metrics-0" Feb 19 15:27:11 crc kubenswrapper[4810]: I0219 15:27:11.880382 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdrqf\" (UniqueName: \"kubernetes.io/projected/097bc4d1-5648-4607-9c49-286e4bbbe553-kube-api-access-mdrqf\") pod \"kube-state-metrics-0\" (UID: \"097bc4d1-5648-4607-9c49-286e4bbbe553\") " pod="openstack/kube-state-metrics-0" Feb 19 15:27:11 crc kubenswrapper[4810]: I0219 15:27:11.957707 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 15:27:12 crc kubenswrapper[4810]: I0219 15:27:12.958891 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 15:27:12 crc kubenswrapper[4810]: I0219 15:27:12.960776 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:12 crc kubenswrapper[4810]: I0219 15:27:12.963237 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-x7hn6" Feb 19 15:27:12 crc kubenswrapper[4810]: I0219 15:27:12.963500 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 19 15:27:12 crc kubenswrapper[4810]: I0219 15:27:12.964078 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 19 15:27:12 crc kubenswrapper[4810]: I0219 15:27:12.964272 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 19 15:27:12 crc kubenswrapper[4810]: I0219 15:27:12.964750 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 19 15:27:12 crc kubenswrapper[4810]: I0219 15:27:12.964979 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 19 15:27:12 crc kubenswrapper[4810]: I0219 15:27:12.965412 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 19 15:27:12 crc kubenswrapper[4810]: I0219 15:27:12.975202 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 19 15:27:12 crc kubenswrapper[4810]: I0219 15:27:12.984529 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.063356 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b3b143f1-488b-49bf-8792-af0d760f341e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3b143f1-488b-49bf-8792-af0d760f341e\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.063416 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/a220bc57-3f31-4851-ad5c-9f61359f7de5-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.063454 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a220bc57-3f31-4851-ad5c-9f61359f7de5-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.063509 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a220bc57-3f31-4851-ad5c-9f61359f7de5-config\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.063555 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a220bc57-3f31-4851-ad5c-9f61359f7de5-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.063582 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ks5p\" (UniqueName: \"kubernetes.io/projected/a220bc57-3f31-4851-ad5c-9f61359f7de5-kube-api-access-8ks5p\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.063613 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a220bc57-3f31-4851-ad5c-9f61359f7de5-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.063650 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a220bc57-3f31-4851-ad5c-9f61359f7de5-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.063672 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a220bc57-3f31-4851-ad5c-9f61359f7de5-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.063702 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a220bc57-3f31-4851-ad5c-9f61359f7de5-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.164931 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a220bc57-3f31-4851-ad5c-9f61359f7de5-config\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.164986 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a220bc57-3f31-4851-ad5c-9f61359f7de5-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.165008 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ks5p\" (UniqueName: \"kubernetes.io/projected/a220bc57-3f31-4851-ad5c-9f61359f7de5-kube-api-access-8ks5p\") pod \"prometheus-metric-storage-0\" (UID: 
\"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.165024 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a220bc57-3f31-4851-ad5c-9f61359f7de5-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.165052 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a220bc57-3f31-4851-ad5c-9f61359f7de5-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.165068 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a220bc57-3f31-4851-ad5c-9f61359f7de5-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.165298 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a220bc57-3f31-4851-ad5c-9f61359f7de5-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.165342 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b3b143f1-488b-49bf-8792-af0d760f341e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3b143f1-488b-49bf-8792-af0d760f341e\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.165389 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/a220bc57-3f31-4851-ad5c-9f61359f7de5-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.165412 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a220bc57-3f31-4851-ad5c-9f61359f7de5-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.166229 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a220bc57-3f31-4851-ad5c-9f61359f7de5-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.166285 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" 
(UniqueName: \"kubernetes.io/configmap/a220bc57-3f31-4851-ad5c-9f61359f7de5-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.166812 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a220bc57-3f31-4851-ad5c-9f61359f7de5-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.168842 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a220bc57-3f31-4851-ad5c-9f61359f7de5-config\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.183182 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a220bc57-3f31-4851-ad5c-9f61359f7de5-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.183201 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a220bc57-3f31-4851-ad5c-9f61359f7de5-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.183211 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a220bc57-3f31-4851-ad5c-9f61359f7de5-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.183364 4810 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.183388 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b3b143f1-488b-49bf-8792-af0d760f341e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3b143f1-488b-49bf-8792-af0d760f341e\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e63bc62ea909687cb5abb0c5cf8da7008d795f1441aaff1987b707a42a388027/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.186470 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ks5p\" (UniqueName: \"kubernetes.io/projected/a220bc57-3f31-4851-ad5c-9f61359f7de5-kube-api-access-8ks5p\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.191006 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a220bc57-3f31-4851-ad5c-9f61359f7de5-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.212487 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b3b143f1-488b-49bf-8792-af0d760f341e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3b143f1-488b-49bf-8792-af0d760f341e\") pod \"prometheus-metric-storage-0\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:13 crc kubenswrapper[4810]: I0219 15:27:13.302764 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 15:27:14 crc kubenswrapper[4810]: I0219 15:27:14.959533 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-s5488"] Feb 19 15:27:14 crc kubenswrapper[4810]: I0219 15:27:14.961384 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s5488" Feb 19 15:27:14 crc kubenswrapper[4810]: I0219 15:27:14.963319 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-66nlv" Feb 19 15:27:14 crc kubenswrapper[4810]: I0219 15:27:14.968556 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 19 15:27:14 crc kubenswrapper[4810]: I0219 15:27:14.968667 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 19 15:27:14 crc kubenswrapper[4810]: I0219 15:27:14.985454 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-5t6ds"] Feb 19 15:27:14 crc kubenswrapper[4810]: I0219 15:27:14.987138 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-5t6ds" Feb 19 15:27:14 crc kubenswrapper[4810]: I0219 15:27:14.996756 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a4fa57b-aa00-4866-b31e-df29f7f86480-var-run-ovn\") pod \"ovn-controller-s5488\" (UID: \"4a4fa57b-aa00-4866-b31e-df29f7f86480\") " pod="openstack/ovn-controller-s5488" Feb 19 15:27:14 crc kubenswrapper[4810]: I0219 15:27:14.996827 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4a4fa57b-aa00-4866-b31e-df29f7f86480-var-log-ovn\") pod \"ovn-controller-s5488\" (UID: \"4a4fa57b-aa00-4866-b31e-df29f7f86480\") " pod="openstack/ovn-controller-s5488" Feb 19 15:27:14 crc kubenswrapper[4810]: I0219 15:27:14.996850 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4a4fa57b-aa00-4866-b31e-df29f7f86480-var-run\") pod \"ovn-controller-s5488\" (UID: \"4a4fa57b-aa00-4866-b31e-df29f7f86480\") " pod="openstack/ovn-controller-s5488" Feb 19 15:27:14 crc kubenswrapper[4810]: I0219 15:27:14.996864 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a4fa57b-aa00-4866-b31e-df29f7f86480-combined-ca-bundle\") pod \"ovn-controller-s5488\" (UID: \"4a4fa57b-aa00-4866-b31e-df29f7f86480\") " pod="openstack/ovn-controller-s5488" Feb 19 15:27:14 crc kubenswrapper[4810]: I0219 15:27:14.996880 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a4fa57b-aa00-4866-b31e-df29f7f86480-ovn-controller-tls-certs\") pod \"ovn-controller-s5488\" (UID: \"4a4fa57b-aa00-4866-b31e-df29f7f86480\") " pod="openstack/ovn-controller-s5488" Feb 19 15:27:14 crc kubenswrapper[4810]: I0219 15:27:14.996903 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ngp2\" (UniqueName: \"kubernetes.io/projected/4a4fa57b-aa00-4866-b31e-df29f7f86480-kube-api-access-9ngp2\") pod \"ovn-controller-s5488\" (UID: \"4a4fa57b-aa00-4866-b31e-df29f7f86480\") " pod="openstack/ovn-controller-s5488" Feb 19 15:27:14 crc kubenswrapper[4810]: I0219 15:27:14.996922 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a4fa57b-aa00-4866-b31e-df29f7f86480-scripts\") pod \"ovn-controller-s5488\" (UID: \"4a4fa57b-aa00-4866-b31e-df29f7f86480\") " pod="openstack/ovn-controller-s5488" Feb 19 15:27:14 crc kubenswrapper[4810]: I0219 15:27:14.999419 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s5488"] Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.006633 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-5t6ds"] Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.098979 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8jzv\" (UniqueName: \"kubernetes.io/projected/542da555-4f39-4dff-b378-5306135244db-kube-api-access-g8jzv\") pod \"ovn-controller-ovs-5t6ds\" (UID: \"542da555-4f39-4dff-b378-5306135244db\") " pod="openstack/ovn-controller-ovs-5t6ds" Feb 19 15:27:15 
crc kubenswrapper[4810]: I0219 15:27:15.099032 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4a4fa57b-aa00-4866-b31e-df29f7f86480-var-log-ovn\") pod \"ovn-controller-s5488\" (UID: \"4a4fa57b-aa00-4866-b31e-df29f7f86480\") " pod="openstack/ovn-controller-s5488" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.099050 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/542da555-4f39-4dff-b378-5306135244db-etc-ovs\") pod \"ovn-controller-ovs-5t6ds\" (UID: \"542da555-4f39-4dff-b378-5306135244db\") " pod="openstack/ovn-controller-ovs-5t6ds" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.099065 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/542da555-4f39-4dff-b378-5306135244db-scripts\") pod \"ovn-controller-ovs-5t6ds\" (UID: \"542da555-4f39-4dff-b378-5306135244db\") " pod="openstack/ovn-controller-ovs-5t6ds" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.099080 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/542da555-4f39-4dff-b378-5306135244db-var-log\") pod \"ovn-controller-ovs-5t6ds\" (UID: \"542da555-4f39-4dff-b378-5306135244db\") " pod="openstack/ovn-controller-ovs-5t6ds" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.099096 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4a4fa57b-aa00-4866-b31e-df29f7f86480-var-run\") pod \"ovn-controller-s5488\" (UID: \"4a4fa57b-aa00-4866-b31e-df29f7f86480\") " pod="openstack/ovn-controller-s5488" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.099111 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a4fa57b-aa00-4866-b31e-df29f7f86480-combined-ca-bundle\") pod \"ovn-controller-s5488\" (UID: \"4a4fa57b-aa00-4866-b31e-df29f7f86480\") " pod="openstack/ovn-controller-s5488" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.099125 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a4fa57b-aa00-4866-b31e-df29f7f86480-ovn-controller-tls-certs\") pod \"ovn-controller-s5488\" (UID: \"4a4fa57b-aa00-4866-b31e-df29f7f86480\") " pod="openstack/ovn-controller-s5488" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.099147 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ngp2\" (UniqueName: \"kubernetes.io/projected/4a4fa57b-aa00-4866-b31e-df29f7f86480-kube-api-access-9ngp2\") pod \"ovn-controller-s5488\" (UID: \"4a4fa57b-aa00-4866-b31e-df29f7f86480\") " pod="openstack/ovn-controller-s5488" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.099169 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a4fa57b-aa00-4866-b31e-df29f7f86480-scripts\") pod \"ovn-controller-s5488\" (UID: \"4a4fa57b-aa00-4866-b31e-df29f7f86480\") " pod="openstack/ovn-controller-s5488" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.099196 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-run\" (UniqueName: \"kubernetes.io/host-path/542da555-4f39-4dff-b378-5306135244db-var-run\") pod \"ovn-controller-ovs-5t6ds\" (UID: \"542da555-4f39-4dff-b378-5306135244db\") " pod="openstack/ovn-controller-ovs-5t6ds" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.099232 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/542da555-4f39-4dff-b378-5306135244db-var-lib\") pod \"ovn-controller-ovs-5t6ds\" (UID: \"542da555-4f39-4dff-b378-5306135244db\") " pod="openstack/ovn-controller-ovs-5t6ds" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.099247 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a4fa57b-aa00-4866-b31e-df29f7f86480-var-run-ovn\") pod \"ovn-controller-s5488\" (UID: \"4a4fa57b-aa00-4866-b31e-df29f7f86480\") " pod="openstack/ovn-controller-s5488" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.099720 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a4fa57b-aa00-4866-b31e-df29f7f86480-var-run-ovn\") pod \"ovn-controller-s5488\" (UID: \"4a4fa57b-aa00-4866-b31e-df29f7f86480\") " pod="openstack/ovn-controller-s5488" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.099814 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4a4fa57b-aa00-4866-b31e-df29f7f86480-var-log-ovn\") pod \"ovn-controller-s5488\" (UID: \"4a4fa57b-aa00-4866-b31e-df29f7f86480\") " pod="openstack/ovn-controller-s5488" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.100369 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4a4fa57b-aa00-4866-b31e-df29f7f86480-var-run\") pod \"ovn-controller-s5488\" (UID: \"4a4fa57b-aa00-4866-b31e-df29f7f86480\") " pod="openstack/ovn-controller-s5488" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.101567 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a4fa57b-aa00-4866-b31e-df29f7f86480-scripts\") pod \"ovn-controller-s5488\" (UID: \"4a4fa57b-aa00-4866-b31e-df29f7f86480\") " pod="openstack/ovn-controller-s5488" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.105674 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a4fa57b-aa00-4866-b31e-df29f7f86480-combined-ca-bundle\") pod \"ovn-controller-s5488\" (UID: \"4a4fa57b-aa00-4866-b31e-df29f7f86480\") " pod="openstack/ovn-controller-s5488" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.106478 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a4fa57b-aa00-4866-b31e-df29f7f86480-ovn-controller-tls-certs\") pod \"ovn-controller-s5488\" (UID: \"4a4fa57b-aa00-4866-b31e-df29f7f86480\") " pod="openstack/ovn-controller-s5488" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.119008 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ngp2\" (UniqueName: \"kubernetes.io/projected/4a4fa57b-aa00-4866-b31e-df29f7f86480-kube-api-access-9ngp2\") pod \"ovn-controller-s5488\" (UID: \"4a4fa57b-aa00-4866-b31e-df29f7f86480\") " pod="openstack/ovn-controller-s5488" Feb 19 15:27:15 crc 
kubenswrapper[4810]: I0219 15:27:15.200123 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/542da555-4f39-4dff-b378-5306135244db-var-run\") pod \"ovn-controller-ovs-5t6ds\" (UID: \"542da555-4f39-4dff-b378-5306135244db\") " pod="openstack/ovn-controller-ovs-5t6ds" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.200182 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/542da555-4f39-4dff-b378-5306135244db-var-lib\") pod \"ovn-controller-ovs-5t6ds\" (UID: \"542da555-4f39-4dff-b378-5306135244db\") " pod="openstack/ovn-controller-ovs-5t6ds" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.200225 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8jzv\" (UniqueName: \"kubernetes.io/projected/542da555-4f39-4dff-b378-5306135244db-kube-api-access-g8jzv\") pod \"ovn-controller-ovs-5t6ds\" (UID: \"542da555-4f39-4dff-b378-5306135244db\") " pod="openstack/ovn-controller-ovs-5t6ds" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.200254 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/542da555-4f39-4dff-b378-5306135244db-etc-ovs\") pod \"ovn-controller-ovs-5t6ds\" (UID: \"542da555-4f39-4dff-b378-5306135244db\") " pod="openstack/ovn-controller-ovs-5t6ds" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.200272 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/542da555-4f39-4dff-b378-5306135244db-scripts\") pod \"ovn-controller-ovs-5t6ds\" (UID: \"542da555-4f39-4dff-b378-5306135244db\") " pod="openstack/ovn-controller-ovs-5t6ds" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.200287 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/542da555-4f39-4dff-b378-5306135244db-var-log\") pod \"ovn-controller-ovs-5t6ds\" (UID: \"542da555-4f39-4dff-b378-5306135244db\") " pod="openstack/ovn-controller-ovs-5t6ds" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.200484 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/542da555-4f39-4dff-b378-5306135244db-var-lib\") pod \"ovn-controller-ovs-5t6ds\" (UID: \"542da555-4f39-4dff-b378-5306135244db\") " pod="openstack/ovn-controller-ovs-5t6ds" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.200520 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/542da555-4f39-4dff-b378-5306135244db-etc-ovs\") pod \"ovn-controller-ovs-5t6ds\" (UID: \"542da555-4f39-4dff-b378-5306135244db\") " pod="openstack/ovn-controller-ovs-5t6ds" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.200620 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/542da555-4f39-4dff-b378-5306135244db-var-run\") pod \"ovn-controller-ovs-5t6ds\" (UID: \"542da555-4f39-4dff-b378-5306135244db\") " pod="openstack/ovn-controller-ovs-5t6ds" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.202122 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/542da555-4f39-4dff-b378-5306135244db-scripts\") pod \"ovn-controller-ovs-5t6ds\" 
(UID: \"542da555-4f39-4dff-b378-5306135244db\") " pod="openstack/ovn-controller-ovs-5t6ds" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.202245 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/542da555-4f39-4dff-b378-5306135244db-var-log\") pod \"ovn-controller-ovs-5t6ds\" (UID: \"542da555-4f39-4dff-b378-5306135244db\") " pod="openstack/ovn-controller-ovs-5t6ds" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.232365 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8jzv\" (UniqueName: \"kubernetes.io/projected/542da555-4f39-4dff-b378-5306135244db-kube-api-access-g8jzv\") pod \"ovn-controller-ovs-5t6ds\" (UID: \"542da555-4f39-4dff-b378-5306135244db\") " pod="openstack/ovn-controller-ovs-5t6ds" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.296112 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s5488" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.304303 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.314535 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-5t6ds" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.322987 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.324785 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.330187 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-pnhq5" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.330368 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.330610 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.330725 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.330923 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.341270 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.506917 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bdffb5e6-13bb-4c08-ad3c-52d8ded85431-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bdffb5e6-13bb-4c08-ad3c-52d8ded85431\") " pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.506995 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bdffb5e6-13bb-4c08-ad3c-52d8ded85431-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bdffb5e6-13bb-4c08-ad3c-52d8ded85431\") " pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.507024 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdffb5e6-13bb-4c08-ad3c-52d8ded85431-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bdffb5e6-13bb-4c08-ad3c-52d8ded85431\") " pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.507073 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdffb5e6-13bb-4c08-ad3c-52d8ded85431-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bdffb5e6-13bb-4c08-ad3c-52d8ded85431\") " pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.507114 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdffb5e6-13bb-4c08-ad3c-52d8ded85431-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bdffb5e6-13bb-4c08-ad3c-52d8ded85431\") " pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.507308 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdffb5e6-13bb-4c08-ad3c-52d8ded85431-config\") pod \"ovsdbserver-nb-0\" (UID: \"bdffb5e6-13bb-4c08-ad3c-52d8ded85431\") " pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.507488 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"bdffb5e6-13bb-4c08-ad3c-52d8ded85431\") " pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.507640 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9lj8\" (UniqueName: \"kubernetes.io/projected/bdffb5e6-13bb-4c08-ad3c-52d8ded85431-kube-api-access-b9lj8\") pod \"ovsdbserver-nb-0\" (UID: \"bdffb5e6-13bb-4c08-ad3c-52d8ded85431\") " pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.609259 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9lj8\" (UniqueName: \"kubernetes.io/projected/bdffb5e6-13bb-4c08-ad3c-52d8ded85431-kube-api-access-b9lj8\") pod \"ovsdbserver-nb-0\" (UID: \"bdffb5e6-13bb-4c08-ad3c-52d8ded85431\") " pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.609336 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bdffb5e6-13bb-4c08-ad3c-52d8ded85431-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bdffb5e6-13bb-4c08-ad3c-52d8ded85431\") " pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.609385 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bdffb5e6-13bb-4c08-ad3c-52d8ded85431-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bdffb5e6-13bb-4c08-ad3c-52d8ded85431\") " pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.609405 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bdffb5e6-13bb-4c08-ad3c-52d8ded85431-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bdffb5e6-13bb-4c08-ad3c-52d8ded85431\") " pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.609431 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdffb5e6-13bb-4c08-ad3c-52d8ded85431-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bdffb5e6-13bb-4c08-ad3c-52d8ded85431\") " pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.609464 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdffb5e6-13bb-4c08-ad3c-52d8ded85431-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bdffb5e6-13bb-4c08-ad3c-52d8ded85431\") " pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.609485 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdffb5e6-13bb-4c08-ad3c-52d8ded85431-config\") pod \"ovsdbserver-nb-0\" (UID: \"bdffb5e6-13bb-4c08-ad3c-52d8ded85431\") " pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.609521 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"bdffb5e6-13bb-4c08-ad3c-52d8ded85431\") " pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.609837 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"bdffb5e6-13bb-4c08-ad3c-52d8ded85431\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.610595 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bdffb5e6-13bb-4c08-ad3c-52d8ded85431-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bdffb5e6-13bb-4c08-ad3c-52d8ded85431\") " pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.610983 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdffb5e6-13bb-4c08-ad3c-52d8ded85431-config\") pod \"ovsdbserver-nb-0\" (UID: \"bdffb5e6-13bb-4c08-ad3c-52d8ded85431\") " pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.611309 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bdffb5e6-13bb-4c08-ad3c-52d8ded85431-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bdffb5e6-13bb-4c08-ad3c-52d8ded85431\") " pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.613302 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdffb5e6-13bb-4c08-ad3c-52d8ded85431-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bdffb5e6-13bb-4c08-ad3c-52d8ded85431\") " pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.613837 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdffb5e6-13bb-4c08-ad3c-52d8ded85431-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bdffb5e6-13bb-4c08-ad3c-52d8ded85431\") " pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.617563 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdffb5e6-13bb-4c08-ad3c-52d8ded85431-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bdffb5e6-13bb-4c08-ad3c-52d8ded85431\") " pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.629771 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"bdffb5e6-13bb-4c08-ad3c-52d8ded85431\") " pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.635220 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9lj8\" (UniqueName: \"kubernetes.io/projected/bdffb5e6-13bb-4c08-ad3c-52d8ded85431-kube-api-access-b9lj8\") pod \"ovsdbserver-nb-0\" (UID: \"bdffb5e6-13bb-4c08-ad3c-52d8ded85431\") " pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:15 crc kubenswrapper[4810]: I0219 15:27:15.653179 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.455058 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.461522 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.465082 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.465266 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-rtzwd" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.465398 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.467020 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.469349 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.559809 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b985124-01b7-430c-b5ea-b9fd095e5f5e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5b985124-01b7-430c-b5ea-b9fd095e5f5e\") " pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.559943 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b985124-01b7-430c-b5ea-b9fd095e5f5e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5b985124-01b7-430c-b5ea-b9fd095e5f5e\") " pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.559983 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b985124-01b7-430c-b5ea-b9fd095e5f5e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5b985124-01b7-430c-b5ea-b9fd095e5f5e\") " pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.560003 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b985124-01b7-430c-b5ea-b9fd095e5f5e-config\") pod \"ovsdbserver-sb-0\" (UID: \"5b985124-01b7-430c-b5ea-b9fd095e5f5e\") " pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.560029 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxnn4\" (UniqueName: \"kubernetes.io/projected/5b985124-01b7-430c-b5ea-b9fd095e5f5e-kube-api-access-fxnn4\") pod \"ovsdbserver-sb-0\" (UID: \"5b985124-01b7-430c-b5ea-b9fd095e5f5e\") " pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.560081 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5b985124-01b7-430c-b5ea-b9fd095e5f5e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5b985124-01b7-430c-b5ea-b9fd095e5f5e\") " pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.560316 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"5b985124-01b7-430c-b5ea-b9fd095e5f5e\") " pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.560398 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5b985124-01b7-430c-b5ea-b9fd095e5f5e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5b985124-01b7-430c-b5ea-b9fd095e5f5e\") " pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.662756 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b985124-01b7-430c-b5ea-b9fd095e5f5e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5b985124-01b7-430c-b5ea-b9fd095e5f5e\") " pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.662835 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b985124-01b7-430c-b5ea-b9fd095e5f5e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5b985124-01b7-430c-b5ea-b9fd095e5f5e\") " pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.662860 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b985124-01b7-430c-b5ea-b9fd095e5f5e-config\") pod \"ovsdbserver-sb-0\" (UID: \"5b985124-01b7-430c-b5ea-b9fd095e5f5e\") " pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.662887 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxnn4\" (UniqueName: 
\"kubernetes.io/projected/5b985124-01b7-430c-b5ea-b9fd095e5f5e-kube-api-access-fxnn4\") pod \"ovsdbserver-sb-0\" (UID: \"5b985124-01b7-430c-b5ea-b9fd095e5f5e\") " pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.662918 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5b985124-01b7-430c-b5ea-b9fd095e5f5e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5b985124-01b7-430c-b5ea-b9fd095e5f5e\") " pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.662973 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"5b985124-01b7-430c-b5ea-b9fd095e5f5e\") " pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.662998 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5b985124-01b7-430c-b5ea-b9fd095e5f5e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5b985124-01b7-430c-b5ea-b9fd095e5f5e\") " pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.663019 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b985124-01b7-430c-b5ea-b9fd095e5f5e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5b985124-01b7-430c-b5ea-b9fd095e5f5e\") " pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.664310 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"5b985124-01b7-430c-b5ea-b9fd095e5f5e\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.666060 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5b985124-01b7-430c-b5ea-b9fd095e5f5e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5b985124-01b7-430c-b5ea-b9fd095e5f5e\") " pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.666430 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5b985124-01b7-430c-b5ea-b9fd095e5f5e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5b985124-01b7-430c-b5ea-b9fd095e5f5e\") " pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.668021 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b985124-01b7-430c-b5ea-b9fd095e5f5e-config\") pod \"ovsdbserver-sb-0\" (UID: \"5b985124-01b7-430c-b5ea-b9fd095e5f5e\") " pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.673759 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b985124-01b7-430c-b5ea-b9fd095e5f5e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5b985124-01b7-430c-b5ea-b9fd095e5f5e\") " pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.678261 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b985124-01b7-430c-b5ea-b9fd095e5f5e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5b985124-01b7-430c-b5ea-b9fd095e5f5e\") " pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.681270 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b985124-01b7-430c-b5ea-b9fd095e5f5e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5b985124-01b7-430c-b5ea-b9fd095e5f5e\") " pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.681732 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxnn4\" (UniqueName: \"kubernetes.io/projected/5b985124-01b7-430c-b5ea-b9fd095e5f5e-kube-api-access-fxnn4\") pod \"ovsdbserver-sb-0\" (UID: \"5b985124-01b7-430c-b5ea-b9fd095e5f5e\") " pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.688388 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"5b985124-01b7-430c-b5ea-b9fd095e5f5e\") " pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:18 crc kubenswrapper[4810]: I0219 15:27:18.795979 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:19 crc kubenswrapper[4810]: W0219 15:27:19.713611 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb773d46_7b9f_4ca4_b33c_9b800b9eafd7.slice/crio-af1bedfbf1e8ee4640f8b3f2501f92ef9cf3bd99570f284d000bc1d9845edff2 WatchSource:0}: Error finding container af1bedfbf1e8ee4640f8b3f2501f92ef9cf3bd99570f284d000bc1d9845edff2: Status 404 returned error can't find the container with id af1bedfbf1e8ee4640f8b3f2501f92ef9cf3bd99570f284d000bc1d9845edff2 Feb 19 15:27:20 crc kubenswrapper[4810]: I0219 15:27:20.050148 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 15:27:20 crc kubenswrapper[4810]: I0219 15:27:20.285658 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 15:27:20 crc kubenswrapper[4810]: I0219 15:27:20.376264 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 15:27:20 crc kubenswrapper[4810]: I0219 15:27:20.383460 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"eb773d46-7b9f-4ca4-b33c-9b800b9eafd7","Type":"ContainerStarted","Data":"af1bedfbf1e8ee4640f8b3f2501f92ef9cf3bd99570f284d000bc1d9845edff2"} Feb 19 15:27:20 crc kubenswrapper[4810]: I0219 15:27:20.389022 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 15:27:20 crc kubenswrapper[4810]: I0219 15:27:20.396517 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/notifications-rabbitmq-server-0"] Feb 19 15:27:20 crc kubenswrapper[4810]: I0219 15:27:20.489440 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fdbdbc8cc-mddqt"] Feb 19 15:27:20 crc kubenswrapper[4810]: W0219 15:27:20.803604 4810 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a3676ed_f06f_4dea_82a1_959716331113.slice/crio-65b3bafec6c943249f491880afc4a9c1d426f050515e89a7aeb5d3ee771259c3 WatchSource:0}: Error finding container 65b3bafec6c943249f491880afc4a9c1d426f050515e89a7aeb5d3ee771259c3: Status 404 returned error can't find the container with id 65b3bafec6c943249f491880afc4a9c1d426f050515e89a7aeb5d3ee771259c3 Feb 19 15:27:20 crc kubenswrapper[4810]: W0219 15:27:20.807289 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0ffb8ce_a356_4416_b96c_49db30ff1947.slice/crio-5113de562efb8ddfc222a7c7d8fa47d8888f899d7ace9235021e85e325a0f5e6 WatchSource:0}: Error finding container 5113de562efb8ddfc222a7c7d8fa47d8888f899d7ace9235021e85e325a0f5e6: Status 404 returned error can't find the container with id 5113de562efb8ddfc222a7c7d8fa47d8888f899d7ace9235021e85e325a0f5e6 Feb 19 15:27:20 crc kubenswrapper[4810]: E0219 15:27:20.812682 4810 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.159:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Feb 19 15:27:20 crc kubenswrapper[4810]: E0219 15:27:20.812727 4810 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.159:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Feb 19 15:27:20 crc kubenswrapper[4810]: E0219 15:27:20.812847 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.159:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-slwdg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5f48d6b889-hmnjw_openstack(731d3bd2-70ab-4ec0-b574-a00042d0b3b2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 15:27:20 crc kubenswrapper[4810]: E0219 15:27:20.814254 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5f48d6b889-hmnjw" podUID="731d3bd2-70ab-4ec0-b574-a00042d0b3b2" Feb 19 15:27:20 crc kubenswrapper[4810]: E0219 15:27:20.833007 4810 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.159:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Feb 19 15:27:20 crc kubenswrapper[4810]: E0219 15:27:20.833058 4810 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.159:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Feb 19 15:27:20 crc kubenswrapper[4810]: E0219 15:27:20.833164 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.159:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5gkxm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6cf7b9b6b9-4h67z_openstack(de8cf274-42f1-4c36-bca7-1d622bf61898): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 15:27:20 crc kubenswrapper[4810]: E0219 15:27:20.834379 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6cf7b9b6b9-4h67z" podUID="de8cf274-42f1-4c36-bca7-1d622bf61898" Feb 19 15:27:21 crc kubenswrapper[4810]: I0219 15:27:21.285814 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c99647bb5-xkrgb"] Feb 19 15:27:21 crc kubenswrapper[4810]: I0219 15:27:21.376312 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 15:27:21 crc kubenswrapper[4810]: I0219 15:27:21.385037 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 15:27:21 crc kubenswrapper[4810]: I0219 15:27:21.391209 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c","Type":"ContainerStarted","Data":"5c41be9999ee30bb993e3bad3c961a576efb4f2be70057a47eb8147e5881b09c"} Feb 19 15:27:21 crc kubenswrapper[4810]: I0219 15:27:21.392265 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"097bc4d1-5648-4607-9c49-286e4bbbe553","Type":"ContainerStarted","Data":"7d6bd84bead9eb4536dd357117c11ae4d96e35d9a18c032d07c64f477200a6eb"} Feb 19 15:27:21 crc kubenswrapper[4810]: I0219 15:27:21.393051 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"00bcfb03-4357-4343-99a5-30dc7f25abe9","Type":"ContainerStarted","Data":"438e5fcabfeda7b104ccc004754827e00367d2ec7bbb19edfefdf5cb049ee1ce"} Feb 19 15:27:21 crc kubenswrapper[4810]: I0219 
15:27:21.393644 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2a3676ed-f06f-4dea-82a1-959716331113","Type":"ContainerStarted","Data":"65b3bafec6c943249f491880afc4a9c1d426f050515e89a7aeb5d3ee771259c3"} Feb 19 15:27:21 crc kubenswrapper[4810]: I0219 15:27:21.394347 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c0ffb8ce-a356-4416-b96c-49db30ff1947","Type":"ContainerStarted","Data":"5113de562efb8ddfc222a7c7d8fa47d8888f899d7ace9235021e85e325a0f5e6"} Feb 19 15:27:21 crc kubenswrapper[4810]: I0219 15:27:21.395129 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdbdbc8cc-mddqt" event={"ID":"64a59633-cb6f-4631-a980-566894f0ce35","Type":"ContainerStarted","Data":"97bfa68e77f4ab8ae2a59549a783cdf08404be22506f85bef749cb5bf6fbe5cf"} Feb 19 15:27:21 crc kubenswrapper[4810]: W0219 15:27:21.649870 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeec01803_2cfd_4e97_a1c0_216c2622e913.slice/crio-d66ef65309b98fdfd5a68e7c80449beb1e0f777f1d9cb1877aa1054da73700ab WatchSource:0}: Error finding container d66ef65309b98fdfd5a68e7c80449beb1e0f777f1d9cb1877aa1054da73700ab: Status 404 returned error can't find the container with id d66ef65309b98fdfd5a68e7c80449beb1e0f777f1d9cb1877aa1054da73700ab Feb 19 15:27:21 crc kubenswrapper[4810]: W0219 15:27:21.657933 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda220bc57_3f31_4851_ad5c_9f61359f7de5.slice/crio-2e7a3617c56f7d96899fe573b10e0613d97d13afc38ab2fb9c62813a642860d3 WatchSource:0}: Error finding container 2e7a3617c56f7d96899fe573b10e0613d97d13afc38ab2fb9c62813a642860d3: Status 404 returned error can't find the container with id 2e7a3617c56f7d96899fe573b10e0613d97d13afc38ab2fb9c62813a642860d3 Feb 19 15:27:21 crc kubenswrapper[4810]: I0219 15:27:21.703990 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cf7b9b6b9-4h67z" Feb 19 15:27:21 crc kubenswrapper[4810]: I0219 15:27:21.751530 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f48d6b889-hmnjw" Feb 19 15:27:21 crc kubenswrapper[4810]: I0219 15:27:21.817688 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 15:27:21 crc kubenswrapper[4810]: I0219 15:27:21.824248 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/731d3bd2-70ab-4ec0-b574-a00042d0b3b2-dns-svc\") pod \"731d3bd2-70ab-4ec0-b574-a00042d0b3b2\" (UID: \"731d3bd2-70ab-4ec0-b574-a00042d0b3b2\") " Feb 19 15:27:21 crc kubenswrapper[4810]: I0219 15:27:21.824345 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/731d3bd2-70ab-4ec0-b574-a00042d0b3b2-config\") pod \"731d3bd2-70ab-4ec0-b574-a00042d0b3b2\" (UID: \"731d3bd2-70ab-4ec0-b574-a00042d0b3b2\") " Feb 19 15:27:21 crc kubenswrapper[4810]: I0219 15:27:21.824557 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de8cf274-42f1-4c36-bca7-1d622bf61898-config\") pod \"de8cf274-42f1-4c36-bca7-1d622bf61898\" (UID: \"de8cf274-42f1-4c36-bca7-1d622bf61898\") " Feb 19 15:27:21 crc kubenswrapper[4810]: I0219 15:27:21.824599 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gkxm\" (UniqueName: \"kubernetes.io/projected/de8cf274-42f1-4c36-bca7-1d622bf61898-kube-api-access-5gkxm\") pod \"de8cf274-42f1-4c36-bca7-1d622bf61898\" (UID: \"de8cf274-42f1-4c36-bca7-1d622bf61898\") " Feb 19 15:27:21 crc kubenswrapper[4810]: I0219 15:27:21.824639 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slwdg\" (UniqueName: \"kubernetes.io/projected/731d3bd2-70ab-4ec0-b574-a00042d0b3b2-kube-api-access-slwdg\") pod \"731d3bd2-70ab-4ec0-b574-a00042d0b3b2\" (UID: \"731d3bd2-70ab-4ec0-b574-a00042d0b3b2\") " Feb 19 15:27:21 crc kubenswrapper[4810]: I0219 15:27:21.825021 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de8cf274-42f1-4c36-bca7-1d622bf61898-config" (OuterVolumeSpecName: "config") pod "de8cf274-42f1-4c36-bca7-1d622bf61898" (UID: "de8cf274-42f1-4c36-bca7-1d622bf61898"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:27:21 crc kubenswrapper[4810]: I0219 15:27:21.825592 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/731d3bd2-70ab-4ec0-b574-a00042d0b3b2-config" (OuterVolumeSpecName: "config") pod "731d3bd2-70ab-4ec0-b574-a00042d0b3b2" (UID: "731d3bd2-70ab-4ec0-b574-a00042d0b3b2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:27:21 crc kubenswrapper[4810]: I0219 15:27:21.825765 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/731d3bd2-70ab-4ec0-b574-a00042d0b3b2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "731d3bd2-70ab-4ec0-b574-a00042d0b3b2" (UID: "731d3bd2-70ab-4ec0-b574-a00042d0b3b2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:27:21 crc kubenswrapper[4810]: I0219 15:27:21.828338 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/731d3bd2-70ab-4ec0-b574-a00042d0b3b2-kube-api-access-slwdg" (OuterVolumeSpecName: "kube-api-access-slwdg") pod "731d3bd2-70ab-4ec0-b574-a00042d0b3b2" (UID: "731d3bd2-70ab-4ec0-b574-a00042d0b3b2"). InnerVolumeSpecName "kube-api-access-slwdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:27:21 crc kubenswrapper[4810]: I0219 15:27:21.834563 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de8cf274-42f1-4c36-bca7-1d622bf61898-kube-api-access-5gkxm" (OuterVolumeSpecName: "kube-api-access-5gkxm") pod "de8cf274-42f1-4c36-bca7-1d622bf61898" (UID: "de8cf274-42f1-4c36-bca7-1d622bf61898"). InnerVolumeSpecName "kube-api-access-5gkxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:27:21 crc kubenswrapper[4810]: I0219 15:27:21.933545 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/731d3bd2-70ab-4ec0-b574-a00042d0b3b2-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:21 crc kubenswrapper[4810]: I0219 15:27:21.933862 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de8cf274-42f1-4c36-bca7-1d622bf61898-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:21 crc kubenswrapper[4810]: I0219 15:27:21.933876 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gkxm\" (UniqueName: \"kubernetes.io/projected/de8cf274-42f1-4c36-bca7-1d622bf61898-kube-api-access-5gkxm\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:21 crc kubenswrapper[4810]: I0219 15:27:21.933887 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slwdg\" (UniqueName: \"kubernetes.io/projected/731d3bd2-70ab-4ec0-b574-a00042d0b3b2-kube-api-access-slwdg\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:21 crc kubenswrapper[4810]: I0219 15:27:21.933895 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/731d3bd2-70ab-4ec0-b574-a00042d0b3b2-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:22 crc kubenswrapper[4810]: I0219 15:27:22.080147 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s5488"] Feb 19 15:27:22 crc kubenswrapper[4810]: I0219 15:27:22.403446 4810 generic.go:334] "Generic (PLEG): container finished" podID="64a59633-cb6f-4631-a980-566894f0ce35" containerID="1d10f5e352636a23ee8d873911fea2dc7821da75356614e2daef6d4813ea231e" exitCode=0 Feb 19 15:27:22 crc kubenswrapper[4810]: I0219 15:27:22.403509 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdbdbc8cc-mddqt" event={"ID":"64a59633-cb6f-4631-a980-566894f0ce35","Type":"ContainerDied","Data":"1d10f5e352636a23ee8d873911fea2dc7821da75356614e2daef6d4813ea231e"} Feb 19 15:27:22 crc kubenswrapper[4810]: I0219 15:27:22.405184 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s5488" event={"ID":"4a4fa57b-aa00-4866-b31e-df29f7f86480","Type":"ContainerStarted","Data":"b2c5a51b8f9fcfed99cc3efa2e88a6ae77828e6048de9d2b7bed0303f6313484"} Feb 19 15:27:22 crc kubenswrapper[4810]: I0219 15:27:22.406584 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"a220bc57-3f31-4851-ad5c-9f61359f7de5","Type":"ContainerStarted","Data":"2e7a3617c56f7d96899fe573b10e0613d97d13afc38ab2fb9c62813a642860d3"} Feb 19 15:27:22 crc kubenswrapper[4810]: I0219 15:27:22.408408 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"eb773d46-7b9f-4ca4-b33c-9b800b9eafd7","Type":"ContainerStarted","Data":"956da0f3b539a4118dc7e177ffd26a630d9ad9bfed5f3294c24c5821c882bc03"} Feb 19 15:27:22 crc kubenswrapper[4810]: I0219 15:27:22.408526 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 19 15:27:22 crc kubenswrapper[4810]: I0219 15:27:22.409723 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"30d11a24-9722-4e7a-9be5-f2bd00128167","Type":"ContainerStarted","Data":"463dc22f7557701fb1c6bdef718f136b22b3fd0a388a2348e4f6c48650f5916a"} Feb 19 15:27:22 crc kubenswrapper[4810]: I0219 15:27:22.411105 4810 generic.go:334] "Generic (PLEG): container finished" podID="eec01803-2cfd-4e97-a1c0-216c2622e913" containerID="ed90cdf99f920236e6f409e68476a3cd86bdc892ec839211cda663eab550964c" exitCode=0 Feb 19 15:27:22 crc kubenswrapper[4810]: I0219 15:27:22.411184 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c99647bb5-xkrgb" event={"ID":"eec01803-2cfd-4e97-a1c0-216c2622e913","Type":"ContainerDied","Data":"ed90cdf99f920236e6f409e68476a3cd86bdc892ec839211cda663eab550964c"} Feb 19 15:27:22 crc kubenswrapper[4810]: I0219 15:27:22.411203 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c99647bb5-xkrgb" event={"ID":"eec01803-2cfd-4e97-a1c0-216c2622e913","Type":"ContainerStarted","Data":"d66ef65309b98fdfd5a68e7c80449beb1e0f777f1d9cb1877aa1054da73700ab"} Feb 19 15:27:22 crc kubenswrapper[4810]: I0219 15:27:22.412293 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cf7b9b6b9-4h67z" event={"ID":"de8cf274-42f1-4c36-bca7-1d622bf61898","Type":"ContainerDied","Data":"14a7ad043bd3a14329d8f25870e9bf1129047ecea007a33d63f7b5431ba94745"} Feb 19 15:27:22 crc kubenswrapper[4810]: I0219 15:27:22.412356 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cf7b9b6b9-4h67z" Feb 19 15:27:22 crc kubenswrapper[4810]: I0219 15:27:22.413621 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f48d6b889-hmnjw" Feb 19 15:27:22 crc kubenswrapper[4810]: I0219 15:27:22.413675 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f48d6b889-hmnjw" event={"ID":"731d3bd2-70ab-4ec0-b574-a00042d0b3b2","Type":"ContainerDied","Data":"4b12e6f7653c91c7a79e0ac9f496ebbaccb1093a59dc2281bd92a962e34bac03"} Feb 19 15:27:22 crc kubenswrapper[4810]: I0219 15:27:22.414707 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"5b985124-01b7-430c-b5ea-b9fd095e5f5e","Type":"ContainerStarted","Data":"54a122a98e003d93dcaba0ff09a6daeac31c1f44f71a905a667894948962b204"} Feb 19 15:27:22 crc kubenswrapper[4810]: I0219 15:27:22.461545 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=11.410523754 podStartE2EDuration="13.461526548s" podCreationTimestamp="2026-02-19 15:27:09 +0000 UTC" firstStartedPulling="2026-02-19 15:27:19.735310706 +0000 UTC m=+1069.217340870" lastFinishedPulling="2026-02-19 15:27:21.78631354 +0000 UTC m=+1071.268343664" observedRunningTime="2026-02-19 15:27:22.444611644 +0000 UTC m=+1071.926641768" watchObservedRunningTime="2026-02-19 15:27:22.461526548 +0000 UTC m=+1071.943556672" Feb 19 15:27:22 crc kubenswrapper[4810]: I0219 15:27:22.507159 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f48d6b889-hmnjw"] Feb 19 15:27:22 crc kubenswrapper[4810]: I0219 15:27:22.518381 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f48d6b889-hmnjw"] Feb 19 15:27:22 crc kubenswrapper[4810]: I0219 15:27:22.531502 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cf7b9b6b9-4h67z"] Feb 19 15:27:22 crc kubenswrapper[4810]: I0219 15:27:22.536869 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cf7b9b6b9-4h67z"] Feb 19 15:27:22 crc kubenswrapper[4810]: I0219 15:27:22.745521 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-5t6ds"] Feb 19 15:27:22 crc kubenswrapper[4810]: I0219 15:27:22.850728 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 15:27:22 crc kubenswrapper[4810]: W0219 15:27:22.875772 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdffb5e6_13bb_4c08_ad3c_52d8ded85431.slice/crio-e3c4b05e0858c37dce877a21389cd22932bcfb83ffb5f5f09b3a089cd54a1b1c WatchSource:0}: Error finding container e3c4b05e0858c37dce877a21389cd22932bcfb83ffb5f5f09b3a089cd54a1b1c: Status 404 returned error can't find the container with id e3c4b05e0858c37dce877a21389cd22932bcfb83ffb5f5f09b3a089cd54a1b1c Feb 19 15:27:23 crc kubenswrapper[4810]: I0219 15:27:23.426027 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"bdffb5e6-13bb-4c08-ad3c-52d8ded85431","Type":"ContainerStarted","Data":"e3c4b05e0858c37dce877a21389cd22932bcfb83ffb5f5f09b3a089cd54a1b1c"} Feb 19 15:27:23 crc kubenswrapper[4810]: I0219 15:27:23.427493 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5t6ds" event={"ID":"542da555-4f39-4dff-b378-5306135244db","Type":"ContainerStarted","Data":"f40eb961af62f3727d6983a9058db071af156eff93cc9928b5cf97c5866909ab"} Feb 19 15:27:23 crc kubenswrapper[4810]: I0219 15:27:23.449121 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="731d3bd2-70ab-4ec0-b574-a00042d0b3b2" path="/var/lib/kubelet/pods/731d3bd2-70ab-4ec0-b574-a00042d0b3b2/volumes" Feb 19 15:27:23 crc kubenswrapper[4810]: I0219 15:27:23.449660 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de8cf274-42f1-4c36-bca7-1d622bf61898" path="/var/lib/kubelet/pods/de8cf274-42f1-4c36-bca7-1d622bf61898/volumes" Feb 19 15:27:27 crc kubenswrapper[4810]: I0219 15:27:27.809700 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-tbt28"] Feb 19 15:27:27 crc kubenswrapper[4810]: I0219 15:27:27.813572 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-tbt28" Feb 19 15:27:27 crc kubenswrapper[4810]: I0219 15:27:27.822858 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 19 15:27:27 crc kubenswrapper[4810]: I0219 15:27:27.825999 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-tbt28"] Feb 19 15:27:27 crc kubenswrapper[4810]: I0219 15:27:27.930268 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c03aad2b-8ca1-4310-8c11-3287fafcd66f-config\") pod \"ovn-controller-metrics-tbt28\" (UID: \"c03aad2b-8ca1-4310-8c11-3287fafcd66f\") " pod="openstack/ovn-controller-metrics-tbt28" Feb 19 15:27:27 crc kubenswrapper[4810]: I0219 15:27:27.930311 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c03aad2b-8ca1-4310-8c11-3287fafcd66f-ovs-rundir\") pod \"ovn-controller-metrics-tbt28\" (UID: \"c03aad2b-8ca1-4310-8c11-3287fafcd66f\") " pod="openstack/ovn-controller-metrics-tbt28" Feb 19 15:27:27 crc kubenswrapper[4810]: I0219 15:27:27.930353 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c03aad2b-8ca1-4310-8c11-3287fafcd66f-ovn-rundir\") pod \"ovn-controller-metrics-tbt28\" (UID: \"c03aad2b-8ca1-4310-8c11-3287fafcd66f\") " pod="openstack/ovn-controller-metrics-tbt28" Feb 19 15:27:27 crc kubenswrapper[4810]: I0219 15:27:27.930371 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khbhr\" (UniqueName: \"kubernetes.io/projected/c03aad2b-8ca1-4310-8c11-3287fafcd66f-kube-api-access-khbhr\") pod \"ovn-controller-metrics-tbt28\" (UID: \"c03aad2b-8ca1-4310-8c11-3287fafcd66f\") " pod="openstack/ovn-controller-metrics-tbt28" Feb 19 15:27:27 crc kubenswrapper[4810]: I0219 15:27:27.930404 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c03aad2b-8ca1-4310-8c11-3287fafcd66f-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-tbt28\" (UID: \"c03aad2b-8ca1-4310-8c11-3287fafcd66f\") " pod="openstack/ovn-controller-metrics-tbt28" Feb 19 15:27:27 crc kubenswrapper[4810]: I0219 15:27:27.930425 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c03aad2b-8ca1-4310-8c11-3287fafcd66f-combined-ca-bundle\") pod \"ovn-controller-metrics-tbt28\" (UID: \"c03aad2b-8ca1-4310-8c11-3287fafcd66f\") " pod="openstack/ovn-controller-metrics-tbt28" Feb 19 15:27:27 crc kubenswrapper[4810]: I0219 
15:27:27.995446 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fdbdbc8cc-mddqt"] Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.039080 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c03aad2b-8ca1-4310-8c11-3287fafcd66f-config\") pod \"ovn-controller-metrics-tbt28\" (UID: \"c03aad2b-8ca1-4310-8c11-3287fafcd66f\") " pod="openstack/ovn-controller-metrics-tbt28" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.039135 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c03aad2b-8ca1-4310-8c11-3287fafcd66f-ovs-rundir\") pod \"ovn-controller-metrics-tbt28\" (UID: \"c03aad2b-8ca1-4310-8c11-3287fafcd66f\") " pod="openstack/ovn-controller-metrics-tbt28" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.039174 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c03aad2b-8ca1-4310-8c11-3287fafcd66f-ovn-rundir\") pod \"ovn-controller-metrics-tbt28\" (UID: \"c03aad2b-8ca1-4310-8c11-3287fafcd66f\") " pod="openstack/ovn-controller-metrics-tbt28" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.039201 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khbhr\" (UniqueName: \"kubernetes.io/projected/c03aad2b-8ca1-4310-8c11-3287fafcd66f-kube-api-access-khbhr\") pod \"ovn-controller-metrics-tbt28\" (UID: \"c03aad2b-8ca1-4310-8c11-3287fafcd66f\") " pod="openstack/ovn-controller-metrics-tbt28" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.039237 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c03aad2b-8ca1-4310-8c11-3287fafcd66f-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-tbt28\" (UID: \"c03aad2b-8ca1-4310-8c11-3287fafcd66f\") " pod="openstack/ovn-controller-metrics-tbt28" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.039274 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c03aad2b-8ca1-4310-8c11-3287fafcd66f-combined-ca-bundle\") pod \"ovn-controller-metrics-tbt28\" (UID: \"c03aad2b-8ca1-4310-8c11-3287fafcd66f\") " pod="openstack/ovn-controller-metrics-tbt28" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.039600 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c03aad2b-8ca1-4310-8c11-3287fafcd66f-ovn-rundir\") pod \"ovn-controller-metrics-tbt28\" (UID: \"c03aad2b-8ca1-4310-8c11-3287fafcd66f\") " pod="openstack/ovn-controller-metrics-tbt28" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.039657 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c03aad2b-8ca1-4310-8c11-3287fafcd66f-ovs-rundir\") pod \"ovn-controller-metrics-tbt28\" (UID: \"c03aad2b-8ca1-4310-8c11-3287fafcd66f\") " pod="openstack/ovn-controller-metrics-tbt28" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.040124 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c03aad2b-8ca1-4310-8c11-3287fafcd66f-config\") pod \"ovn-controller-metrics-tbt28\" (UID: \"c03aad2b-8ca1-4310-8c11-3287fafcd66f\") " 
pod="openstack/ovn-controller-metrics-tbt28" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.049412 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c03aad2b-8ca1-4310-8c11-3287fafcd66f-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-tbt28\" (UID: \"c03aad2b-8ca1-4310-8c11-3287fafcd66f\") " pod="openstack/ovn-controller-metrics-tbt28" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.051505 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58bdb65675-l5l8x"] Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.054079 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58bdb65675-l5l8x" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.061029 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58bdb65675-l5l8x"] Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.065606 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.066689 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khbhr\" (UniqueName: \"kubernetes.io/projected/c03aad2b-8ca1-4310-8c11-3287fafcd66f-kube-api-access-khbhr\") pod \"ovn-controller-metrics-tbt28\" (UID: \"c03aad2b-8ca1-4310-8c11-3287fafcd66f\") " pod="openstack/ovn-controller-metrics-tbt28" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.073995 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c03aad2b-8ca1-4310-8c11-3287fafcd66f-combined-ca-bundle\") pod \"ovn-controller-metrics-tbt28\" (UID: \"c03aad2b-8ca1-4310-8c11-3287fafcd66f\") " pod="openstack/ovn-controller-metrics-tbt28" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.140764 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cef0e580-4d18-480e-a57d-9c1b31405cd8-dns-svc\") pod \"dnsmasq-dns-58bdb65675-l5l8x\" (UID: \"cef0e580-4d18-480e-a57d-9c1b31405cd8\") " pod="openstack/dnsmasq-dns-58bdb65675-l5l8x" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.140846 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cef0e580-4d18-480e-a57d-9c1b31405cd8-ovsdbserver-nb\") pod \"dnsmasq-dns-58bdb65675-l5l8x\" (UID: \"cef0e580-4d18-480e-a57d-9c1b31405cd8\") " pod="openstack/dnsmasq-dns-58bdb65675-l5l8x" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.140907 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w28xm\" (UniqueName: \"kubernetes.io/projected/cef0e580-4d18-480e-a57d-9c1b31405cd8-kube-api-access-w28xm\") pod \"dnsmasq-dns-58bdb65675-l5l8x\" (UID: \"cef0e580-4d18-480e-a57d-9c1b31405cd8\") " pod="openstack/dnsmasq-dns-58bdb65675-l5l8x" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.140960 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cef0e580-4d18-480e-a57d-9c1b31405cd8-config\") pod \"dnsmasq-dns-58bdb65675-l5l8x\" (UID: \"cef0e580-4d18-480e-a57d-9c1b31405cd8\") " pod="openstack/dnsmasq-dns-58bdb65675-l5l8x" Feb 19 15:27:28 crc kubenswrapper[4810]: 
I0219 15:27:28.147073 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-tbt28" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.210169 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c99647bb5-xkrgb"] Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.236186 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f8ff78869-24szl"] Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.237631 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8ff78869-24szl" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.242377 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cef0e580-4d18-480e-a57d-9c1b31405cd8-dns-svc\") pod \"dnsmasq-dns-58bdb65675-l5l8x\" (UID: \"cef0e580-4d18-480e-a57d-9c1b31405cd8\") " pod="openstack/dnsmasq-dns-58bdb65675-l5l8x" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.242460 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cef0e580-4d18-480e-a57d-9c1b31405cd8-ovsdbserver-nb\") pod \"dnsmasq-dns-58bdb65675-l5l8x\" (UID: \"cef0e580-4d18-480e-a57d-9c1b31405cd8\") " pod="openstack/dnsmasq-dns-58bdb65675-l5l8x" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.242526 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w28xm\" (UniqueName: \"kubernetes.io/projected/cef0e580-4d18-480e-a57d-9c1b31405cd8-kube-api-access-w28xm\") pod \"dnsmasq-dns-58bdb65675-l5l8x\" (UID: \"cef0e580-4d18-480e-a57d-9c1b31405cd8\") " pod="openstack/dnsmasq-dns-58bdb65675-l5l8x" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.242579 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cef0e580-4d18-480e-a57d-9c1b31405cd8-config\") pod \"dnsmasq-dns-58bdb65675-l5l8x\" (UID: \"cef0e580-4d18-480e-a57d-9c1b31405cd8\") " pod="openstack/dnsmasq-dns-58bdb65675-l5l8x" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.243208 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cef0e580-4d18-480e-a57d-9c1b31405cd8-dns-svc\") pod \"dnsmasq-dns-58bdb65675-l5l8x\" (UID: \"cef0e580-4d18-480e-a57d-9c1b31405cd8\") " pod="openstack/dnsmasq-dns-58bdb65675-l5l8x" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.243555 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cef0e580-4d18-480e-a57d-9c1b31405cd8-config\") pod \"dnsmasq-dns-58bdb65675-l5l8x\" (UID: \"cef0e580-4d18-480e-a57d-9c1b31405cd8\") " pod="openstack/dnsmasq-dns-58bdb65675-l5l8x" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.243763 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cef0e580-4d18-480e-a57d-9c1b31405cd8-ovsdbserver-nb\") pod \"dnsmasq-dns-58bdb65675-l5l8x\" (UID: \"cef0e580-4d18-480e-a57d-9c1b31405cd8\") " pod="openstack/dnsmasq-dns-58bdb65675-l5l8x" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.246364 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.262908 4810 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f8ff78869-24szl"] Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.276278 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w28xm\" (UniqueName: \"kubernetes.io/projected/cef0e580-4d18-480e-a57d-9c1b31405cd8-kube-api-access-w28xm\") pod \"dnsmasq-dns-58bdb65675-l5l8x\" (UID: \"cef0e580-4d18-480e-a57d-9c1b31405cd8\") " pod="openstack/dnsmasq-dns-58bdb65675-l5l8x" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.343830 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5886c70b-ea09-4a9e-9c31-1689d32735a5-config\") pod \"dnsmasq-dns-6f8ff78869-24szl\" (UID: \"5886c70b-ea09-4a9e-9c31-1689d32735a5\") " pod="openstack/dnsmasq-dns-6f8ff78869-24szl" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.343889 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcqld\" (UniqueName: \"kubernetes.io/projected/5886c70b-ea09-4a9e-9c31-1689d32735a5-kube-api-access-hcqld\") pod \"dnsmasq-dns-6f8ff78869-24szl\" (UID: \"5886c70b-ea09-4a9e-9c31-1689d32735a5\") " pod="openstack/dnsmasq-dns-6f8ff78869-24szl" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.343924 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5886c70b-ea09-4a9e-9c31-1689d32735a5-dns-svc\") pod \"dnsmasq-dns-6f8ff78869-24szl\" (UID: \"5886c70b-ea09-4a9e-9c31-1689d32735a5\") " pod="openstack/dnsmasq-dns-6f8ff78869-24szl" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.344109 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5886c70b-ea09-4a9e-9c31-1689d32735a5-ovsdbserver-sb\") pod \"dnsmasq-dns-6f8ff78869-24szl\" (UID: \"5886c70b-ea09-4a9e-9c31-1689d32735a5\") " pod="openstack/dnsmasq-dns-6f8ff78869-24szl" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.344240 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5886c70b-ea09-4a9e-9c31-1689d32735a5-ovsdbserver-nb\") pod \"dnsmasq-dns-6f8ff78869-24szl\" (UID: \"5886c70b-ea09-4a9e-9c31-1689d32735a5\") " pod="openstack/dnsmasq-dns-6f8ff78869-24szl" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.446653 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5886c70b-ea09-4a9e-9c31-1689d32735a5-dns-svc\") pod \"dnsmasq-dns-6f8ff78869-24szl\" (UID: \"5886c70b-ea09-4a9e-9c31-1689d32735a5\") " pod="openstack/dnsmasq-dns-6f8ff78869-24szl" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.446736 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5886c70b-ea09-4a9e-9c31-1689d32735a5-ovsdbserver-sb\") pod \"dnsmasq-dns-6f8ff78869-24szl\" (UID: \"5886c70b-ea09-4a9e-9c31-1689d32735a5\") " pod="openstack/dnsmasq-dns-6f8ff78869-24szl" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.446789 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5886c70b-ea09-4a9e-9c31-1689d32735a5-ovsdbserver-nb\") pod 
\"dnsmasq-dns-6f8ff78869-24szl\" (UID: \"5886c70b-ea09-4a9e-9c31-1689d32735a5\") " pod="openstack/dnsmasq-dns-6f8ff78869-24szl" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.446820 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5886c70b-ea09-4a9e-9c31-1689d32735a5-config\") pod \"dnsmasq-dns-6f8ff78869-24szl\" (UID: \"5886c70b-ea09-4a9e-9c31-1689d32735a5\") " pod="openstack/dnsmasq-dns-6f8ff78869-24szl" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.446850 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcqld\" (UniqueName: \"kubernetes.io/projected/5886c70b-ea09-4a9e-9c31-1689d32735a5-kube-api-access-hcqld\") pod \"dnsmasq-dns-6f8ff78869-24szl\" (UID: \"5886c70b-ea09-4a9e-9c31-1689d32735a5\") " pod="openstack/dnsmasq-dns-6f8ff78869-24szl" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.448804 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5886c70b-ea09-4a9e-9c31-1689d32735a5-ovsdbserver-sb\") pod \"dnsmasq-dns-6f8ff78869-24szl\" (UID: \"5886c70b-ea09-4a9e-9c31-1689d32735a5\") " pod="openstack/dnsmasq-dns-6f8ff78869-24szl" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.450644 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5886c70b-ea09-4a9e-9c31-1689d32735a5-config\") pod \"dnsmasq-dns-6f8ff78869-24szl\" (UID: \"5886c70b-ea09-4a9e-9c31-1689d32735a5\") " pod="openstack/dnsmasq-dns-6f8ff78869-24szl" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.453125 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5886c70b-ea09-4a9e-9c31-1689d32735a5-ovsdbserver-nb\") pod \"dnsmasq-dns-6f8ff78869-24szl\" (UID: \"5886c70b-ea09-4a9e-9c31-1689d32735a5\") " pod="openstack/dnsmasq-dns-6f8ff78869-24szl" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.453868 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5886c70b-ea09-4a9e-9c31-1689d32735a5-dns-svc\") pod \"dnsmasq-dns-6f8ff78869-24szl\" (UID: \"5886c70b-ea09-4a9e-9c31-1689d32735a5\") " pod="openstack/dnsmasq-dns-6f8ff78869-24szl" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.454367 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58bdb65675-l5l8x" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.464621 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcqld\" (UniqueName: \"kubernetes.io/projected/5886c70b-ea09-4a9e-9c31-1689d32735a5-kube-api-access-hcqld\") pod \"dnsmasq-dns-6f8ff78869-24szl\" (UID: \"5886c70b-ea09-4a9e-9c31-1689d32735a5\") " pod="openstack/dnsmasq-dns-6f8ff78869-24szl" Feb 19 15:27:28 crc kubenswrapper[4810]: I0219 15:27:28.560944 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f8ff78869-24szl" Feb 19 15:27:29 crc kubenswrapper[4810]: I0219 15:27:29.704016 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 19 15:27:31 crc kubenswrapper[4810]: I0219 15:27:31.141645 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f8ff78869-24szl"] Feb 19 15:27:31 crc kubenswrapper[4810]: I0219 15:27:31.148139 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58bdb65675-l5l8x"] Feb 19 15:27:31 crc kubenswrapper[4810]: I0219 15:27:31.263644 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-tbt28"] Feb 19 15:27:31 crc kubenswrapper[4810]: W0219 15:27:31.264256 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5886c70b_ea09_4a9e_9c31_1689d32735a5.slice/crio-bb8c011e783721ee6fc9b0c30d19bca5636e6fa4d84780893c5d094407e82425 WatchSource:0}: Error finding container bb8c011e783721ee6fc9b0c30d19bca5636e6fa4d84780893c5d094407e82425: Status 404 returned error can't find the container with id bb8c011e783721ee6fc9b0c30d19bca5636e6fa4d84780893c5d094407e82425 Feb 19 15:27:31 crc kubenswrapper[4810]: I0219 15:27:31.510604 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdbdbc8cc-mddqt" event={"ID":"64a59633-cb6f-4631-a980-566894f0ce35","Type":"ContainerStarted","Data":"bcd810514d656b151586b085915c58b159ffd3f83f716b74f5c945866a1aa802"} Feb 19 15:27:31 crc kubenswrapper[4810]: I0219 15:27:31.510910 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fdbdbc8cc-mddqt" podUID="64a59633-cb6f-4631-a980-566894f0ce35" containerName="dnsmasq-dns" containerID="cri-o://bcd810514d656b151586b085915c58b159ffd3f83f716b74f5c945866a1aa802" gracePeriod=10 Feb 19 15:27:31 crc kubenswrapper[4810]: I0219 15:27:31.511174 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fdbdbc8cc-mddqt" Feb 19 15:27:31 crc kubenswrapper[4810]: I0219 15:27:31.520760 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8ff78869-24szl" event={"ID":"5886c70b-ea09-4a9e-9c31-1689d32735a5","Type":"ContainerStarted","Data":"bb8c011e783721ee6fc9b0c30d19bca5636e6fa4d84780893c5d094407e82425"} Feb 19 15:27:31 crc kubenswrapper[4810]: I0219 15:27:31.525648 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58bdb65675-l5l8x" event={"ID":"cef0e580-4d18-480e-a57d-9c1b31405cd8","Type":"ContainerStarted","Data":"91a11d1dc7d57506c1c71e77acb0aff16fbbb92753ce7889b5cb70d83d2c4e29"} Feb 19 15:27:31 crc kubenswrapper[4810]: I0219 15:27:31.529578 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-tbt28" event={"ID":"c03aad2b-8ca1-4310-8c11-3287fafcd66f","Type":"ContainerStarted","Data":"398ce092968fcf75e595ec5e07bca0c0bfb7f0b37aac1aed5f14c676bdbdecfb"} Feb 19 15:27:31 crc kubenswrapper[4810]: I0219 15:27:31.650351 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fdbdbc8cc-mddqt" podStartSLOduration=26.66434251 podStartE2EDuration="27.650317654s" podCreationTimestamp="2026-02-19 15:27:04 +0000 UTC" firstStartedPulling="2026-02-19 15:27:20.802297564 +0000 UTC m=+1070.284327718" lastFinishedPulling="2026-02-19 15:27:21.788272738 +0000 UTC m=+1071.270302862" 
observedRunningTime="2026-02-19 15:27:31.647852104 +0000 UTC m=+1081.129882238" watchObservedRunningTime="2026-02-19 15:27:31.650317654 +0000 UTC m=+1081.132347768" Feb 19 15:27:31 crc kubenswrapper[4810]: I0219 15:27:31.929986 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58bdb65675-l5l8x"] Feb 19 15:27:31 crc kubenswrapper[4810]: I0219 15:27:31.966380 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d95ff5b97-flxcw"] Feb 19 15:27:31 crc kubenswrapper[4810]: I0219 15:27:31.968119 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" Feb 19 15:27:31 crc kubenswrapper[4810]: I0219 15:27:31.994576 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d95ff5b97-flxcw"] Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.013467 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d7lt\" (UniqueName: \"kubernetes.io/projected/788aae13-b274-4965-ac0c-8ac075c32567-kube-api-access-9d7lt\") pod \"dnsmasq-dns-7d95ff5b97-flxcw\" (UID: \"788aae13-b274-4965-ac0c-8ac075c32567\") " pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.013554 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/788aae13-b274-4965-ac0c-8ac075c32567-dns-svc\") pod \"dnsmasq-dns-7d95ff5b97-flxcw\" (UID: \"788aae13-b274-4965-ac0c-8ac075c32567\") " pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.013604 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/788aae13-b274-4965-ac0c-8ac075c32567-ovsdbserver-sb\") pod \"dnsmasq-dns-7d95ff5b97-flxcw\" (UID: \"788aae13-b274-4965-ac0c-8ac075c32567\") " pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.013656 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/788aae13-b274-4965-ac0c-8ac075c32567-ovsdbserver-nb\") pod \"dnsmasq-dns-7d95ff5b97-flxcw\" (UID: \"788aae13-b274-4965-ac0c-8ac075c32567\") " pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.013719 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/788aae13-b274-4965-ac0c-8ac075c32567-config\") pod \"dnsmasq-dns-7d95ff5b97-flxcw\" (UID: \"788aae13-b274-4965-ac0c-8ac075c32567\") " pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.116124 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d7lt\" (UniqueName: \"kubernetes.io/projected/788aae13-b274-4965-ac0c-8ac075c32567-kube-api-access-9d7lt\") pod \"dnsmasq-dns-7d95ff5b97-flxcw\" (UID: \"788aae13-b274-4965-ac0c-8ac075c32567\") " pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.116200 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/788aae13-b274-4965-ac0c-8ac075c32567-dns-svc\") pod \"dnsmasq-dns-7d95ff5b97-flxcw\" (UID: 
\"788aae13-b274-4965-ac0c-8ac075c32567\") " pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.116245 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/788aae13-b274-4965-ac0c-8ac075c32567-ovsdbserver-sb\") pod \"dnsmasq-dns-7d95ff5b97-flxcw\" (UID: \"788aae13-b274-4965-ac0c-8ac075c32567\") " pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.116292 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/788aae13-b274-4965-ac0c-8ac075c32567-ovsdbserver-nb\") pod \"dnsmasq-dns-7d95ff5b97-flxcw\" (UID: \"788aae13-b274-4965-ac0c-8ac075c32567\") " pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.116362 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/788aae13-b274-4965-ac0c-8ac075c32567-config\") pod \"dnsmasq-dns-7d95ff5b97-flxcw\" (UID: \"788aae13-b274-4965-ac0c-8ac075c32567\") " pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.117398 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/788aae13-b274-4965-ac0c-8ac075c32567-dns-svc\") pod \"dnsmasq-dns-7d95ff5b97-flxcw\" (UID: \"788aae13-b274-4965-ac0c-8ac075c32567\") " pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.118136 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/788aae13-b274-4965-ac0c-8ac075c32567-config\") pod \"dnsmasq-dns-7d95ff5b97-flxcw\" (UID: \"788aae13-b274-4965-ac0c-8ac075c32567\") " pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.118755 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/788aae13-b274-4965-ac0c-8ac075c32567-ovsdbserver-sb\") pod \"dnsmasq-dns-7d95ff5b97-flxcw\" (UID: \"788aae13-b274-4965-ac0c-8ac075c32567\") " pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.118852 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/788aae13-b274-4965-ac0c-8ac075c32567-ovsdbserver-nb\") pod \"dnsmasq-dns-7d95ff5b97-flxcw\" (UID: \"788aae13-b274-4965-ac0c-8ac075c32567\") " pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.189162 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d7lt\" (UniqueName: \"kubernetes.io/projected/788aae13-b274-4965-ac0c-8ac075c32567-kube-api-access-9d7lt\") pod \"dnsmasq-dns-7d95ff5b97-flxcw\" (UID: \"788aae13-b274-4965-ac0c-8ac075c32567\") " pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.304979 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.558133 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"bdffb5e6-13bb-4c08-ad3c-52d8ded85431","Type":"ContainerStarted","Data":"9c08e680a1b3ab0d4a9d3191480c060f2f4e5575377e753fd3d6cfb4cd55a63e"} Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.560155 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c0ffb8ce-a356-4416-b96c-49db30ff1947","Type":"ContainerStarted","Data":"bb49108e1faa12136cbb1db0e6f8a1e4d5a337b3554af6b9133dd5428aa7a353"} Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.572239 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"30d11a24-9722-4e7a-9be5-f2bd00128167","Type":"ContainerStarted","Data":"1591fb62b46d0cb1a40e18641c32aabc2ceca62bd58d6c739d0fd39dbad2c5c9"} Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.577414 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-c99647bb5-xkrgb" podUID="eec01803-2cfd-4e97-a1c0-216c2622e913" containerName="dnsmasq-dns" containerID="cri-o://7dff482418537b07f37dc2275d46ba72695f3b6461cc224e2cd000dd183e1d25" gracePeriod=10 Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.577631 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c99647bb5-xkrgb" event={"ID":"eec01803-2cfd-4e97-a1c0-216c2622e913","Type":"ContainerStarted","Data":"7dff482418537b07f37dc2275d46ba72695f3b6461cc224e2cd000dd183e1d25"} Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.577667 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c99647bb5-xkrgb" Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.594922 4810 generic.go:334] "Generic (PLEG): container finished" podID="64a59633-cb6f-4631-a980-566894f0ce35" containerID="bcd810514d656b151586b085915c58b159ffd3f83f716b74f5c945866a1aa802" exitCode=0 Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.594982 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdbdbc8cc-mddqt" event={"ID":"64a59633-cb6f-4631-a980-566894f0ce35","Type":"ContainerDied","Data":"bcd810514d656b151586b085915c58b159ffd3f83f716b74f5c945866a1aa802"} Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.595005 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdbdbc8cc-mddqt" event={"ID":"64a59633-cb6f-4631-a980-566894f0ce35","Type":"ContainerDied","Data":"97bfa68e77f4ab8ae2a59549a783cdf08404be22506f85bef749cb5bf6fbe5cf"} Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.595014 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97bfa68e77f4ab8ae2a59549a783cdf08404be22506f85bef749cb5bf6fbe5cf" Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.614666 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"5b985124-01b7-430c-b5ea-b9fd095e5f5e","Type":"ContainerStarted","Data":"0f36e0c185a145dde7c4e8351f98756404400d361d9734a38f2e7fead03e8f24"} Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.643197 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c99647bb5-xkrgb" podStartSLOduration=28.476877247 podStartE2EDuration="28.643181897s" podCreationTimestamp="2026-02-19 15:27:04 +0000 UTC" 
firstStartedPulling="2026-02-19 15:27:21.659834665 +0000 UTC m=+1071.141864789" lastFinishedPulling="2026-02-19 15:27:21.826139325 +0000 UTC m=+1071.308169439" observedRunningTime="2026-02-19 15:27:32.604113241 +0000 UTC m=+1082.086143375" watchObservedRunningTime="2026-02-19 15:27:32.643181897 +0000 UTC m=+1082.125212021" Feb 19 15:27:32 crc kubenswrapper[4810]: I0219 15:27:32.925734 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d95ff5b97-flxcw"] Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.064424 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.082621 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.088372 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.088747 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.090778 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.101487 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-5lxcx" Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.111723 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.242881 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/37e2af25-5b30-4fb9-801e-f4a84d665540-cache\") pod \"swift-storage-0\" (UID: \"37e2af25-5b30-4fb9-801e-f4a84d665540\") " pod="openstack/swift-storage-0" Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.242943 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e2af25-5b30-4fb9-801e-f4a84d665540-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"37e2af25-5b30-4fb9-801e-f4a84d665540\") " pod="openstack/swift-storage-0" Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.243130 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/37e2af25-5b30-4fb9-801e-f4a84d665540-etc-swift\") pod \"swift-storage-0\" (UID: \"37e2af25-5b30-4fb9-801e-f4a84d665540\") " pod="openstack/swift-storage-0" Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.243307 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kjdb\" (UniqueName: \"kubernetes.io/projected/37e2af25-5b30-4fb9-801e-f4a84d665540-kube-api-access-7kjdb\") pod \"swift-storage-0\" (UID: \"37e2af25-5b30-4fb9-801e-f4a84d665540\") " pod="openstack/swift-storage-0" Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.243376 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"37e2af25-5b30-4fb9-801e-f4a84d665540\") " pod="openstack/swift-storage-0" Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 
15:27:33.243425 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/37e2af25-5b30-4fb9-801e-f4a84d665540-lock\") pod \"swift-storage-0\" (UID: \"37e2af25-5b30-4fb9-801e-f4a84d665540\") " pod="openstack/swift-storage-0" Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.345250 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kjdb\" (UniqueName: \"kubernetes.io/projected/37e2af25-5b30-4fb9-801e-f4a84d665540-kube-api-access-7kjdb\") pod \"swift-storage-0\" (UID: \"37e2af25-5b30-4fb9-801e-f4a84d665540\") " pod="openstack/swift-storage-0" Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.345305 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"37e2af25-5b30-4fb9-801e-f4a84d665540\") " pod="openstack/swift-storage-0" Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.345345 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/37e2af25-5b30-4fb9-801e-f4a84d665540-lock\") pod \"swift-storage-0\" (UID: \"37e2af25-5b30-4fb9-801e-f4a84d665540\") " pod="openstack/swift-storage-0" Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.345365 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/37e2af25-5b30-4fb9-801e-f4a84d665540-cache\") pod \"swift-storage-0\" (UID: \"37e2af25-5b30-4fb9-801e-f4a84d665540\") " pod="openstack/swift-storage-0" Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.345397 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e2af25-5b30-4fb9-801e-f4a84d665540-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"37e2af25-5b30-4fb9-801e-f4a84d665540\") " pod="openstack/swift-storage-0" Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.345441 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/37e2af25-5b30-4fb9-801e-f4a84d665540-etc-swift\") pod \"swift-storage-0\" (UID: \"37e2af25-5b30-4fb9-801e-f4a84d665540\") " pod="openstack/swift-storage-0" Feb 19 15:27:33 crc kubenswrapper[4810]: E0219 15:27:33.345576 4810 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 15:27:33 crc kubenswrapper[4810]: E0219 15:27:33.345590 4810 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 15:27:33 crc kubenswrapper[4810]: E0219 15:27:33.345631 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37e2af25-5b30-4fb9-801e-f4a84d665540-etc-swift podName:37e2af25-5b30-4fb9-801e-f4a84d665540 nodeName:}" failed. No retries permitted until 2026-02-19 15:27:33.845615861 +0000 UTC m=+1083.327645985 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/37e2af25-5b30-4fb9-801e-f4a84d665540-etc-swift") pod "swift-storage-0" (UID: "37e2af25-5b30-4fb9-801e-f4a84d665540") : configmap "swift-ring-files" not found Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.346135 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"37e2af25-5b30-4fb9-801e-f4a84d665540\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0" Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.347149 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/37e2af25-5b30-4fb9-801e-f4a84d665540-cache\") pod \"swift-storage-0\" (UID: \"37e2af25-5b30-4fb9-801e-f4a84d665540\") " pod="openstack/swift-storage-0" Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.347471 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/37e2af25-5b30-4fb9-801e-f4a84d665540-lock\") pod \"swift-storage-0\" (UID: \"37e2af25-5b30-4fb9-801e-f4a84d665540\") " pod="openstack/swift-storage-0" Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.362525 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kjdb\" (UniqueName: \"kubernetes.io/projected/37e2af25-5b30-4fb9-801e-f4a84d665540-kube-api-access-7kjdb\") pod \"swift-storage-0\" (UID: \"37e2af25-5b30-4fb9-801e-f4a84d665540\") " pod="openstack/swift-storage-0" Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.369690 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"37e2af25-5b30-4fb9-801e-f4a84d665540\") " pod="openstack/swift-storage-0" Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.390432 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e2af25-5b30-4fb9-801e-f4a84d665540-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"37e2af25-5b30-4fb9-801e-f4a84d665540\") " pod="openstack/swift-storage-0" Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.625289 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5t6ds" event={"ID":"542da555-4f39-4dff-b378-5306135244db","Type":"ContainerStarted","Data":"937bbc860401565ad8af8a788c4abb62132c4199e86ece180aec57a1dc21c966"} Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.627443 4810 generic.go:334] "Generic (PLEG): container finished" podID="5886c70b-ea09-4a9e-9c31-1689d32735a5" containerID="c07f2dcf7043fcd524ebc329815b7a17ca72dbefb72ca802efe8c1da736fd8ad" exitCode=0 Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.627496 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8ff78869-24szl" event={"ID":"5886c70b-ea09-4a9e-9c31-1689d32735a5","Type":"ContainerDied","Data":"c07f2dcf7043fcd524ebc329815b7a17ca72dbefb72ca802efe8c1da736fd8ad"} Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.631675 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s5488" event={"ID":"4a4fa57b-aa00-4866-b31e-df29f7f86480","Type":"ContainerStarted","Data":"5763f40a55ec61c17824f9bf1cef4535f925e85cabd705fc9224e22229d2a90f"} Feb 19 
15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.631827 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-s5488" Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.633732 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"00bcfb03-4357-4343-99a5-30dc7f25abe9","Type":"ContainerStarted","Data":"5f65c0deba7b3077c5501137f00e319288d66ec1245a0e431539e6d1d5d3d67c"} Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.637693 4810 generic.go:334] "Generic (PLEG): container finished" podID="cef0e580-4d18-480e-a57d-9c1b31405cd8" containerID="e98ee022771e694a97d33efc6f076045128cb6c9031e6599d9309671e11fc1de" exitCode=0 Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.637920 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58bdb65675-l5l8x" event={"ID":"cef0e580-4d18-480e-a57d-9c1b31405cd8","Type":"ContainerDied","Data":"e98ee022771e694a97d33efc6f076045128cb6c9031e6599d9309671e11fc1de"} Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.641064 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2a3676ed-f06f-4dea-82a1-959716331113","Type":"ContainerStarted","Data":"d8e4585cb9bc557ddc8d8dc697ba266361d794bec388e11f6081c9ec077a1726"} Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.643936 4810 generic.go:334] "Generic (PLEG): container finished" podID="eec01803-2cfd-4e97-a1c0-216c2622e913" containerID="7dff482418537b07f37dc2275d46ba72695f3b6461cc224e2cd000dd183e1d25" exitCode=0 Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.643998 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c99647bb5-xkrgb" event={"ID":"eec01803-2cfd-4e97-a1c0-216c2622e913","Type":"ContainerDied","Data":"7dff482418537b07f37dc2275d46ba72695f3b6461cc224e2cd000dd183e1d25"} Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.645870 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c","Type":"ContainerStarted","Data":"781b07acf23d18cc10631b3a01aa0eb27d1e62e7e3cfc8db109ba3d58b915ff0"} Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.652185 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"097bc4d1-5648-4607-9c49-286e4bbbe553","Type":"ContainerStarted","Data":"35654a2e76fc1f65be05a171e2aeec58c5e73e3b78c5850da9115db247aae94f"} Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.652511 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.754705 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-s5488" podStartSLOduration=11.269515073 podStartE2EDuration="19.754686715s" podCreationTimestamp="2026-02-19 15:27:14 +0000 UTC" firstStartedPulling="2026-02-19 15:27:22.086856806 +0000 UTC m=+1071.568886930" lastFinishedPulling="2026-02-19 15:27:30.572028448 +0000 UTC m=+1080.054058572" observedRunningTime="2026-02-19 15:27:33.744435344 +0000 UTC m=+1083.226465468" watchObservedRunningTime="2026-02-19 15:27:33.754686715 +0000 UTC m=+1083.236716849" Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.828718 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=12.286684607 
podStartE2EDuration="22.828698907s" podCreationTimestamp="2026-02-19 15:27:11 +0000 UTC" firstStartedPulling="2026-02-19 15:27:20.81520985 +0000 UTC m=+1070.297239974" lastFinishedPulling="2026-02-19 15:27:31.35722415 +0000 UTC m=+1080.839254274" observedRunningTime="2026-02-19 15:27:33.827032006 +0000 UTC m=+1083.309062130" watchObservedRunningTime="2026-02-19 15:27:33.828698907 +0000 UTC m=+1083.310729031" Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.855526 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/37e2af25-5b30-4fb9-801e-f4a84d665540-etc-swift\") pod \"swift-storage-0\" (UID: \"37e2af25-5b30-4fb9-801e-f4a84d665540\") " pod="openstack/swift-storage-0" Feb 19 15:27:33 crc kubenswrapper[4810]: E0219 15:27:33.855738 4810 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 15:27:33 crc kubenswrapper[4810]: E0219 15:27:33.855765 4810 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 15:27:33 crc kubenswrapper[4810]: E0219 15:27:33.855817 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37e2af25-5b30-4fb9-801e-f4a84d665540-etc-swift podName:37e2af25-5b30-4fb9-801e-f4a84d665540 nodeName:}" failed. No retries permitted until 2026-02-19 15:27:34.85579849 +0000 UTC m=+1084.337828614 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/37e2af25-5b30-4fb9-801e-f4a84d665540-etc-swift") pod "swift-storage-0" (UID: "37e2af25-5b30-4fb9-801e-f4a84d665540") : configmap "swift-ring-files" not found Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.869086 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fdbdbc8cc-mddqt" Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.956758 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64a59633-cb6f-4631-a980-566894f0ce35-dns-svc\") pod \"64a59633-cb6f-4631-a980-566894f0ce35\" (UID: \"64a59633-cb6f-4631-a980-566894f0ce35\") " Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.956843 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64a59633-cb6f-4631-a980-566894f0ce35-config\") pod \"64a59633-cb6f-4631-a980-566894f0ce35\" (UID: \"64a59633-cb6f-4631-a980-566894f0ce35\") " Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.956953 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dbn6\" (UniqueName: \"kubernetes.io/projected/64a59633-cb6f-4631-a980-566894f0ce35-kube-api-access-5dbn6\") pod \"64a59633-cb6f-4631-a980-566894f0ce35\" (UID: \"64a59633-cb6f-4631-a980-566894f0ce35\") " Feb 19 15:27:33 crc kubenswrapper[4810]: I0219 15:27:33.976905 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64a59633-cb6f-4631-a980-566894f0ce35-kube-api-access-5dbn6" (OuterVolumeSpecName: "kube-api-access-5dbn6") pod "64a59633-cb6f-4631-a980-566894f0ce35" (UID: "64a59633-cb6f-4631-a980-566894f0ce35"). InnerVolumeSpecName "kube-api-access-5dbn6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:27:34 crc kubenswrapper[4810]: I0219 15:27:34.023655 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64a59633-cb6f-4631-a980-566894f0ce35-config" (OuterVolumeSpecName: "config") pod "64a59633-cb6f-4631-a980-566894f0ce35" (UID: "64a59633-cb6f-4631-a980-566894f0ce35"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:27:34 crc kubenswrapper[4810]: I0219 15:27:34.029647 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64a59633-cb6f-4631-a980-566894f0ce35-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "64a59633-cb6f-4631-a980-566894f0ce35" (UID: "64a59633-cb6f-4631-a980-566894f0ce35"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:27:34 crc kubenswrapper[4810]: I0219 15:27:34.059364 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dbn6\" (UniqueName: \"kubernetes.io/projected/64a59633-cb6f-4631-a980-566894f0ce35-kube-api-access-5dbn6\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:34 crc kubenswrapper[4810]: I0219 15:27:34.059398 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64a59633-cb6f-4631-a980-566894f0ce35-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:34 crc kubenswrapper[4810]: I0219 15:27:34.059412 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64a59633-cb6f-4631-a980-566894f0ce35-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:34 crc kubenswrapper[4810]: I0219 15:27:34.660682 4810 generic.go:334] "Generic (PLEG): container finished" podID="542da555-4f39-4dff-b378-5306135244db" containerID="937bbc860401565ad8af8a788c4abb62132c4199e86ece180aec57a1dc21c966" exitCode=0 Feb 19 15:27:34 crc kubenswrapper[4810]: I0219 15:27:34.660798 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5t6ds" event={"ID":"542da555-4f39-4dff-b378-5306135244db","Type":"ContainerDied","Data":"937bbc860401565ad8af8a788c4abb62132c4199e86ece180aec57a1dc21c966"} Feb 19 15:27:34 crc kubenswrapper[4810]: I0219 15:27:34.660945 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fdbdbc8cc-mddqt" Feb 19 15:27:34 crc kubenswrapper[4810]: I0219 15:27:34.700767 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fdbdbc8cc-mddqt"] Feb 19 15:27:34 crc kubenswrapper[4810]: I0219 15:27:34.706065 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fdbdbc8cc-mddqt"] Feb 19 15:27:34 crc kubenswrapper[4810]: I0219 15:27:34.874677 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/37e2af25-5b30-4fb9-801e-f4a84d665540-etc-swift\") pod \"swift-storage-0\" (UID: \"37e2af25-5b30-4fb9-801e-f4a84d665540\") " pod="openstack/swift-storage-0" Feb 19 15:27:34 crc kubenswrapper[4810]: E0219 15:27:34.874946 4810 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 15:27:34 crc kubenswrapper[4810]: E0219 15:27:34.874989 4810 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 15:27:34 crc kubenswrapper[4810]: E0219 15:27:34.875067 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37e2af25-5b30-4fb9-801e-f4a84d665540-etc-swift podName:37e2af25-5b30-4fb9-801e-f4a84d665540 nodeName:}" failed. No retries permitted until 2026-02-19 15:27:36.875043079 +0000 UTC m=+1086.357073213 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/37e2af25-5b30-4fb9-801e-f4a84d665540-etc-swift") pod "swift-storage-0" (UID: "37e2af25-5b30-4fb9-801e-f4a84d665540") : configmap "swift-ring-files" not found Feb 19 15:27:35 crc kubenswrapper[4810]: I0219 15:27:35.476490 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64a59633-cb6f-4631-a980-566894f0ce35" path="/var/lib/kubelet/pods/64a59633-cb6f-4631-a980-566894f0ce35/volumes" Feb 19 15:27:35 crc kubenswrapper[4810]: I0219 15:27:35.672913 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a220bc57-3f31-4851-ad5c-9f61359f7de5","Type":"ContainerStarted","Data":"024e6cbe9539fc73096ba26de873a5cd3591fc2cd12d0be8dd23110aaa2f3ec6"} Feb 19 15:27:35 crc kubenswrapper[4810]: W0219 15:27:35.743948 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod788aae13_b274_4965_ac0c_8ac075c32567.slice/crio-7cd4fcbb128b34896f354e231a1de0a145b80016e5c0592bfefbc4361bfadbaf WatchSource:0}: Error finding container 7cd4fcbb128b34896f354e231a1de0a145b80016e5c0592bfefbc4361bfadbaf: Status 404 returned error can't find the container with id 7cd4fcbb128b34896f354e231a1de0a145b80016e5c0592bfefbc4361bfadbaf Feb 19 15:27:35 crc kubenswrapper[4810]: I0219 15:27:35.975660 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58bdb65675-l5l8x" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.111481 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cef0e580-4d18-480e-a57d-9c1b31405cd8-ovsdbserver-nb\") pod \"cef0e580-4d18-480e-a57d-9c1b31405cd8\" (UID: \"cef0e580-4d18-480e-a57d-9c1b31405cd8\") " Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.111768 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w28xm\" (UniqueName: \"kubernetes.io/projected/cef0e580-4d18-480e-a57d-9c1b31405cd8-kube-api-access-w28xm\") pod \"cef0e580-4d18-480e-a57d-9c1b31405cd8\" (UID: \"cef0e580-4d18-480e-a57d-9c1b31405cd8\") " Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.111793 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cef0e580-4d18-480e-a57d-9c1b31405cd8-dns-svc\") pod \"cef0e580-4d18-480e-a57d-9c1b31405cd8\" (UID: \"cef0e580-4d18-480e-a57d-9c1b31405cd8\") " Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.111821 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cef0e580-4d18-480e-a57d-9c1b31405cd8-config\") pod \"cef0e580-4d18-480e-a57d-9c1b31405cd8\" (UID: \"cef0e580-4d18-480e-a57d-9c1b31405cd8\") " Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.136185 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cef0e580-4d18-480e-a57d-9c1b31405cd8-kube-api-access-w28xm" (OuterVolumeSpecName: "kube-api-access-w28xm") pod "cef0e580-4d18-480e-a57d-9c1b31405cd8" (UID: "cef0e580-4d18-480e-a57d-9c1b31405cd8"). InnerVolumeSpecName "kube-api-access-w28xm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.167076 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cef0e580-4d18-480e-a57d-9c1b31405cd8-config" (OuterVolumeSpecName: "config") pod "cef0e580-4d18-480e-a57d-9c1b31405cd8" (UID: "cef0e580-4d18-480e-a57d-9c1b31405cd8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.168265 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c99647bb5-xkrgb" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.213702 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w28xm\" (UniqueName: \"kubernetes.io/projected/cef0e580-4d18-480e-a57d-9c1b31405cd8-kube-api-access-w28xm\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.213735 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cef0e580-4d18-480e-a57d-9c1b31405cd8-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.237113 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cef0e580-4d18-480e-a57d-9c1b31405cd8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cef0e580-4d18-480e-a57d-9c1b31405cd8" (UID: "cef0e580-4d18-480e-a57d-9c1b31405cd8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.238247 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cef0e580-4d18-480e-a57d-9c1b31405cd8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cef0e580-4d18-480e-a57d-9c1b31405cd8" (UID: "cef0e580-4d18-480e-a57d-9c1b31405cd8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.314612 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx72g\" (UniqueName: \"kubernetes.io/projected/eec01803-2cfd-4e97-a1c0-216c2622e913-kube-api-access-xx72g\") pod \"eec01803-2cfd-4e97-a1c0-216c2622e913\" (UID: \"eec01803-2cfd-4e97-a1c0-216c2622e913\") " Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.314740 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eec01803-2cfd-4e97-a1c0-216c2622e913-dns-svc\") pod \"eec01803-2cfd-4e97-a1c0-216c2622e913\" (UID: \"eec01803-2cfd-4e97-a1c0-216c2622e913\") " Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.314815 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eec01803-2cfd-4e97-a1c0-216c2622e913-config\") pod \"eec01803-2cfd-4e97-a1c0-216c2622e913\" (UID: \"eec01803-2cfd-4e97-a1c0-216c2622e913\") " Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.315265 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cef0e580-4d18-480e-a57d-9c1b31405cd8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.315283 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cef0e580-4d18-480e-a57d-9c1b31405cd8-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.317989 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eec01803-2cfd-4e97-a1c0-216c2622e913-kube-api-access-xx72g" (OuterVolumeSpecName: "kube-api-access-xx72g") pod "eec01803-2cfd-4e97-a1c0-216c2622e913" (UID: "eec01803-2cfd-4e97-a1c0-216c2622e913"). InnerVolumeSpecName "kube-api-access-xx72g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.350984 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eec01803-2cfd-4e97-a1c0-216c2622e913-config" (OuterVolumeSpecName: "config") pod "eec01803-2cfd-4e97-a1c0-216c2622e913" (UID: "eec01803-2cfd-4e97-a1c0-216c2622e913"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.355873 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eec01803-2cfd-4e97-a1c0-216c2622e913-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eec01803-2cfd-4e97-a1c0-216c2622e913" (UID: "eec01803-2cfd-4e97-a1c0-216c2622e913"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.416478 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx72g\" (UniqueName: \"kubernetes.io/projected/eec01803-2cfd-4e97-a1c0-216c2622e913-kube-api-access-xx72g\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.416510 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eec01803-2cfd-4e97-a1c0-216c2622e913-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.416519 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eec01803-2cfd-4e97-a1c0-216c2622e913-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.681673 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58bdb65675-l5l8x" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.681672 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58bdb65675-l5l8x" event={"ID":"cef0e580-4d18-480e-a57d-9c1b31405cd8","Type":"ContainerDied","Data":"91a11d1dc7d57506c1c71e77acb0aff16fbbb92753ce7889b5cb70d83d2c4e29"} Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.682176 4810 scope.go:117] "RemoveContainer" containerID="e98ee022771e694a97d33efc6f076045128cb6c9031e6599d9309671e11fc1de" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.683798 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8ff78869-24szl" event={"ID":"5886c70b-ea09-4a9e-9c31-1689d32735a5","Type":"ContainerStarted","Data":"19026669f4751c718dc10aae0046e726cc204eeb831deb5f5b82ecfb1a0aaee7"} Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.684455 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f8ff78869-24szl" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.687477 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-tbt28" event={"ID":"c03aad2b-8ca1-4310-8c11-3287fafcd66f","Type":"ContainerStarted","Data":"b33e056e143dced9b19d9021affc6bb61a726b19dfe75b0845c5af53ec92139d"} Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.688941 4810 generic.go:334] "Generic (PLEG): container finished" podID="788aae13-b274-4965-ac0c-8ac075c32567" containerID="28ca2a964c46491b21497e9a884496f03f8a4889da795bda4b42a303c67c82ef" exitCode=0 Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.689361 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" event={"ID":"788aae13-b274-4965-ac0c-8ac075c32567","Type":"ContainerDied","Data":"28ca2a964c46491b21497e9a884496f03f8a4889da795bda4b42a303c67c82ef"} Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.689395 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" event={"ID":"788aae13-b274-4965-ac0c-8ac075c32567","Type":"ContainerStarted","Data":"7cd4fcbb128b34896f354e231a1de0a145b80016e5c0592bfefbc4361bfadbaf"} Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.692233 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5t6ds" event={"ID":"542da555-4f39-4dff-b378-5306135244db","Type":"ContainerStarted","Data":"355c2244899b75d0a7bc914f26eb40d584199f86ae7a4abd29d74085bc3b9c7e"} Feb 19 15:27:36 crc 
kubenswrapper[4810]: I0219 15:27:36.692276 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5t6ds" event={"ID":"542da555-4f39-4dff-b378-5306135244db","Type":"ContainerStarted","Data":"72c55a37bf97149620687bdab13c5b4f11850a60206da90ebbca63c3bf960c47"} Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.692490 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-5t6ds" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.692521 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-5t6ds" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.694628 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c99647bb5-xkrgb" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.694761 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c99647bb5-xkrgb" event={"ID":"eec01803-2cfd-4e97-a1c0-216c2622e913","Type":"ContainerDied","Data":"d66ef65309b98fdfd5a68e7c80449beb1e0f777f1d9cb1877aa1054da73700ab"} Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.697027 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"5b985124-01b7-430c-b5ea-b9fd095e5f5e","Type":"ContainerStarted","Data":"f6b673831621afe1ea35404595fcddba7c186e5deeb3bc51c84aea830e676504"} Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.704433 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"bdffb5e6-13bb-4c08-ad3c-52d8ded85431","Type":"ContainerStarted","Data":"81218b34ce9b45a147e3a1e08b2d64be46ede41bc685571b49c6521a842f04cb"} Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.706890 4810 scope.go:117] "RemoveContainer" containerID="7dff482418537b07f37dc2275d46ba72695f3b6461cc224e2cd000dd183e1d25" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.709361 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f8ff78869-24szl" podStartSLOduration=8.709345489 podStartE2EDuration="8.709345489s" podCreationTimestamp="2026-02-19 15:27:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:27:36.706271004 +0000 UTC m=+1086.188301148" watchObservedRunningTime="2026-02-19 15:27:36.709345489 +0000 UTC m=+1086.191375613" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.760510 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-5t6ds" podStartSLOduration=15.003856223 podStartE2EDuration="22.760490301s" podCreationTimestamp="2026-02-19 15:27:14 +0000 UTC" firstStartedPulling="2026-02-19 15:27:22.787892817 +0000 UTC m=+1072.269922941" lastFinishedPulling="2026-02-19 15:27:30.544526885 +0000 UTC m=+1080.026557019" observedRunningTime="2026-02-19 15:27:36.757392005 +0000 UTC m=+1086.239422139" watchObservedRunningTime="2026-02-19 15:27:36.760490301 +0000 UTC m=+1086.242520425" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.781178 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=9.750807656 podStartE2EDuration="22.781160587s" podCreationTimestamp="2026-02-19 15:27:14 +0000 UTC" firstStartedPulling="2026-02-19 15:27:22.879833467 +0000 UTC m=+1072.361863591" lastFinishedPulling="2026-02-19 15:27:35.910186388 +0000 UTC 
m=+1085.392216522" observedRunningTime="2026-02-19 15:27:36.778350218 +0000 UTC m=+1086.260380342" watchObservedRunningTime="2026-02-19 15:27:36.781160587 +0000 UTC m=+1086.263190711" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.796510 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.797239 4810 scope.go:117] "RemoveContainer" containerID="ed90cdf99f920236e6f409e68476a3cd86bdc892ec839211cda663eab550964c" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.828888 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=5.784395041 podStartE2EDuration="19.828866775s" podCreationTimestamp="2026-02-19 15:27:17 +0000 UTC" firstStartedPulling="2026-02-19 15:27:21.873650908 +0000 UTC m=+1071.355681032" lastFinishedPulling="2026-02-19 15:27:35.918122622 +0000 UTC m=+1085.400152766" observedRunningTime="2026-02-19 15:27:36.801617178 +0000 UTC m=+1086.283647302" watchObservedRunningTime="2026-02-19 15:27:36.828866775 +0000 UTC m=+1086.310896899" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.848399 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58bdb65675-l5l8x"] Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.851049 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58bdb65675-l5l8x"] Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.872474 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.874576 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-tbt28" podStartSLOduration=5.3011395629999996 podStartE2EDuration="9.874565453s" podCreationTimestamp="2026-02-19 15:27:27 +0000 UTC" firstStartedPulling="2026-02-19 15:27:31.300186423 +0000 UTC m=+1080.782216547" lastFinishedPulling="2026-02-19 15:27:35.873612323 +0000 UTC m=+1085.355642437" observedRunningTime="2026-02-19 15:27:36.874499192 +0000 UTC m=+1086.356529316" watchObservedRunningTime="2026-02-19 15:27:36.874565453 +0000 UTC m=+1086.356595567" Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.926136 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/37e2af25-5b30-4fb9-801e-f4a84d665540-etc-swift\") pod \"swift-storage-0\" (UID: \"37e2af25-5b30-4fb9-801e-f4a84d665540\") " pod="openstack/swift-storage-0" Feb 19 15:27:36 crc kubenswrapper[4810]: E0219 15:27:36.926540 4810 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 15:27:36 crc kubenswrapper[4810]: E0219 15:27:36.926553 4810 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 15:27:36 crc kubenswrapper[4810]: E0219 15:27:36.926595 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37e2af25-5b30-4fb9-801e-f4a84d665540-etc-swift podName:37e2af25-5b30-4fb9-801e-f4a84d665540 nodeName:}" failed. No retries permitted until 2026-02-19 15:27:40.926581157 +0000 UTC m=+1090.408611281 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/37e2af25-5b30-4fb9-801e-f4a84d665540-etc-swift") pod "swift-storage-0" (UID: "37e2af25-5b30-4fb9-801e-f4a84d665540") : configmap "swift-ring-files" not found Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.936093 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c99647bb5-xkrgb"] Feb 19 15:27:36 crc kubenswrapper[4810]: I0219 15:27:36.947291 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c99647bb5-xkrgb"] Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.028595 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-wlh5p"] Feb 19 15:27:37 crc kubenswrapper[4810]: E0219 15:27:37.028968 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64a59633-cb6f-4631-a980-566894f0ce35" containerName="dnsmasq-dns" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.028984 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="64a59633-cb6f-4631-a980-566894f0ce35" containerName="dnsmasq-dns" Feb 19 15:27:37 crc kubenswrapper[4810]: E0219 15:27:37.029006 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64a59633-cb6f-4631-a980-566894f0ce35" containerName="init" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.029012 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="64a59633-cb6f-4631-a980-566894f0ce35" containerName="init" Feb 19 15:27:37 crc kubenswrapper[4810]: E0219 15:27:37.029022 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef0e580-4d18-480e-a57d-9c1b31405cd8" containerName="init" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.029029 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef0e580-4d18-480e-a57d-9c1b31405cd8" containerName="init" Feb 19 15:27:37 crc kubenswrapper[4810]: E0219 15:27:37.029038 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eec01803-2cfd-4e97-a1c0-216c2622e913" containerName="init" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.029044 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="eec01803-2cfd-4e97-a1c0-216c2622e913" containerName="init" Feb 19 15:27:37 crc kubenswrapper[4810]: E0219 15:27:37.029071 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eec01803-2cfd-4e97-a1c0-216c2622e913" containerName="dnsmasq-dns" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.029076 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="eec01803-2cfd-4e97-a1c0-216c2622e913" containerName="dnsmasq-dns" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.029245 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="eec01803-2cfd-4e97-a1c0-216c2622e913" containerName="dnsmasq-dns" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.029258 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="cef0e580-4d18-480e-a57d-9c1b31405cd8" containerName="init" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.029268 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="64a59633-cb6f-4631-a980-566894f0ce35" containerName="dnsmasq-dns" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.029975 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-wlh5p" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.032611 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.033355 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.034255 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.062775 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-wlh5p"] Feb 19 15:27:37 crc kubenswrapper[4810]: E0219 15:27:37.063306 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-cnh57 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-wlh5p" podUID="e49c1739-432b-445b-87d5-904af08961e4" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.072213 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-hrdll"] Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.073536 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-hrdll" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.084179 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-hrdll"] Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.098599 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-wlh5p"] Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.129402 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e49c1739-432b-445b-87d5-904af08961e4-combined-ca-bundle\") pod \"swift-ring-rebalance-wlh5p\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " pod="openstack/swift-ring-rebalance-wlh5p" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.129441 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e49c1739-432b-445b-87d5-904af08961e4-swiftconf\") pod \"swift-ring-rebalance-wlh5p\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " pod="openstack/swift-ring-rebalance-wlh5p" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.129460 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e49c1739-432b-445b-87d5-904af08961e4-scripts\") pod \"swift-ring-rebalance-wlh5p\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " pod="openstack/swift-ring-rebalance-wlh5p" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.129484 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnh57\" (UniqueName: \"kubernetes.io/projected/e49c1739-432b-445b-87d5-904af08961e4-kube-api-access-cnh57\") pod \"swift-ring-rebalance-wlh5p\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " pod="openstack/swift-ring-rebalance-wlh5p" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.129503 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e49c1739-432b-445b-87d5-904af08961e4-etc-swift\") pod \"swift-ring-rebalance-wlh5p\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " pod="openstack/swift-ring-rebalance-wlh5p" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.129546 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e49c1739-432b-445b-87d5-904af08961e4-ring-data-devices\") pod \"swift-ring-rebalance-wlh5p\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " pod="openstack/swift-ring-rebalance-wlh5p" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.129792 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e49c1739-432b-445b-87d5-904af08961e4-dispersionconf\") pod \"swift-ring-rebalance-wlh5p\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " pod="openstack/swift-ring-rebalance-wlh5p" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.231387 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e49c1739-432b-445b-87d5-904af08961e4-combined-ca-bundle\") pod \"swift-ring-rebalance-wlh5p\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " pod="openstack/swift-ring-rebalance-wlh5p" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.231437 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e49c1739-432b-445b-87d5-904af08961e4-swiftconf\") pod \"swift-ring-rebalance-wlh5p\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " pod="openstack/swift-ring-rebalance-wlh5p" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.231457 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e49c1739-432b-445b-87d5-904af08961e4-scripts\") pod \"swift-ring-rebalance-wlh5p\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " pod="openstack/swift-ring-rebalance-wlh5p" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.231484 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6c36f3e5-f790-4eda-9486-174f8624dad1-ring-data-devices\") pod \"swift-ring-rebalance-hrdll\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " pod="openstack/swift-ring-rebalance-hrdll" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.231506 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnh57\" (UniqueName: \"kubernetes.io/projected/e49c1739-432b-445b-87d5-904af08961e4-kube-api-access-cnh57\") pod \"swift-ring-rebalance-wlh5p\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " pod="openstack/swift-ring-rebalance-wlh5p" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.231562 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e49c1739-432b-445b-87d5-904af08961e4-etc-swift\") pod \"swift-ring-rebalance-wlh5p\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " pod="openstack/swift-ring-rebalance-wlh5p" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.231581 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6c36f3e5-f790-4eda-9486-174f8624dad1-swiftconf\") pod \"swift-ring-rebalance-hrdll\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " pod="openstack/swift-ring-rebalance-hrdll" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.231665 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e49c1739-432b-445b-87d5-904af08961e4-ring-data-devices\") pod \"swift-ring-rebalance-wlh5p\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " pod="openstack/swift-ring-rebalance-wlh5p" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.231729 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c36f3e5-f790-4eda-9486-174f8624dad1-scripts\") pod \"swift-ring-rebalance-hrdll\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " pod="openstack/swift-ring-rebalance-hrdll" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.231754 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6c36f3e5-f790-4eda-9486-174f8624dad1-dispersionconf\") pod \"swift-ring-rebalance-hrdll\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " pod="openstack/swift-ring-rebalance-hrdll" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.231837 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e49c1739-432b-445b-87d5-904af08961e4-dispersionconf\") pod \"swift-ring-rebalance-wlh5p\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " pod="openstack/swift-ring-rebalance-wlh5p" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.231919 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c36f3e5-f790-4eda-9486-174f8624dad1-combined-ca-bundle\") pod \"swift-ring-rebalance-hrdll\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " pod="openstack/swift-ring-rebalance-hrdll" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.231992 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fqq2\" (UniqueName: \"kubernetes.io/projected/6c36f3e5-f790-4eda-9486-174f8624dad1-kube-api-access-4fqq2\") pod \"swift-ring-rebalance-hrdll\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " pod="openstack/swift-ring-rebalance-hrdll" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.232023 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6c36f3e5-f790-4eda-9486-174f8624dad1-etc-swift\") pod \"swift-ring-rebalance-hrdll\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " pod="openstack/swift-ring-rebalance-hrdll" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.232279 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e49c1739-432b-445b-87d5-904af08961e4-scripts\") pod \"swift-ring-rebalance-wlh5p\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " pod="openstack/swift-ring-rebalance-wlh5p" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.232700 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e49c1739-432b-445b-87d5-904af08961e4-etc-swift\") pod \"swift-ring-rebalance-wlh5p\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " pod="openstack/swift-ring-rebalance-wlh5p" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.232912 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e49c1739-432b-445b-87d5-904af08961e4-ring-data-devices\") pod \"swift-ring-rebalance-wlh5p\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " pod="openstack/swift-ring-rebalance-wlh5p" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.237505 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e49c1739-432b-445b-87d5-904af08961e4-dispersionconf\") pod \"swift-ring-rebalance-wlh5p\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " pod="openstack/swift-ring-rebalance-wlh5p" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.240151 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e49c1739-432b-445b-87d5-904af08961e4-combined-ca-bundle\") pod \"swift-ring-rebalance-wlh5p\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " pod="openstack/swift-ring-rebalance-wlh5p" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.250578 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnh57\" (UniqueName: \"kubernetes.io/projected/e49c1739-432b-445b-87d5-904af08961e4-kube-api-access-cnh57\") pod \"swift-ring-rebalance-wlh5p\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " pod="openstack/swift-ring-rebalance-wlh5p" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.251144 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e49c1739-432b-445b-87d5-904af08961e4-swiftconf\") pod \"swift-ring-rebalance-wlh5p\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " pod="openstack/swift-ring-rebalance-wlh5p" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.333422 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c36f3e5-f790-4eda-9486-174f8624dad1-scripts\") pod \"swift-ring-rebalance-hrdll\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " pod="openstack/swift-ring-rebalance-hrdll" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.333473 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6c36f3e5-f790-4eda-9486-174f8624dad1-dispersionconf\") pod \"swift-ring-rebalance-hrdll\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " pod="openstack/swift-ring-rebalance-hrdll" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.333569 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c36f3e5-f790-4eda-9486-174f8624dad1-combined-ca-bundle\") pod \"swift-ring-rebalance-hrdll\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " pod="openstack/swift-ring-rebalance-hrdll" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.333596 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fqq2\" (UniqueName: 
\"kubernetes.io/projected/6c36f3e5-f790-4eda-9486-174f8624dad1-kube-api-access-4fqq2\") pod \"swift-ring-rebalance-hrdll\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " pod="openstack/swift-ring-rebalance-hrdll" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.333620 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6c36f3e5-f790-4eda-9486-174f8624dad1-etc-swift\") pod \"swift-ring-rebalance-hrdll\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " pod="openstack/swift-ring-rebalance-hrdll" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.333664 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6c36f3e5-f790-4eda-9486-174f8624dad1-ring-data-devices\") pod \"swift-ring-rebalance-hrdll\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " pod="openstack/swift-ring-rebalance-hrdll" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.333691 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6c36f3e5-f790-4eda-9486-174f8624dad1-swiftconf\") pod \"swift-ring-rebalance-hrdll\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " pod="openstack/swift-ring-rebalance-hrdll" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.334511 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6c36f3e5-f790-4eda-9486-174f8624dad1-etc-swift\") pod \"swift-ring-rebalance-hrdll\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " pod="openstack/swift-ring-rebalance-hrdll" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.334722 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c36f3e5-f790-4eda-9486-174f8624dad1-scripts\") pod \"swift-ring-rebalance-hrdll\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " pod="openstack/swift-ring-rebalance-hrdll" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.334764 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6c36f3e5-f790-4eda-9486-174f8624dad1-ring-data-devices\") pod \"swift-ring-rebalance-hrdll\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " pod="openstack/swift-ring-rebalance-hrdll" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.336754 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6c36f3e5-f790-4eda-9486-174f8624dad1-swiftconf\") pod \"swift-ring-rebalance-hrdll\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " pod="openstack/swift-ring-rebalance-hrdll" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.337521 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6c36f3e5-f790-4eda-9486-174f8624dad1-dispersionconf\") pod \"swift-ring-rebalance-hrdll\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " pod="openstack/swift-ring-rebalance-hrdll" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.337819 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c36f3e5-f790-4eda-9486-174f8624dad1-combined-ca-bundle\") pod \"swift-ring-rebalance-hrdll\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " 
pod="openstack/swift-ring-rebalance-hrdll" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.353847 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fqq2\" (UniqueName: \"kubernetes.io/projected/6c36f3e5-f790-4eda-9486-174f8624dad1-kube-api-access-4fqq2\") pod \"swift-ring-rebalance-hrdll\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " pod="openstack/swift-ring-rebalance-hrdll" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.403734 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-hrdll" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.451269 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cef0e580-4d18-480e-a57d-9c1b31405cd8" path="/var/lib/kubelet/pods/cef0e580-4d18-480e-a57d-9c1b31405cd8/volumes" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.452023 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eec01803-2cfd-4e97-a1c0-216c2622e913" path="/var/lib/kubelet/pods/eec01803-2cfd-4e97-a1c0-216c2622e913/volumes" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.713805 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" event={"ID":"788aae13-b274-4965-ac0c-8ac075c32567","Type":"ContainerStarted","Data":"e901006a072b768e5b2fffb51122f3ae9427a4d8bce325b789cb52d5dc2df384"} Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.714208 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.716631 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-wlh5p" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.718088 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.730202 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-wlh5p" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.759395 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" podStartSLOduration=6.759377712 podStartE2EDuration="6.759377712s" podCreationTimestamp="2026-02-19 15:27:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:27:37.738088501 +0000 UTC m=+1087.220118645" watchObservedRunningTime="2026-02-19 15:27:37.759377712 +0000 UTC m=+1087.241407836" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.762476 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.842393 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e49c1739-432b-445b-87d5-904af08961e4-scripts\") pod \"e49c1739-432b-445b-87d5-904af08961e4\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.842547 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e49c1739-432b-445b-87d5-904af08961e4-dispersionconf\") pod \"e49c1739-432b-445b-87d5-904af08961e4\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.842605 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e49c1739-432b-445b-87d5-904af08961e4-swiftconf\") pod \"e49c1739-432b-445b-87d5-904af08961e4\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.842639 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e49c1739-432b-445b-87d5-904af08961e4-etc-swift\") pod \"e49c1739-432b-445b-87d5-904af08961e4\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.842670 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e49c1739-432b-445b-87d5-904af08961e4-combined-ca-bundle\") pod \"e49c1739-432b-445b-87d5-904af08961e4\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.842691 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e49c1739-432b-445b-87d5-904af08961e4-ring-data-devices\") pod \"e49c1739-432b-445b-87d5-904af08961e4\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.842745 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnh57\" (UniqueName: \"kubernetes.io/projected/e49c1739-432b-445b-87d5-904af08961e4-kube-api-access-cnh57\") pod \"e49c1739-432b-445b-87d5-904af08961e4\" (UID: \"e49c1739-432b-445b-87d5-904af08961e4\") " Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.843780 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e49c1739-432b-445b-87d5-904af08961e4-scripts" (OuterVolumeSpecName: "scripts") pod 
"e49c1739-432b-445b-87d5-904af08961e4" (UID: "e49c1739-432b-445b-87d5-904af08961e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.845605 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e49c1739-432b-445b-87d5-904af08961e4-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e49c1739-432b-445b-87d5-904af08961e4" (UID: "e49c1739-432b-445b-87d5-904af08961e4"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.845666 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e49c1739-432b-445b-87d5-904af08961e4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e49c1739-432b-445b-87d5-904af08961e4" (UID: "e49c1739-432b-445b-87d5-904af08961e4"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.849460 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e49c1739-432b-445b-87d5-904af08961e4-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e49c1739-432b-445b-87d5-904af08961e4" (UID: "e49c1739-432b-445b-87d5-904af08961e4"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.852099 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e49c1739-432b-445b-87d5-904af08961e4-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e49c1739-432b-445b-87d5-904af08961e4" (UID: "e49c1739-432b-445b-87d5-904af08961e4"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.852417 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e49c1739-432b-445b-87d5-904af08961e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e49c1739-432b-445b-87d5-904af08961e4" (UID: "e49c1739-432b-445b-87d5-904af08961e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.856393 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e49c1739-432b-445b-87d5-904af08961e4-kube-api-access-cnh57" (OuterVolumeSpecName: "kube-api-access-cnh57") pod "e49c1739-432b-445b-87d5-904af08961e4" (UID: "e49c1739-432b-445b-87d5-904af08961e4"). InnerVolumeSpecName "kube-api-access-cnh57". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.896369 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-hrdll"] Feb 19 15:27:37 crc kubenswrapper[4810]: W0219 15:27:37.907781 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c36f3e5_f790_4eda_9486_174f8624dad1.slice/crio-d58b7703650c05a6b82d2a1f00bf308198af4dbc385a1e16d43b232960df6c3b WatchSource:0}: Error finding container d58b7703650c05a6b82d2a1f00bf308198af4dbc385a1e16d43b232960df6c3b: Status 404 returned error can't find the container with id d58b7703650c05a6b82d2a1f00bf308198af4dbc385a1e16d43b232960df6c3b Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.945130 4810 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e49c1739-432b-445b-87d5-904af08961e4-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.945168 4810 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e49c1739-432b-445b-87d5-904af08961e4-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.945181 4810 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e49c1739-432b-445b-87d5-904af08961e4-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.945191 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e49c1739-432b-445b-87d5-904af08961e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.945202 4810 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e49c1739-432b-445b-87d5-904af08961e4-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.945213 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnh57\" (UniqueName: \"kubernetes.io/projected/e49c1739-432b-445b-87d5-904af08961e4-kube-api-access-cnh57\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:37 crc kubenswrapper[4810]: I0219 15:27:37.945224 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e49c1739-432b-445b-87d5-904af08961e4-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:38 crc kubenswrapper[4810]: I0219 15:27:38.752246 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-wlh5p" Feb 19 15:27:38 crc kubenswrapper[4810]: I0219 15:27:38.752344 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hrdll" event={"ID":"6c36f3e5-f790-4eda-9486-174f8624dad1","Type":"ContainerStarted","Data":"d58b7703650c05a6b82d2a1f00bf308198af4dbc385a1e16d43b232960df6c3b"} Feb 19 15:27:38 crc kubenswrapper[4810]: I0219 15:27:38.822691 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-wlh5p"] Feb 19 15:27:38 crc kubenswrapper[4810]: I0219 15:27:38.835031 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-wlh5p"] Feb 19 15:27:39 crc kubenswrapper[4810]: I0219 15:27:39.448726 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e49c1739-432b-445b-87d5-904af08961e4" path="/var/lib/kubelet/pods/e49c1739-432b-445b-87d5-904af08961e4/volumes" Feb 19 15:27:39 crc kubenswrapper[4810]: I0219 15:27:39.653802 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:39 crc kubenswrapper[4810]: I0219 15:27:39.744798 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:39 crc kubenswrapper[4810]: I0219 15:27:39.764952 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:39 crc kubenswrapper[4810]: I0219 15:27:39.805495 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 19 15:27:39 crc kubenswrapper[4810]: I0219 15:27:39.983764 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 19 15:27:39 crc kubenswrapper[4810]: I0219 15:27:39.985210 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 19 15:27:39 crc kubenswrapper[4810]: I0219 15:27:39.989189 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 19 15:27:39 crc kubenswrapper[4810]: I0219 15:27:39.989523 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-f89wk" Feb 19 15:27:39 crc kubenswrapper[4810]: I0219 15:27:39.989590 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 19 15:27:39 crc kubenswrapper[4810]: I0219 15:27:39.989766 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 19 15:27:40 crc kubenswrapper[4810]: I0219 15:27:40.009670 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 15:27:40 crc kubenswrapper[4810]: I0219 15:27:40.083277 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22facf67-088b-410b-986a-c9e09b3d8feb-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"22facf67-088b-410b-986a-c9e09b3d8feb\") " pod="openstack/ovn-northd-0" Feb 19 15:27:40 crc kubenswrapper[4810]: I0219 15:27:40.083392 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22facf67-088b-410b-986a-c9e09b3d8feb-scripts\") pod \"ovn-northd-0\" (UID: \"22facf67-088b-410b-986a-c9e09b3d8feb\") " pod="openstack/ovn-northd-0" Feb 19 15:27:40 crc kubenswrapper[4810]: I0219 15:27:40.083421 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8675\" (UniqueName: \"kubernetes.io/projected/22facf67-088b-410b-986a-c9e09b3d8feb-kube-api-access-p8675\") pod \"ovn-northd-0\" (UID: \"22facf67-088b-410b-986a-c9e09b3d8feb\") " pod="openstack/ovn-northd-0" Feb 19 15:27:40 crc kubenswrapper[4810]: I0219 15:27:40.083585 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/22facf67-088b-410b-986a-c9e09b3d8feb-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"22facf67-088b-410b-986a-c9e09b3d8feb\") " pod="openstack/ovn-northd-0" Feb 19 15:27:40 crc kubenswrapper[4810]: I0219 15:27:40.083628 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/22facf67-088b-410b-986a-c9e09b3d8feb-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"22facf67-088b-410b-986a-c9e09b3d8feb\") " pod="openstack/ovn-northd-0" Feb 19 15:27:40 crc kubenswrapper[4810]: I0219 15:27:40.083655 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22facf67-088b-410b-986a-c9e09b3d8feb-config\") pod \"ovn-northd-0\" (UID: \"22facf67-088b-410b-986a-c9e09b3d8feb\") " pod="openstack/ovn-northd-0" Feb 19 15:27:40 crc kubenswrapper[4810]: I0219 15:27:40.083690 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/22facf67-088b-410b-986a-c9e09b3d8feb-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"22facf67-088b-410b-986a-c9e09b3d8feb\") " pod="openstack/ovn-northd-0" Feb 19 15:27:40 crc kubenswrapper[4810]: 
I0219 15:27:40.185462 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/22facf67-088b-410b-986a-c9e09b3d8feb-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"22facf67-088b-410b-986a-c9e09b3d8feb\") " pod="openstack/ovn-northd-0" Feb 19 15:27:40 crc kubenswrapper[4810]: I0219 15:27:40.185561 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22facf67-088b-410b-986a-c9e09b3d8feb-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"22facf67-088b-410b-986a-c9e09b3d8feb\") " pod="openstack/ovn-northd-0" Feb 19 15:27:40 crc kubenswrapper[4810]: I0219 15:27:40.185619 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22facf67-088b-410b-986a-c9e09b3d8feb-scripts\") pod \"ovn-northd-0\" (UID: \"22facf67-088b-410b-986a-c9e09b3d8feb\") " pod="openstack/ovn-northd-0" Feb 19 15:27:40 crc kubenswrapper[4810]: I0219 15:27:40.185645 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8675\" (UniqueName: \"kubernetes.io/projected/22facf67-088b-410b-986a-c9e09b3d8feb-kube-api-access-p8675\") pod \"ovn-northd-0\" (UID: \"22facf67-088b-410b-986a-c9e09b3d8feb\") " pod="openstack/ovn-northd-0" Feb 19 15:27:40 crc kubenswrapper[4810]: I0219 15:27:40.185721 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/22facf67-088b-410b-986a-c9e09b3d8feb-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"22facf67-088b-410b-986a-c9e09b3d8feb\") " pod="openstack/ovn-northd-0" Feb 19 15:27:40 crc kubenswrapper[4810]: I0219 15:27:40.185755 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/22facf67-088b-410b-986a-c9e09b3d8feb-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"22facf67-088b-410b-986a-c9e09b3d8feb\") " pod="openstack/ovn-northd-0" Feb 19 15:27:40 crc kubenswrapper[4810]: I0219 15:27:40.185778 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22facf67-088b-410b-986a-c9e09b3d8feb-config\") pod \"ovn-northd-0\" (UID: \"22facf67-088b-410b-986a-c9e09b3d8feb\") " pod="openstack/ovn-northd-0" Feb 19 15:27:40 crc kubenswrapper[4810]: I0219 15:27:40.186841 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22facf67-088b-410b-986a-c9e09b3d8feb-scripts\") pod \"ovn-northd-0\" (UID: \"22facf67-088b-410b-986a-c9e09b3d8feb\") " pod="openstack/ovn-northd-0" Feb 19 15:27:40 crc kubenswrapper[4810]: I0219 15:27:40.187164 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22facf67-088b-410b-986a-c9e09b3d8feb-config\") pod \"ovn-northd-0\" (UID: \"22facf67-088b-410b-986a-c9e09b3d8feb\") " pod="openstack/ovn-northd-0" Feb 19 15:27:40 crc kubenswrapper[4810]: I0219 15:27:40.193421 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/22facf67-088b-410b-986a-c9e09b3d8feb-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"22facf67-088b-410b-986a-c9e09b3d8feb\") " pod="openstack/ovn-northd-0" Feb 19 15:27:40 crc kubenswrapper[4810]: I0219 
15:27:40.193790 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/22facf67-088b-410b-986a-c9e09b3d8feb-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"22facf67-088b-410b-986a-c9e09b3d8feb\") " pod="openstack/ovn-northd-0" Feb 19 15:27:40 crc kubenswrapper[4810]: I0219 15:27:40.194412 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22facf67-088b-410b-986a-c9e09b3d8feb-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"22facf67-088b-410b-986a-c9e09b3d8feb\") " pod="openstack/ovn-northd-0" Feb 19 15:27:40 crc kubenswrapper[4810]: I0219 15:27:40.195258 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/22facf67-088b-410b-986a-c9e09b3d8feb-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"22facf67-088b-410b-986a-c9e09b3d8feb\") " pod="openstack/ovn-northd-0" Feb 19 15:27:40 crc kubenswrapper[4810]: I0219 15:27:40.219942 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8675\" (UniqueName: \"kubernetes.io/projected/22facf67-088b-410b-986a-c9e09b3d8feb-kube-api-access-p8675\") pod \"ovn-northd-0\" (UID: \"22facf67-088b-410b-986a-c9e09b3d8feb\") " pod="openstack/ovn-northd-0" Feb 19 15:27:40 crc kubenswrapper[4810]: I0219 15:27:40.328925 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 19 15:27:41 crc kubenswrapper[4810]: I0219 15:27:40.999828 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/37e2af25-5b30-4fb9-801e-f4a84d665540-etc-swift\") pod \"swift-storage-0\" (UID: \"37e2af25-5b30-4fb9-801e-f4a84d665540\") " pod="openstack/swift-storage-0" Feb 19 15:27:41 crc kubenswrapper[4810]: E0219 15:27:41.000156 4810 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 15:27:41 crc kubenswrapper[4810]: E0219 15:27:41.000205 4810 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 15:27:41 crc kubenswrapper[4810]: E0219 15:27:41.000301 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37e2af25-5b30-4fb9-801e-f4a84d665540-etc-swift podName:37e2af25-5b30-4fb9-801e-f4a84d665540 nodeName:}" failed. No retries permitted until 2026-02-19 15:27:49.000272383 +0000 UTC m=+1098.482302507 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/37e2af25-5b30-4fb9-801e-f4a84d665540-etc-swift") pod "swift-storage-0" (UID: "37e2af25-5b30-4fb9-801e-f4a84d665540") : configmap "swift-ring-files" not found Feb 19 15:27:41 crc kubenswrapper[4810]: I0219 15:27:41.781287 4810 generic.go:334] "Generic (PLEG): container finished" podID="a220bc57-3f31-4851-ad5c-9f61359f7de5" containerID="024e6cbe9539fc73096ba26de873a5cd3591fc2cd12d0be8dd23110aaa2f3ec6" exitCode=0 Feb 19 15:27:41 crc kubenswrapper[4810]: I0219 15:27:41.781363 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a220bc57-3f31-4851-ad5c-9f61359f7de5","Type":"ContainerDied","Data":"024e6cbe9539fc73096ba26de873a5cd3591fc2cd12d0be8dd23110aaa2f3ec6"} Feb 19 15:27:41 crc kubenswrapper[4810]: I0219 15:27:41.963888 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 19 15:27:42 crc kubenswrapper[4810]: I0219 15:27:42.307736 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" Feb 19 15:27:42 crc kubenswrapper[4810]: I0219 15:27:42.369129 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f8ff78869-24szl"] Feb 19 15:27:42 crc kubenswrapper[4810]: I0219 15:27:42.369486 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f8ff78869-24szl" podUID="5886c70b-ea09-4a9e-9c31-1689d32735a5" containerName="dnsmasq-dns" containerID="cri-o://19026669f4751c718dc10aae0046e726cc204eeb831deb5f5b82ecfb1a0aaee7" gracePeriod=10 Feb 19 15:27:42 crc kubenswrapper[4810]: I0219 15:27:42.370472 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f8ff78869-24szl" Feb 19 15:27:42 crc kubenswrapper[4810]: I0219 15:27:42.810540 4810 generic.go:334] "Generic (PLEG): container finished" podID="5886c70b-ea09-4a9e-9c31-1689d32735a5" containerID="19026669f4751c718dc10aae0046e726cc204eeb831deb5f5b82ecfb1a0aaee7" exitCode=0 Feb 19 15:27:42 crc kubenswrapper[4810]: I0219 15:27:42.810632 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8ff78869-24szl" event={"ID":"5886c70b-ea09-4a9e-9c31-1689d32735a5","Type":"ContainerDied","Data":"19026669f4751c718dc10aae0046e726cc204eeb831deb5f5b82ecfb1a0aaee7"} Feb 19 15:27:42 crc kubenswrapper[4810]: I0219 15:27:42.812733 4810 generic.go:334] "Generic (PLEG): container finished" podID="c0ffb8ce-a356-4416-b96c-49db30ff1947" containerID="bb49108e1faa12136cbb1db0e6f8a1e4d5a337b3554af6b9133dd5428aa7a353" exitCode=0 Feb 19 15:27:42 crc kubenswrapper[4810]: I0219 15:27:42.812789 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c0ffb8ce-a356-4416-b96c-49db30ff1947","Type":"ContainerDied","Data":"bb49108e1faa12136cbb1db0e6f8a1e4d5a337b3554af6b9133dd5428aa7a353"} Feb 19 15:27:42 crc kubenswrapper[4810]: I0219 15:27:42.834163 4810 generic.go:334] "Generic (PLEG): container finished" podID="30d11a24-9722-4e7a-9be5-f2bd00128167" containerID="1591fb62b46d0cb1a40e18641c32aabc2ceca62bd58d6c739d0fd39dbad2c5c9" exitCode=0 Feb 19 15:27:42 crc kubenswrapper[4810]: I0219 15:27:42.834748 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"30d11a24-9722-4e7a-9be5-f2bd00128167","Type":"ContainerDied","Data":"1591fb62b46d0cb1a40e18641c32aabc2ceca62bd58d6c739d0fd39dbad2c5c9"} Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.026041 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8ff78869-24szl" Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.136639 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5886c70b-ea09-4a9e-9c31-1689d32735a5-dns-svc\") pod \"5886c70b-ea09-4a9e-9c31-1689d32735a5\" (UID: \"5886c70b-ea09-4a9e-9c31-1689d32735a5\") " Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.136980 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5886c70b-ea09-4a9e-9c31-1689d32735a5-ovsdbserver-nb\") pod \"5886c70b-ea09-4a9e-9c31-1689d32735a5\" (UID: \"5886c70b-ea09-4a9e-9c31-1689d32735a5\") " Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.137036 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcqld\" (UniqueName: \"kubernetes.io/projected/5886c70b-ea09-4a9e-9c31-1689d32735a5-kube-api-access-hcqld\") pod \"5886c70b-ea09-4a9e-9c31-1689d32735a5\" (UID: \"5886c70b-ea09-4a9e-9c31-1689d32735a5\") " Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.137060 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5886c70b-ea09-4a9e-9c31-1689d32735a5-config\") pod \"5886c70b-ea09-4a9e-9c31-1689d32735a5\" (UID: \"5886c70b-ea09-4a9e-9c31-1689d32735a5\") " Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.137126 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5886c70b-ea09-4a9e-9c31-1689d32735a5-ovsdbserver-sb\") pod \"5886c70b-ea09-4a9e-9c31-1689d32735a5\" (UID: \"5886c70b-ea09-4a9e-9c31-1689d32735a5\") " Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.141676 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5886c70b-ea09-4a9e-9c31-1689d32735a5-kube-api-access-hcqld" (OuterVolumeSpecName: "kube-api-access-hcqld") pod "5886c70b-ea09-4a9e-9c31-1689d32735a5" (UID: "5886c70b-ea09-4a9e-9c31-1689d32735a5"). InnerVolumeSpecName "kube-api-access-hcqld". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.170081 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5886c70b-ea09-4a9e-9c31-1689d32735a5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5886c70b-ea09-4a9e-9c31-1689d32735a5" (UID: "5886c70b-ea09-4a9e-9c31-1689d32735a5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.174991 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5886c70b-ea09-4a9e-9c31-1689d32735a5-config" (OuterVolumeSpecName: "config") pod "5886c70b-ea09-4a9e-9c31-1689d32735a5" (UID: "5886c70b-ea09-4a9e-9c31-1689d32735a5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.177462 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5886c70b-ea09-4a9e-9c31-1689d32735a5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5886c70b-ea09-4a9e-9c31-1689d32735a5" (UID: "5886c70b-ea09-4a9e-9c31-1689d32735a5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.181768 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5886c70b-ea09-4a9e-9c31-1689d32735a5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5886c70b-ea09-4a9e-9c31-1689d32735a5" (UID: "5886c70b-ea09-4a9e-9c31-1689d32735a5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.238936 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5886c70b-ea09-4a9e-9c31-1689d32735a5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.238976 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcqld\" (UniqueName: \"kubernetes.io/projected/5886c70b-ea09-4a9e-9c31-1689d32735a5-kube-api-access-hcqld\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.238991 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5886c70b-ea09-4a9e-9c31-1689d32735a5-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.239003 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5886c70b-ea09-4a9e-9c31-1689d32735a5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.239016 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5886c70b-ea09-4a9e-9c31-1689d32735a5-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.239921 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.848263 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c0ffb8ce-a356-4416-b96c-49db30ff1947","Type":"ContainerStarted","Data":"fddcd1d4e34f6a024682b173f600a45601e51b7ec78c7cf77a5412fe627b7032"} Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.850897 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"30d11a24-9722-4e7a-9be5-f2bd00128167","Type":"ContainerStarted","Data":"a246b7cef810a6f3a4206ac755dc815cc11f710c1574bd45357a8cc029f8ed2d"} Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.853987 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hrdll" event={"ID":"6c36f3e5-f790-4eda-9486-174f8624dad1","Type":"ContainerStarted","Data":"6446836f423aff91810aae3fb429a61dc4d7160b1082cc50f6cb50a21f6642d5"} Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.856683 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8ff78869-24szl" 
event={"ID":"5886c70b-ea09-4a9e-9c31-1689d32735a5","Type":"ContainerDied","Data":"bb8c011e783721ee6fc9b0c30d19bca5636e6fa4d84780893c5d094407e82425"} Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.856715 4810 scope.go:117] "RemoveContainer" containerID="19026669f4751c718dc10aae0046e726cc204eeb831deb5f5b82ecfb1a0aaee7" Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.856731 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8ff78869-24szl" Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.859547 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"22facf67-088b-410b-986a-c9e09b3d8feb","Type":"ContainerStarted","Data":"47d0d3cf1f575b1c29b0c72f9905fc06df7cfecb4af25e3bbbf84ebdd832cd17"} Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.897785 4810 scope.go:117] "RemoveContainer" containerID="c07f2dcf7043fcd524ebc329815b7a17ca72dbefb72ca802efe8c1da736fd8ad" Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.899827 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=28.53512423 podStartE2EDuration="37.899804708s" podCreationTimestamp="2026-02-19 15:27:06 +0000 UTC" firstStartedPulling="2026-02-19 15:27:20.81438485 +0000 UTC m=+1070.296414974" lastFinishedPulling="2026-02-19 15:27:30.179065328 +0000 UTC m=+1079.661095452" observedRunningTime="2026-02-19 15:27:43.876854356 +0000 UTC m=+1093.358884480" watchObservedRunningTime="2026-02-19 15:27:43.899804708 +0000 UTC m=+1093.381834832" Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.909220 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f8ff78869-24szl"] Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.918032 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f8ff78869-24szl"] Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.927512 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-hrdll" podStartSLOduration=2.087536813 podStartE2EDuration="6.927496956s" podCreationTimestamp="2026-02-19 15:27:37 +0000 UTC" firstStartedPulling="2026-02-19 15:27:37.910240005 +0000 UTC m=+1087.392270129" lastFinishedPulling="2026-02-19 15:27:42.750200148 +0000 UTC m=+1092.232230272" observedRunningTime="2026-02-19 15:27:43.913999855 +0000 UTC m=+1093.396029979" watchObservedRunningTime="2026-02-19 15:27:43.927496956 +0000 UTC m=+1093.409527080" Feb 19 15:27:43 crc kubenswrapper[4810]: I0219 15:27:43.946956 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=27.043421211 podStartE2EDuration="35.946936352s" podCreationTimestamp="2026-02-19 15:27:08 +0000 UTC" firstStartedPulling="2026-02-19 15:27:21.66861939 +0000 UTC m=+1071.150649514" lastFinishedPulling="2026-02-19 15:27:30.572134511 +0000 UTC m=+1080.054164655" observedRunningTime="2026-02-19 15:27:43.937965472 +0000 UTC m=+1093.419995596" watchObservedRunningTime="2026-02-19 15:27:43.946936352 +0000 UTC m=+1093.428966476" Feb 19 15:27:44 crc kubenswrapper[4810]: I0219 15:27:44.871436 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"22facf67-088b-410b-986a-c9e09b3d8feb","Type":"ContainerStarted","Data":"24e82aa2aad86302ff6a11c62bab36d7447048d609cf0f7fd7a7e16cd1fe62d8"} Feb 19 15:27:44 crc kubenswrapper[4810]: I0219 15:27:44.871681 4810 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"22facf67-088b-410b-986a-c9e09b3d8feb","Type":"ContainerStarted","Data":"b7ff00e8773edef7e35c5a8773e5619fc0d76a58227679370dd65bafea439c9d"} Feb 19 15:27:44 crc kubenswrapper[4810]: I0219 15:27:44.871699 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 19 15:27:44 crc kubenswrapper[4810]: I0219 15:27:44.907264 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=5.390618022 podStartE2EDuration="5.907244248s" podCreationTimestamp="2026-02-19 15:27:39 +0000 UTC" firstStartedPulling="2026-02-19 15:27:43.240378346 +0000 UTC m=+1092.722408470" lastFinishedPulling="2026-02-19 15:27:43.757004572 +0000 UTC m=+1093.239034696" observedRunningTime="2026-02-19 15:27:44.895288686 +0000 UTC m=+1094.377318810" watchObservedRunningTime="2026-02-19 15:27:44.907244248 +0000 UTC m=+1094.389274372" Feb 19 15:27:45 crc kubenswrapper[4810]: I0219 15:27:45.453054 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5886c70b-ea09-4a9e-9c31-1689d32735a5" path="/var/lib/kubelet/pods/5886c70b-ea09-4a9e-9c31-1689d32735a5/volumes" Feb 19 15:27:47 crc kubenswrapper[4810]: I0219 15:27:47.839719 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 19 15:27:47 crc kubenswrapper[4810]: I0219 15:27:47.839994 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 19 15:27:49 crc kubenswrapper[4810]: I0219 15:27:49.082192 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/37e2af25-5b30-4fb9-801e-f4a84d665540-etc-swift\") pod \"swift-storage-0\" (UID: \"37e2af25-5b30-4fb9-801e-f4a84d665540\") " pod="openstack/swift-storage-0" Feb 19 15:27:49 crc kubenswrapper[4810]: E0219 15:27:49.082433 4810 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 15:27:49 crc kubenswrapper[4810]: E0219 15:27:49.082600 4810 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 15:27:49 crc kubenswrapper[4810]: E0219 15:27:49.082674 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37e2af25-5b30-4fb9-801e-f4a84d665540-etc-swift podName:37e2af25-5b30-4fb9-801e-f4a84d665540 nodeName:}" failed. No retries permitted until 2026-02-19 15:28:05.082639474 +0000 UTC m=+1114.564669598 (durationBeforeRetry 16s). 
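Each "Observed pod startup duration" record above is internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). Below is a check of the openstack-galera-0 record using the timestamps exactly as printed; the identity is inferred from these records, not from kubelet source:

    package main

    import (
        "fmt"
        "time"
    )

    // Recompute the openstack-galera-0 startup-latency figures from the
    // timestamps printed in the log above.
    func main() {
        parse := func(s string) time.Time {
            t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
            if err != nil {
                panic(err)
            }
            return t
        }
        created := parse("2026-02-19 15:27:06 +0000 UTC")           // podCreationTimestamp
        running := parse("2026-02-19 15:27:43.899804708 +0000 UTC") // watchObservedRunningTime
        pullStart := parse("2026-02-19 15:27:20.81438485 +0000 UTC")
        pullEnd := parse("2026-02-19 15:27:30.179065328 +0000 UTC")

        e2e := running.Sub(created)         // 37.899804708s == podStartE2EDuration
        slo := e2e - pullEnd.Sub(pullStart) // 28.53512423s  == podStartSLOduration
        fmt.Println(e2e, slo)
    }

The same arithmetic reproduces the swift-ring-rebalance-hrdll, openstack-cell1-galera-0, and ovn-northd-0 records (to the last digit when the monotonic m=+... offsets are used for the pull window).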
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/37e2af25-5b30-4fb9-801e-f4a84d665540-etc-swift") pod "swift-storage-0" (UID: "37e2af25-5b30-4fb9-801e-f4a84d665540") : configmap "swift-ring-files" not found Feb 19 15:27:49 crc kubenswrapper[4810]: I0219 15:27:49.358160 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:49 crc kubenswrapper[4810]: I0219 15:27:49.358230 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:49 crc kubenswrapper[4810]: I0219 15:27:49.505473 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:50 crc kubenswrapper[4810]: I0219 15:27:50.041545 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 19 15:27:50 crc kubenswrapper[4810]: I0219 15:27:50.920282 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a220bc57-3f31-4851-ad5c-9f61359f7de5","Type":"ContainerStarted","Data":"e279b42557dcc4bac021262be680408c24c0a74b26422b2a204f61141627de2f"} Feb 19 15:27:51 crc kubenswrapper[4810]: I0219 15:27:51.949410 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 19 15:27:52 crc kubenswrapper[4810]: I0219 15:27:52.055910 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 19 15:27:52 crc kubenswrapper[4810]: I0219 15:27:52.935820 4810 generic.go:334] "Generic (PLEG): container finished" podID="6c36f3e5-f790-4eda-9486-174f8624dad1" containerID="6446836f423aff91810aae3fb429a61dc4d7160b1082cc50f6cb50a21f6642d5" exitCode=0 Feb 19 15:27:52 crc kubenswrapper[4810]: I0219 15:27:52.935906 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hrdll" event={"ID":"6c36f3e5-f790-4eda-9486-174f8624dad1","Type":"ContainerDied","Data":"6446836f423aff91810aae3fb429a61dc4d7160b1082cc50f6cb50a21f6642d5"} Feb 19 15:27:52 crc kubenswrapper[4810]: I0219 15:27:52.938352 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a220bc57-3f31-4851-ad5c-9f61359f7de5","Type":"ContainerStarted","Data":"96840091b1cea0064312675928d8c3948ff53222c6323509a84e0412a49c9891"} Feb 19 15:27:54 crc kubenswrapper[4810]: I0219 15:27:54.517943 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-hrdll" Feb 19 15:27:54 crc kubenswrapper[4810]: I0219 15:27:54.677302 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6c36f3e5-f790-4eda-9486-174f8624dad1-dispersionconf\") pod \"6c36f3e5-f790-4eda-9486-174f8624dad1\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " Feb 19 15:27:54 crc kubenswrapper[4810]: I0219 15:27:54.677363 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6c36f3e5-f790-4eda-9486-174f8624dad1-ring-data-devices\") pod \"6c36f3e5-f790-4eda-9486-174f8624dad1\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " Feb 19 15:27:54 crc kubenswrapper[4810]: I0219 15:27:54.677407 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fqq2\" (UniqueName: \"kubernetes.io/projected/6c36f3e5-f790-4eda-9486-174f8624dad1-kube-api-access-4fqq2\") pod \"6c36f3e5-f790-4eda-9486-174f8624dad1\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " Feb 19 15:27:54 crc kubenswrapper[4810]: I0219 15:27:54.677437 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c36f3e5-f790-4eda-9486-174f8624dad1-scripts\") pod \"6c36f3e5-f790-4eda-9486-174f8624dad1\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " Feb 19 15:27:54 crc kubenswrapper[4810]: I0219 15:27:54.677492 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6c36f3e5-f790-4eda-9486-174f8624dad1-swiftconf\") pod \"6c36f3e5-f790-4eda-9486-174f8624dad1\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " Feb 19 15:27:54 crc kubenswrapper[4810]: I0219 15:27:54.677544 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c36f3e5-f790-4eda-9486-174f8624dad1-combined-ca-bundle\") pod \"6c36f3e5-f790-4eda-9486-174f8624dad1\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " Feb 19 15:27:54 crc kubenswrapper[4810]: I0219 15:27:54.677597 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6c36f3e5-f790-4eda-9486-174f8624dad1-etc-swift\") pod \"6c36f3e5-f790-4eda-9486-174f8624dad1\" (UID: \"6c36f3e5-f790-4eda-9486-174f8624dad1\") " Feb 19 15:27:54 crc kubenswrapper[4810]: I0219 15:27:54.678643 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c36f3e5-f790-4eda-9486-174f8624dad1-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6c36f3e5-f790-4eda-9486-174f8624dad1" (UID: "6c36f3e5-f790-4eda-9486-174f8624dad1"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:27:54 crc kubenswrapper[4810]: I0219 15:27:54.678973 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c36f3e5-f790-4eda-9486-174f8624dad1-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6c36f3e5-f790-4eda-9486-174f8624dad1" (UID: "6c36f3e5-f790-4eda-9486-174f8624dad1"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:27:54 crc kubenswrapper[4810]: I0219 15:27:54.685653 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c36f3e5-f790-4eda-9486-174f8624dad1-kube-api-access-4fqq2" (OuterVolumeSpecName: "kube-api-access-4fqq2") pod "6c36f3e5-f790-4eda-9486-174f8624dad1" (UID: "6c36f3e5-f790-4eda-9486-174f8624dad1"). InnerVolumeSpecName "kube-api-access-4fqq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:27:54 crc kubenswrapper[4810]: I0219 15:27:54.688304 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c36f3e5-f790-4eda-9486-174f8624dad1-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6c36f3e5-f790-4eda-9486-174f8624dad1" (UID: "6c36f3e5-f790-4eda-9486-174f8624dad1"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:27:54 crc kubenswrapper[4810]: I0219 15:27:54.699139 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c36f3e5-f790-4eda-9486-174f8624dad1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c36f3e5-f790-4eda-9486-174f8624dad1" (UID: "6c36f3e5-f790-4eda-9486-174f8624dad1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:27:54 crc kubenswrapper[4810]: I0219 15:27:54.700123 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c36f3e5-f790-4eda-9486-174f8624dad1-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6c36f3e5-f790-4eda-9486-174f8624dad1" (UID: "6c36f3e5-f790-4eda-9486-174f8624dad1"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:27:54 crc kubenswrapper[4810]: I0219 15:27:54.704942 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c36f3e5-f790-4eda-9486-174f8624dad1-scripts" (OuterVolumeSpecName: "scripts") pod "6c36f3e5-f790-4eda-9486-174f8624dad1" (UID: "6c36f3e5-f790-4eda-9486-174f8624dad1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:27:54 crc kubenswrapper[4810]: I0219 15:27:54.779813 4810 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6c36f3e5-f790-4eda-9486-174f8624dad1-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:54 crc kubenswrapper[4810]: I0219 15:27:54.779877 4810 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6c36f3e5-f790-4eda-9486-174f8624dad1-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:54 crc kubenswrapper[4810]: I0219 15:27:54.779890 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fqq2\" (UniqueName: \"kubernetes.io/projected/6c36f3e5-f790-4eda-9486-174f8624dad1-kube-api-access-4fqq2\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:54 crc kubenswrapper[4810]: I0219 15:27:54.779901 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c36f3e5-f790-4eda-9486-174f8624dad1-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:54 crc kubenswrapper[4810]: I0219 15:27:54.779909 4810 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6c36f3e5-f790-4eda-9486-174f8624dad1-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:54 crc kubenswrapper[4810]: I0219 15:27:54.779932 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c36f3e5-f790-4eda-9486-174f8624dad1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:54 crc kubenswrapper[4810]: I0219 15:27:54.779942 4810 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6c36f3e5-f790-4eda-9486-174f8624dad1-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:55 crc kubenswrapper[4810]: I0219 15:27:55.220895 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hrdll" event={"ID":"6c36f3e5-f790-4eda-9486-174f8624dad1","Type":"ContainerDied","Data":"d58b7703650c05a6b82d2a1f00bf308198af4dbc385a1e16d43b232960df6c3b"} Feb 19 15:27:55 crc kubenswrapper[4810]: I0219 15:27:55.220938 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d58b7703650c05a6b82d2a1f00bf308198af4dbc385a1e16d43b232960df6c3b" Feb 19 15:27:55 crc kubenswrapper[4810]: I0219 15:27:55.220963 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-hrdll" Feb 19 15:27:56 crc kubenswrapper[4810]: I0219 15:27:56.609680 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-z6ghl"] Feb 19 15:27:56 crc kubenswrapper[4810]: E0219 15:27:56.610346 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5886c70b-ea09-4a9e-9c31-1689d32735a5" containerName="init" Feb 19 15:27:56 crc kubenswrapper[4810]: I0219 15:27:56.610365 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5886c70b-ea09-4a9e-9c31-1689d32735a5" containerName="init" Feb 19 15:27:56 crc kubenswrapper[4810]: E0219 15:27:56.610397 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5886c70b-ea09-4a9e-9c31-1689d32735a5" containerName="dnsmasq-dns" Feb 19 15:27:56 crc kubenswrapper[4810]: I0219 15:27:56.610405 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5886c70b-ea09-4a9e-9c31-1689d32735a5" containerName="dnsmasq-dns" Feb 19 15:27:56 crc kubenswrapper[4810]: E0219 15:27:56.610423 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c36f3e5-f790-4eda-9486-174f8624dad1" containerName="swift-ring-rebalance" Feb 19 15:27:56 crc kubenswrapper[4810]: I0219 15:27:56.610432 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c36f3e5-f790-4eda-9486-174f8624dad1" containerName="swift-ring-rebalance" Feb 19 15:27:56 crc kubenswrapper[4810]: I0219 15:27:56.610594 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="5886c70b-ea09-4a9e-9c31-1689d32735a5" containerName="dnsmasq-dns" Feb 19 15:27:56 crc kubenswrapper[4810]: I0219 15:27:56.610605 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c36f3e5-f790-4eda-9486-174f8624dad1" containerName="swift-ring-rebalance" Feb 19 15:27:56 crc kubenswrapper[4810]: I0219 15:27:56.611138 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-z6ghl" Feb 19 15:27:56 crc kubenswrapper[4810]: I0219 15:27:56.618711 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 19 15:27:56 crc kubenswrapper[4810]: I0219 15:27:56.621481 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-z6ghl"] Feb 19 15:27:56 crc kubenswrapper[4810]: I0219 15:27:56.710224 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cf99137-c194-466c-b92b-fbda63f4b3d5-operator-scripts\") pod \"root-account-create-update-z6ghl\" (UID: \"0cf99137-c194-466c-b92b-fbda63f4b3d5\") " pod="openstack/root-account-create-update-z6ghl" Feb 19 15:27:56 crc kubenswrapper[4810]: I0219 15:27:56.710298 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7k59\" (UniqueName: \"kubernetes.io/projected/0cf99137-c194-466c-b92b-fbda63f4b3d5-kube-api-access-c7k59\") pod \"root-account-create-update-z6ghl\" (UID: \"0cf99137-c194-466c-b92b-fbda63f4b3d5\") " pod="openstack/root-account-create-update-z6ghl" Feb 19 15:27:56 crc kubenswrapper[4810]: I0219 15:27:56.812461 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cf99137-c194-466c-b92b-fbda63f4b3d5-operator-scripts\") pod \"root-account-create-update-z6ghl\" (UID: \"0cf99137-c194-466c-b92b-fbda63f4b3d5\") " pod="openstack/root-account-create-update-z6ghl" Feb 19 15:27:56 crc kubenswrapper[4810]: I0219 15:27:56.812911 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7k59\" (UniqueName: \"kubernetes.io/projected/0cf99137-c194-466c-b92b-fbda63f4b3d5-kube-api-access-c7k59\") pod \"root-account-create-update-z6ghl\" (UID: \"0cf99137-c194-466c-b92b-fbda63f4b3d5\") " pod="openstack/root-account-create-update-z6ghl" Feb 19 15:27:56 crc kubenswrapper[4810]: I0219 15:27:56.814483 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cf99137-c194-466c-b92b-fbda63f4b3d5-operator-scripts\") pod \"root-account-create-update-z6ghl\" (UID: \"0cf99137-c194-466c-b92b-fbda63f4b3d5\") " pod="openstack/root-account-create-update-z6ghl" Feb 19 15:27:56 crc kubenswrapper[4810]: I0219 15:27:56.833095 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7k59\" (UniqueName: \"kubernetes.io/projected/0cf99137-c194-466c-b92b-fbda63f4b3d5-kube-api-access-c7k59\") pod \"root-account-create-update-z6ghl\" (UID: \"0cf99137-c194-466c-b92b-fbda63f4b3d5\") " pod="openstack/root-account-create-update-z6ghl" Feb 19 15:27:56 crc kubenswrapper[4810]: I0219 15:27:56.932754 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-z6ghl" Feb 19 15:27:57 crc kubenswrapper[4810]: I0219 15:27:57.401841 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-z6ghl"] Feb 19 15:27:58 crc kubenswrapper[4810]: I0219 15:27:58.245312 4810 generic.go:334] "Generic (PLEG): container finished" podID="0cf99137-c194-466c-b92b-fbda63f4b3d5" containerID="99c27801bb39f1082a20de11443ab5b4c03a227dc67e8dce6456d77eb0a7c2db" exitCode=0 Feb 19 15:27:58 crc kubenswrapper[4810]: I0219 15:27:58.245366 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z6ghl" event={"ID":"0cf99137-c194-466c-b92b-fbda63f4b3d5","Type":"ContainerDied","Data":"99c27801bb39f1082a20de11443ab5b4c03a227dc67e8dce6456d77eb0a7c2db"} Feb 19 15:27:58 crc kubenswrapper[4810]: I0219 15:27:58.245743 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z6ghl" event={"ID":"0cf99137-c194-466c-b92b-fbda63f4b3d5","Type":"ContainerStarted","Data":"1f2afc66c4ebe08456c6e61c56714babfb410fb2b236afdbdd2a7670799cf58c"} Feb 19 15:27:59 crc kubenswrapper[4810]: I0219 15:27:59.260016 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a220bc57-3f31-4851-ad5c-9f61359f7de5","Type":"ContainerStarted","Data":"e5ac3906ff8a232fe91505ae472e019caa241a902c02da4ff118ba00b0e5d016"} Feb 19 15:27:59 crc kubenswrapper[4810]: I0219 15:27:59.295183 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=11.648452195 podStartE2EDuration="48.295164457s" podCreationTimestamp="2026-02-19 15:27:11 +0000 UTC" firstStartedPulling="2026-02-19 15:27:21.677479877 +0000 UTC m=+1071.159510001" lastFinishedPulling="2026-02-19 15:27:58.324192139 +0000 UTC m=+1107.806222263" observedRunningTime="2026-02-19 15:27:59.28591405 +0000 UTC m=+1108.767944204" watchObservedRunningTime="2026-02-19 15:27:59.295164457 +0000 UTC m=+1108.777194581" Feb 19 15:27:59 crc kubenswrapper[4810]: I0219 15:27:59.654202 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-z6ghl" Feb 19 15:27:59 crc kubenswrapper[4810]: I0219 15:27:59.769933 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7k59\" (UniqueName: \"kubernetes.io/projected/0cf99137-c194-466c-b92b-fbda63f4b3d5-kube-api-access-c7k59\") pod \"0cf99137-c194-466c-b92b-fbda63f4b3d5\" (UID: \"0cf99137-c194-466c-b92b-fbda63f4b3d5\") " Feb 19 15:27:59 crc kubenswrapper[4810]: I0219 15:27:59.770070 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cf99137-c194-466c-b92b-fbda63f4b3d5-operator-scripts\") pod \"0cf99137-c194-466c-b92b-fbda63f4b3d5\" (UID: \"0cf99137-c194-466c-b92b-fbda63f4b3d5\") " Feb 19 15:27:59 crc kubenswrapper[4810]: I0219 15:27:59.770879 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cf99137-c194-466c-b92b-fbda63f4b3d5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0cf99137-c194-466c-b92b-fbda63f4b3d5" (UID: "0cf99137-c194-466c-b92b-fbda63f4b3d5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:27:59 crc kubenswrapper[4810]: I0219 15:27:59.778548 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cf99137-c194-466c-b92b-fbda63f4b3d5-kube-api-access-c7k59" (OuterVolumeSpecName: "kube-api-access-c7k59") pod "0cf99137-c194-466c-b92b-fbda63f4b3d5" (UID: "0cf99137-c194-466c-b92b-fbda63f4b3d5"). InnerVolumeSpecName "kube-api-access-c7k59". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:27:59 crc kubenswrapper[4810]: I0219 15:27:59.828256 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-pd6hg"] Feb 19 15:27:59 crc kubenswrapper[4810]: E0219 15:27:59.828719 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cf99137-c194-466c-b92b-fbda63f4b3d5" containerName="mariadb-account-create-update" Feb 19 15:27:59 crc kubenswrapper[4810]: I0219 15:27:59.828740 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cf99137-c194-466c-b92b-fbda63f4b3d5" containerName="mariadb-account-create-update" Feb 19 15:27:59 crc kubenswrapper[4810]: I0219 15:27:59.828920 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cf99137-c194-466c-b92b-fbda63f4b3d5" containerName="mariadb-account-create-update" Feb 19 15:27:59 crc kubenswrapper[4810]: I0219 15:27:59.829565 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pd6hg" Feb 19 15:27:59 crc kubenswrapper[4810]: I0219 15:27:59.836216 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-pd6hg"] Feb 19 15:27:59 crc kubenswrapper[4810]: I0219 15:27:59.872464 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6g2w\" (UniqueName: \"kubernetes.io/projected/2222b6ca-79cd-48d7-b262-87e5cd4db6b1-kube-api-access-f6g2w\") pod \"glance-db-create-pd6hg\" (UID: \"2222b6ca-79cd-48d7-b262-87e5cd4db6b1\") " pod="openstack/glance-db-create-pd6hg" Feb 19 15:27:59 crc kubenswrapper[4810]: I0219 15:27:59.872568 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2222b6ca-79cd-48d7-b262-87e5cd4db6b1-operator-scripts\") pod \"glance-db-create-pd6hg\" (UID: \"2222b6ca-79cd-48d7-b262-87e5cd4db6b1\") " pod="openstack/glance-db-create-pd6hg" Feb 19 15:27:59 crc kubenswrapper[4810]: I0219 15:27:59.873006 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7k59\" (UniqueName: \"kubernetes.io/projected/0cf99137-c194-466c-b92b-fbda63f4b3d5-kube-api-access-c7k59\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:59 crc kubenswrapper[4810]: I0219 15:27:59.873049 4810 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cf99137-c194-466c-b92b-fbda63f4b3d5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:27:59 crc kubenswrapper[4810]: I0219 15:27:59.936112 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-79a4-account-create-update-mrm9x"] Feb 19 15:27:59 crc kubenswrapper[4810]: I0219 15:27:59.937821 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-79a4-account-create-update-mrm9x" Feb 19 15:27:59 crc kubenswrapper[4810]: I0219 15:27:59.939631 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 19 15:27:59 crc kubenswrapper[4810]: I0219 15:27:59.943874 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-79a4-account-create-update-mrm9x"] Feb 19 15:27:59 crc kubenswrapper[4810]: I0219 15:27:59.974887 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4b22749-0497-48c2-b943-2c48aef05707-operator-scripts\") pod \"glance-79a4-account-create-update-mrm9x\" (UID: \"b4b22749-0497-48c2-b943-2c48aef05707\") " pod="openstack/glance-79a4-account-create-update-mrm9x" Feb 19 15:27:59 crc kubenswrapper[4810]: I0219 15:27:59.974924 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cpfn\" (UniqueName: \"kubernetes.io/projected/b4b22749-0497-48c2-b943-2c48aef05707-kube-api-access-6cpfn\") pod \"glance-79a4-account-create-update-mrm9x\" (UID: \"b4b22749-0497-48c2-b943-2c48aef05707\") " pod="openstack/glance-79a4-account-create-update-mrm9x" Feb 19 15:27:59 crc kubenswrapper[4810]: I0219 15:27:59.974997 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6g2w\" (UniqueName: \"kubernetes.io/projected/2222b6ca-79cd-48d7-b262-87e5cd4db6b1-kube-api-access-f6g2w\") pod \"glance-db-create-pd6hg\" (UID: \"2222b6ca-79cd-48d7-b262-87e5cd4db6b1\") " pod="openstack/glance-db-create-pd6hg" Feb 19 15:27:59 crc kubenswrapper[4810]: I0219 15:27:59.975057 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2222b6ca-79cd-48d7-b262-87e5cd4db6b1-operator-scripts\") pod \"glance-db-create-pd6hg\" (UID: \"2222b6ca-79cd-48d7-b262-87e5cd4db6b1\") " pod="openstack/glance-db-create-pd6hg" Feb 19 15:27:59 crc kubenswrapper[4810]: I0219 15:27:59.975763 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2222b6ca-79cd-48d7-b262-87e5cd4db6b1-operator-scripts\") pod \"glance-db-create-pd6hg\" (UID: \"2222b6ca-79cd-48d7-b262-87e5cd4db6b1\") " pod="openstack/glance-db-create-pd6hg" Feb 19 15:27:59 crc kubenswrapper[4810]: I0219 15:27:59.994169 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6g2w\" (UniqueName: \"kubernetes.io/projected/2222b6ca-79cd-48d7-b262-87e5cd4db6b1-kube-api-access-f6g2w\") pod \"glance-db-create-pd6hg\" (UID: \"2222b6ca-79cd-48d7-b262-87e5cd4db6b1\") " pod="openstack/glance-db-create-pd6hg" Feb 19 15:28:00 crc kubenswrapper[4810]: I0219 15:28:00.076722 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4b22749-0497-48c2-b943-2c48aef05707-operator-scripts\") pod \"glance-79a4-account-create-update-mrm9x\" (UID: \"b4b22749-0497-48c2-b943-2c48aef05707\") " pod="openstack/glance-79a4-account-create-update-mrm9x" Feb 19 15:28:00 crc kubenswrapper[4810]: I0219 15:28:00.076785 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cpfn\" (UniqueName: \"kubernetes.io/projected/b4b22749-0497-48c2-b943-2c48aef05707-kube-api-access-6cpfn\") pod 
\"glance-79a4-account-create-update-mrm9x\" (UID: \"b4b22749-0497-48c2-b943-2c48aef05707\") " pod="openstack/glance-79a4-account-create-update-mrm9x" Feb 19 15:28:00 crc kubenswrapper[4810]: I0219 15:28:00.077691 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4b22749-0497-48c2-b943-2c48aef05707-operator-scripts\") pod \"glance-79a4-account-create-update-mrm9x\" (UID: \"b4b22749-0497-48c2-b943-2c48aef05707\") " pod="openstack/glance-79a4-account-create-update-mrm9x" Feb 19 15:28:00 crc kubenswrapper[4810]: I0219 15:28:00.092425 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cpfn\" (UniqueName: \"kubernetes.io/projected/b4b22749-0497-48c2-b943-2c48aef05707-kube-api-access-6cpfn\") pod \"glance-79a4-account-create-update-mrm9x\" (UID: \"b4b22749-0497-48c2-b943-2c48aef05707\") " pod="openstack/glance-79a4-account-create-update-mrm9x" Feb 19 15:28:00 crc kubenswrapper[4810]: I0219 15:28:00.155521 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pd6hg" Feb 19 15:28:00 crc kubenswrapper[4810]: I0219 15:28:00.263619 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-79a4-account-create-update-mrm9x" Feb 19 15:28:00 crc kubenswrapper[4810]: I0219 15:28:00.271837 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-z6ghl" Feb 19 15:28:00 crc kubenswrapper[4810]: I0219 15:28:00.271882 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z6ghl" event={"ID":"0cf99137-c194-466c-b92b-fbda63f4b3d5","Type":"ContainerDied","Data":"1f2afc66c4ebe08456c6e61c56714babfb410fb2b236afdbdd2a7670799cf58c"} Feb 19 15:28:00 crc kubenswrapper[4810]: I0219 15:28:00.271910 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f2afc66c4ebe08456c6e61c56714babfb410fb2b236afdbdd2a7670799cf58c" Feb 19 15:28:00 crc kubenswrapper[4810]: I0219 15:28:00.399697 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 19 15:28:00 crc kubenswrapper[4810]: I0219 15:28:00.592682 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-pd6hg"] Feb 19 15:28:00 crc kubenswrapper[4810]: I0219 15:28:00.728669 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-79a4-account-create-update-mrm9x"] Feb 19 15:28:00 crc kubenswrapper[4810]: I0219 15:28:00.867114 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-8fq2p"] Feb 19 15:28:00 crc kubenswrapper[4810]: I0219 15:28:00.868276 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-8fq2p" Feb 19 15:28:00 crc kubenswrapper[4810]: I0219 15:28:00.877425 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-8fq2p"] Feb 19 15:28:00 crc kubenswrapper[4810]: I0219 15:28:00.890363 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-002d-account-create-update-6kk29"] Feb 19 15:28:00 crc kubenswrapper[4810]: I0219 15:28:00.891462 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-002d-account-create-update-6kk29" Feb 19 15:28:00 crc kubenswrapper[4810]: I0219 15:28:00.892611 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgx2j\" (UniqueName: \"kubernetes.io/projected/bca24a94-16a8-4b5b-9d99-bc98919feb21-kube-api-access-lgx2j\") pod \"keystone-db-create-8fq2p\" (UID: \"bca24a94-16a8-4b5b-9d99-bc98919feb21\") " pod="openstack/keystone-db-create-8fq2p" Feb 19 15:28:00 crc kubenswrapper[4810]: I0219 15:28:00.892683 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bca24a94-16a8-4b5b-9d99-bc98919feb21-operator-scripts\") pod \"keystone-db-create-8fq2p\" (UID: \"bca24a94-16a8-4b5b-9d99-bc98919feb21\") " pod="openstack/keystone-db-create-8fq2p" Feb 19 15:28:00 crc kubenswrapper[4810]: I0219 15:28:00.893392 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 19 15:28:00 crc kubenswrapper[4810]: I0219 15:28:00.931147 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-002d-account-create-update-6kk29"] Feb 19 15:28:00 crc kubenswrapper[4810]: I0219 15:28:00.993678 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjs86\" (UniqueName: \"kubernetes.io/projected/c141144d-36a6-4c0c-b764-d7453c101ea3-kube-api-access-fjs86\") pod \"keystone-002d-account-create-update-6kk29\" (UID: \"c141144d-36a6-4c0c-b764-d7453c101ea3\") " pod="openstack/keystone-002d-account-create-update-6kk29" Feb 19 15:28:00 crc kubenswrapper[4810]: I0219 15:28:00.993772 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgx2j\" (UniqueName: \"kubernetes.io/projected/bca24a94-16a8-4b5b-9d99-bc98919feb21-kube-api-access-lgx2j\") pod \"keystone-db-create-8fq2p\" (UID: \"bca24a94-16a8-4b5b-9d99-bc98919feb21\") " pod="openstack/keystone-db-create-8fq2p" Feb 19 15:28:00 crc kubenswrapper[4810]: I0219 15:28:00.993823 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bca24a94-16a8-4b5b-9d99-bc98919feb21-operator-scripts\") pod \"keystone-db-create-8fq2p\" (UID: \"bca24a94-16a8-4b5b-9d99-bc98919feb21\") " pod="openstack/keystone-db-create-8fq2p" Feb 19 15:28:00 crc kubenswrapper[4810]: I0219 15:28:00.993846 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c141144d-36a6-4c0c-b764-d7453c101ea3-operator-scripts\") pod \"keystone-002d-account-create-update-6kk29\" (UID: \"c141144d-36a6-4c0c-b764-d7453c101ea3\") " pod="openstack/keystone-002d-account-create-update-6kk29" Feb 19 15:28:00 crc kubenswrapper[4810]: I0219 15:28:00.994894 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bca24a94-16a8-4b5b-9d99-bc98919feb21-operator-scripts\") pod \"keystone-db-create-8fq2p\" (UID: \"bca24a94-16a8-4b5b-9d99-bc98919feb21\") " pod="openstack/keystone-db-create-8fq2p" Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.013364 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgx2j\" (UniqueName: \"kubernetes.io/projected/bca24a94-16a8-4b5b-9d99-bc98919feb21-kube-api-access-lgx2j\") pod 
\"keystone-db-create-8fq2p\" (UID: \"bca24a94-16a8-4b5b-9d99-bc98919feb21\") " pod="openstack/keystone-db-create-8fq2p" Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.065612 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-hm7ql"] Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.081579 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-c263-account-create-update-wz7k6"] Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.081769 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hm7ql" Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.093003 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-hm7ql"] Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.093257 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c263-account-create-update-wz7k6" Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.114441 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c263-account-create-update-wz7k6"] Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.116711 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c141144d-36a6-4c0c-b764-d7453c101ea3-operator-scripts\") pod \"keystone-002d-account-create-update-6kk29\" (UID: \"c141144d-36a6-4c0c-b764-d7453c101ea3\") " pod="openstack/keystone-002d-account-create-update-6kk29" Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.116789 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b6040a1-1df6-44da-ba23-4c7b1ccf17b1-operator-scripts\") pod \"placement-db-create-hm7ql\" (UID: \"8b6040a1-1df6-44da-ba23-4c7b1ccf17b1\") " pod="openstack/placement-db-create-hm7ql" Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.117037 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjs86\" (UniqueName: \"kubernetes.io/projected/c141144d-36a6-4c0c-b764-d7453c101ea3-kube-api-access-fjs86\") pod \"keystone-002d-account-create-update-6kk29\" (UID: \"c141144d-36a6-4c0c-b764-d7453c101ea3\") " pod="openstack/keystone-002d-account-create-update-6kk29" Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.117143 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk4fc\" (UniqueName: \"kubernetes.io/projected/8b6040a1-1df6-44da-ba23-4c7b1ccf17b1-kube-api-access-fk4fc\") pod \"placement-db-create-hm7ql\" (UID: \"8b6040a1-1df6-44da-ba23-4c7b1ccf17b1\") " pod="openstack/placement-db-create-hm7ql" Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.117755 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c141144d-36a6-4c0c-b764-d7453c101ea3-operator-scripts\") pod \"keystone-002d-account-create-update-6kk29\" (UID: \"c141144d-36a6-4c0c-b764-d7453c101ea3\") " pod="openstack/keystone-002d-account-create-update-6kk29" Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.118619 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.141786 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fjs86\" (UniqueName: \"kubernetes.io/projected/c141144d-36a6-4c0c-b764-d7453c101ea3-kube-api-access-fjs86\") pod \"keystone-002d-account-create-update-6kk29\" (UID: \"c141144d-36a6-4c0c-b764-d7453c101ea3\") " pod="openstack/keystone-002d-account-create-update-6kk29" Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.187726 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-8fq2p" Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.213122 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-002d-account-create-update-6kk29" Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.220850 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk4fc\" (UniqueName: \"kubernetes.io/projected/8b6040a1-1df6-44da-ba23-4c7b1ccf17b1-kube-api-access-fk4fc\") pod \"placement-db-create-hm7ql\" (UID: \"8b6040a1-1df6-44da-ba23-4c7b1ccf17b1\") " pod="openstack/placement-db-create-hm7ql" Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.221656 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b6040a1-1df6-44da-ba23-4c7b1ccf17b1-operator-scripts\") pod \"placement-db-create-hm7ql\" (UID: \"8b6040a1-1df6-44da-ba23-4c7b1ccf17b1\") " pod="openstack/placement-db-create-hm7ql" Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.224090 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b6040a1-1df6-44da-ba23-4c7b1ccf17b1-operator-scripts\") pod \"placement-db-create-hm7ql\" (UID: \"8b6040a1-1df6-44da-ba23-4c7b1ccf17b1\") " pod="openstack/placement-db-create-hm7ql" Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.224148 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d8e48ca-d504-48a9-9e92-97651cd15d28-operator-scripts\") pod \"placement-c263-account-create-update-wz7k6\" (UID: \"9d8e48ca-d504-48a9-9e92-97651cd15d28\") " pod="openstack/placement-c263-account-create-update-wz7k6" Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.224432 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzq74\" (UniqueName: \"kubernetes.io/projected/9d8e48ca-d504-48a9-9e92-97651cd15d28-kube-api-access-nzq74\") pod \"placement-c263-account-create-update-wz7k6\" (UID: \"9d8e48ca-d504-48a9-9e92-97651cd15d28\") " pod="openstack/placement-c263-account-create-update-wz7k6" Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.251046 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk4fc\" (UniqueName: \"kubernetes.io/projected/8b6040a1-1df6-44da-ba23-4c7b1ccf17b1-kube-api-access-fk4fc\") pod \"placement-db-create-hm7ql\" (UID: \"8b6040a1-1df6-44da-ba23-4c7b1ccf17b1\") " pod="openstack/placement-db-create-hm7ql" Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.296700 4810 generic.go:334] "Generic (PLEG): container finished" podID="2222b6ca-79cd-48d7-b262-87e5cd4db6b1" containerID="9704b5e429194139e41b388c5c38c38e001c096f5c05263f386a9f8220160ce9" exitCode=0 Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.296778 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pd6hg" 
event={"ID":"2222b6ca-79cd-48d7-b262-87e5cd4db6b1","Type":"ContainerDied","Data":"9704b5e429194139e41b388c5c38c38e001c096f5c05263f386a9f8220160ce9"} Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.296803 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pd6hg" event={"ID":"2222b6ca-79cd-48d7-b262-87e5cd4db6b1","Type":"ContainerStarted","Data":"85c91ccb7dedab878f29dbfcb4d3c997a0045c38d5392bac957d809750393938"} Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.305445 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-79a4-account-create-update-mrm9x" event={"ID":"b4b22749-0497-48c2-b943-2c48aef05707","Type":"ContainerStarted","Data":"084e9f9fbe2e5f93a513bee51567c16a8dbaf61639fac09083f059ee237ae6a4"} Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.305516 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-79a4-account-create-update-mrm9x" event={"ID":"b4b22749-0497-48c2-b943-2c48aef05707","Type":"ContainerStarted","Data":"0e9ae7399f107576a8fdd2096ba7d6ad90d3f962b000b58e6b5164ee30012fd1"} Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.327124 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d8e48ca-d504-48a9-9e92-97651cd15d28-operator-scripts\") pod \"placement-c263-account-create-update-wz7k6\" (UID: \"9d8e48ca-d504-48a9-9e92-97651cd15d28\") " pod="openstack/placement-c263-account-create-update-wz7k6" Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.327236 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzq74\" (UniqueName: \"kubernetes.io/projected/9d8e48ca-d504-48a9-9e92-97651cd15d28-kube-api-access-nzq74\") pod \"placement-c263-account-create-update-wz7k6\" (UID: \"9d8e48ca-d504-48a9-9e92-97651cd15d28\") " pod="openstack/placement-c263-account-create-update-wz7k6" Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.327864 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d8e48ca-d504-48a9-9e92-97651cd15d28-operator-scripts\") pod \"placement-c263-account-create-update-wz7k6\" (UID: \"9d8e48ca-d504-48a9-9e92-97651cd15d28\") " pod="openstack/placement-c263-account-create-update-wz7k6" Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.338890 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-79a4-account-create-update-mrm9x" podStartSLOduration=2.338871862 podStartE2EDuration="2.338871862s" podCreationTimestamp="2026-02-19 15:27:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:28:01.335191082 +0000 UTC m=+1110.817221206" watchObservedRunningTime="2026-02-19 15:28:01.338871862 +0000 UTC m=+1110.820901986" Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.345775 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzq74\" (UniqueName: \"kubernetes.io/projected/9d8e48ca-d504-48a9-9e92-97651cd15d28-kube-api-access-nzq74\") pod \"placement-c263-account-create-update-wz7k6\" (UID: \"9d8e48ca-d504-48a9-9e92-97651cd15d28\") " pod="openstack/placement-c263-account-create-update-wz7k6" Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.398337 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-hm7ql" Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.436933 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c263-account-create-update-wz7k6" Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.744895 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-002d-account-create-update-6kk29"] Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.823893 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-8fq2p"] Feb 19 15:28:01 crc kubenswrapper[4810]: W0219 15:28:01.826567 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbca24a94_16a8_4b5b_9d99_bc98919feb21.slice/crio-2f15ac378d0660eacd4f50163aac8b0feb80fb1849a53563f3a697096f3125d7 WatchSource:0}: Error finding container 2f15ac378d0660eacd4f50163aac8b0feb80fb1849a53563f3a697096f3125d7: Status 404 returned error can't find the container with id 2f15ac378d0660eacd4f50163aac8b0feb80fb1849a53563f3a697096f3125d7 Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.918975 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-hm7ql"] Feb 19 15:28:01 crc kubenswrapper[4810]: W0219 15:28:01.922592 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b6040a1_1df6_44da_ba23_4c7b1ccf17b1.slice/crio-66d4f8540c8e3cd863482a260aa47693da07db5108ec65200fcc7ea0ffcf0e79 WatchSource:0}: Error finding container 66d4f8540c8e3cd863482a260aa47693da07db5108ec65200fcc7ea0ffcf0e79: Status 404 returned error can't find the container with id 66d4f8540c8e3cd863482a260aa47693da07db5108ec65200fcc7ea0ffcf0e79 Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.986026 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-ffjjc"] Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.987715 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-ffjjc" Feb 19 15:28:01 crc kubenswrapper[4810]: I0219 15:28:01.994776 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-ffjjc"] Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.003516 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c263-account-create-update-wz7k6"] Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.038643 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21eb5702-ca94-449c-839c-e3970593417d-operator-scripts\") pod \"watcher-db-create-ffjjc\" (UID: \"21eb5702-ca94-449c-839c-e3970593417d\") " pod="openstack/watcher-db-create-ffjjc" Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.038691 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2frb\" (UniqueName: \"kubernetes.io/projected/21eb5702-ca94-449c-839c-e3970593417d-kube-api-access-q2frb\") pod \"watcher-db-create-ffjjc\" (UID: \"21eb5702-ca94-449c-839c-e3970593417d\") " pod="openstack/watcher-db-create-ffjjc" Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.066562 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-a42d-account-create-update-l2kgw"] Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.067574 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-a42d-account-create-update-l2kgw" Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.069753 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.078816 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-a42d-account-create-update-l2kgw"] Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.140110 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-522qg\" (UniqueName: \"kubernetes.io/projected/7f791f64-69f8-448d-8370-aeef0db30071-kube-api-access-522qg\") pod \"watcher-a42d-account-create-update-l2kgw\" (UID: \"7f791f64-69f8-448d-8370-aeef0db30071\") " pod="openstack/watcher-a42d-account-create-update-l2kgw" Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.140153 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21eb5702-ca94-449c-839c-e3970593417d-operator-scripts\") pod \"watcher-db-create-ffjjc\" (UID: \"21eb5702-ca94-449c-839c-e3970593417d\") " pod="openstack/watcher-db-create-ffjjc" Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.140184 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2frb\" (UniqueName: \"kubernetes.io/projected/21eb5702-ca94-449c-839c-e3970593417d-kube-api-access-q2frb\") pod \"watcher-db-create-ffjjc\" (UID: \"21eb5702-ca94-449c-839c-e3970593417d\") " pod="openstack/watcher-db-create-ffjjc" Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.140258 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f791f64-69f8-448d-8370-aeef0db30071-operator-scripts\") pod \"watcher-a42d-account-create-update-l2kgw\" (UID: \"7f791f64-69f8-448d-8370-aeef0db30071\") " 
pod="openstack/watcher-a42d-account-create-update-l2kgw" Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.141313 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21eb5702-ca94-449c-839c-e3970593417d-operator-scripts\") pod \"watcher-db-create-ffjjc\" (UID: \"21eb5702-ca94-449c-839c-e3970593417d\") " pod="openstack/watcher-db-create-ffjjc" Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.163445 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2frb\" (UniqueName: \"kubernetes.io/projected/21eb5702-ca94-449c-839c-e3970593417d-kube-api-access-q2frb\") pod \"watcher-db-create-ffjjc\" (UID: \"21eb5702-ca94-449c-839c-e3970593417d\") " pod="openstack/watcher-db-create-ffjjc" Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.241665 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f791f64-69f8-448d-8370-aeef0db30071-operator-scripts\") pod \"watcher-a42d-account-create-update-l2kgw\" (UID: \"7f791f64-69f8-448d-8370-aeef0db30071\") " pod="openstack/watcher-a42d-account-create-update-l2kgw" Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.241769 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-522qg\" (UniqueName: \"kubernetes.io/projected/7f791f64-69f8-448d-8370-aeef0db30071-kube-api-access-522qg\") pod \"watcher-a42d-account-create-update-l2kgw\" (UID: \"7f791f64-69f8-448d-8370-aeef0db30071\") " pod="openstack/watcher-a42d-account-create-update-l2kgw" Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.243684 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f791f64-69f8-448d-8370-aeef0db30071-operator-scripts\") pod \"watcher-a42d-account-create-update-l2kgw\" (UID: \"7f791f64-69f8-448d-8370-aeef0db30071\") " pod="openstack/watcher-a42d-account-create-update-l2kgw" Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.259023 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-522qg\" (UniqueName: \"kubernetes.io/projected/7f791f64-69f8-448d-8370-aeef0db30071-kube-api-access-522qg\") pod \"watcher-a42d-account-create-update-l2kgw\" (UID: \"7f791f64-69f8-448d-8370-aeef0db30071\") " pod="openstack/watcher-a42d-account-create-update-l2kgw" Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.315513 4810 generic.go:334] "Generic (PLEG): container finished" podID="b4b22749-0497-48c2-b943-2c48aef05707" containerID="084e9f9fbe2e5f93a513bee51567c16a8dbaf61639fac09083f059ee237ae6a4" exitCode=0 Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.315597 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-79a4-account-create-update-mrm9x" event={"ID":"b4b22749-0497-48c2-b943-2c48aef05707","Type":"ContainerDied","Data":"084e9f9fbe2e5f93a513bee51567c16a8dbaf61639fac09083f059ee237ae6a4"} Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.319702 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c263-account-create-update-wz7k6" event={"ID":"9d8e48ca-d504-48a9-9e92-97651cd15d28","Type":"ContainerStarted","Data":"683b765d3388918ee0690173c641c6f414e8fc77c164afb4ab566f37723b326b"} Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.320187 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-c263-account-create-update-wz7k6" event={"ID":"9d8e48ca-d504-48a9-9e92-97651cd15d28","Type":"ContainerStarted","Data":"dc2b8768ce86d039d1a4f235de22e0f1357ab21c7bcf98cf5982179ca5a156ce"} Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.322275 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8fq2p" event={"ID":"bca24a94-16a8-4b5b-9d99-bc98919feb21","Type":"ContainerStarted","Data":"78c0fb5a6a2ddab1d7b49b378f905fccf1b07a5af8d34ee0f62b947801682e49"} Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.322305 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8fq2p" event={"ID":"bca24a94-16a8-4b5b-9d99-bc98919feb21","Type":"ContainerStarted","Data":"2f15ac378d0660eacd4f50163aac8b0feb80fb1849a53563f3a697096f3125d7"} Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.324409 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hm7ql" event={"ID":"8b6040a1-1df6-44da-ba23-4c7b1ccf17b1","Type":"ContainerStarted","Data":"72214a8edd0c54f3823201969c4eb1d1b241f1f9c89ed676fa59ee81e422993e"} Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.324451 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hm7ql" event={"ID":"8b6040a1-1df6-44da-ba23-4c7b1ccf17b1","Type":"ContainerStarted","Data":"66d4f8540c8e3cd863482a260aa47693da07db5108ec65200fcc7ea0ffcf0e79"} Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.326715 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-002d-account-create-update-6kk29" event={"ID":"c141144d-36a6-4c0c-b764-d7453c101ea3","Type":"ContainerStarted","Data":"7503cb280210459a8b150bda1b5d65c5f4d10619291800104ee52fa8927bdb82"} Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.326767 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-002d-account-create-update-6kk29" event={"ID":"c141144d-36a6-4c0c-b764-d7453c101ea3","Type":"ContainerStarted","Data":"0afa87d29255170cab02cb057e707fc9c3faa4ab5535334cc3ac3057731edf16"} Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.340938 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-ffjjc" Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.349372 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-c263-account-create-update-wz7k6" podStartSLOduration=1.349348697 podStartE2EDuration="1.349348697s" podCreationTimestamp="2026-02-19 15:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:28:02.346161819 +0000 UTC m=+1111.828191943" watchObservedRunningTime="2026-02-19 15:28:02.349348697 +0000 UTC m=+1111.831378821" Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.364418 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-hm7ql" podStartSLOduration=1.364397125 podStartE2EDuration="1.364397125s" podCreationTimestamp="2026-02-19 15:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:28:02.360603083 +0000 UTC m=+1111.842633207" watchObservedRunningTime="2026-02-19 15:28:02.364397125 +0000 UTC m=+1111.846427249" Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.379996 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-002d-account-create-update-6kk29" podStartSLOduration=2.379978047 podStartE2EDuration="2.379978047s" podCreationTimestamp="2026-02-19 15:28:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:28:02.374729128 +0000 UTC m=+1111.856759252" watchObservedRunningTime="2026-02-19 15:28:02.379978047 +0000 UTC m=+1111.862008171" Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.393086 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-a42d-account-create-update-l2kgw" Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.398665 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-8fq2p" podStartSLOduration=2.398647754 podStartE2EDuration="2.398647754s" podCreationTimestamp="2026-02-19 15:28:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:28:02.394042021 +0000 UTC m=+1111.876072145" watchObservedRunningTime="2026-02-19 15:28:02.398647754 +0000 UTC m=+1111.880677878" Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.767583 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-pd6hg" Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.854631 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6g2w\" (UniqueName: \"kubernetes.io/projected/2222b6ca-79cd-48d7-b262-87e5cd4db6b1-kube-api-access-f6g2w\") pod \"2222b6ca-79cd-48d7-b262-87e5cd4db6b1\" (UID: \"2222b6ca-79cd-48d7-b262-87e5cd4db6b1\") " Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.855009 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2222b6ca-79cd-48d7-b262-87e5cd4db6b1-operator-scripts\") pod \"2222b6ca-79cd-48d7-b262-87e5cd4db6b1\" (UID: \"2222b6ca-79cd-48d7-b262-87e5cd4db6b1\") " Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.855799 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2222b6ca-79cd-48d7-b262-87e5cd4db6b1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2222b6ca-79cd-48d7-b262-87e5cd4db6b1" (UID: "2222b6ca-79cd-48d7-b262-87e5cd4db6b1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.880013 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2222b6ca-79cd-48d7-b262-87e5cd4db6b1-kube-api-access-f6g2w" (OuterVolumeSpecName: "kube-api-access-f6g2w") pod "2222b6ca-79cd-48d7-b262-87e5cd4db6b1" (UID: "2222b6ca-79cd-48d7-b262-87e5cd4db6b1"). InnerVolumeSpecName "kube-api-access-f6g2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.940743 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-ffjjc"] Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.956998 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6g2w\" (UniqueName: \"kubernetes.io/projected/2222b6ca-79cd-48d7-b262-87e5cd4db6b1-kube-api-access-f6g2w\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:02 crc kubenswrapper[4810]: I0219 15:28:02.957027 4810 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2222b6ca-79cd-48d7-b262-87e5cd4db6b1-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.032167 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-z6ghl"] Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.039659 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-z6ghl"] Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.074878 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-a42d-account-create-update-l2kgw"] Feb 19 15:28:03 crc kubenswrapper[4810]: W0219 15:28:03.078752 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f791f64_69f8_448d_8370_aeef0db30071.slice/crio-7a67cdf1df2baa9907f5e2da0518cbfe07eb77699db82e7ea4f435cb906f9c75 WatchSource:0}: Error finding container 7a67cdf1df2baa9907f5e2da0518cbfe07eb77699db82e7ea4f435cb906f9c75: Status 404 returned error can't find the container with id 7a67cdf1df2baa9907f5e2da0518cbfe07eb77699db82e7ea4f435cb906f9c75 Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.303346 4810 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.335666 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pd6hg" event={"ID":"2222b6ca-79cd-48d7-b262-87e5cd4db6b1","Type":"ContainerDied","Data":"85c91ccb7dedab878f29dbfcb4d3c997a0045c38d5392bac957d809750393938"} Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.335703 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85c91ccb7dedab878f29dbfcb4d3c997a0045c38d5392bac957d809750393938" Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.335719 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pd6hg" Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.340799 4810 generic.go:334] "Generic (PLEG): container finished" podID="c141144d-36a6-4c0c-b764-d7453c101ea3" containerID="7503cb280210459a8b150bda1b5d65c5f4d10619291800104ee52fa8927bdb82" exitCode=0 Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.340844 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-002d-account-create-update-6kk29" event={"ID":"c141144d-36a6-4c0c-b764-d7453c101ea3","Type":"ContainerDied","Data":"7503cb280210459a8b150bda1b5d65c5f4d10619291800104ee52fa8927bdb82"} Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.342253 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-a42d-account-create-update-l2kgw" event={"ID":"7f791f64-69f8-448d-8370-aeef0db30071","Type":"ContainerStarted","Data":"e9bec1a534d25c6a14471d437a183afe877fd56061c47642147a9b763a2c6190"} Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.342302 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-a42d-account-create-update-l2kgw" event={"ID":"7f791f64-69f8-448d-8370-aeef0db30071","Type":"ContainerStarted","Data":"7a67cdf1df2baa9907f5e2da0518cbfe07eb77699db82e7ea4f435cb906f9c75"} Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.348146 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-ffjjc" event={"ID":"21eb5702-ca94-449c-839c-e3970593417d","Type":"ContainerStarted","Data":"ffa1569c8787f552547599568a5b882194ae4226f8c9d82766a9a36d606eb91a"} Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.348204 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-ffjjc" event={"ID":"21eb5702-ca94-449c-839c-e3970593417d","Type":"ContainerStarted","Data":"a4e1ddd0bcf56dcdcf7a8d77529bbac7c3a7ecc8e18e47545f5e7f391f76a850"} Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.349964 4810 generic.go:334] "Generic (PLEG): container finished" podID="9d8e48ca-d504-48a9-9e92-97651cd15d28" containerID="683b765d3388918ee0690173c641c6f414e8fc77c164afb4ab566f37723b326b" exitCode=0 Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.350040 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c263-account-create-update-wz7k6" event={"ID":"9d8e48ca-d504-48a9-9e92-97651cd15d28","Type":"ContainerDied","Data":"683b765d3388918ee0690173c641c6f414e8fc77c164afb4ab566f37723b326b"} Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.356882 4810 generic.go:334] "Generic (PLEG): container finished" podID="bca24a94-16a8-4b5b-9d99-bc98919feb21" containerID="78c0fb5a6a2ddab1d7b49b378f905fccf1b07a5af8d34ee0f62b947801682e49" exitCode=0 Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.356975 
4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8fq2p" event={"ID":"bca24a94-16a8-4b5b-9d99-bc98919feb21","Type":"ContainerDied","Data":"78c0fb5a6a2ddab1d7b49b378f905fccf1b07a5af8d34ee0f62b947801682e49"} Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.358473 4810 generic.go:334] "Generic (PLEG): container finished" podID="8b6040a1-1df6-44da-ba23-4c7b1ccf17b1" containerID="72214a8edd0c54f3823201969c4eb1d1b241f1f9c89ed676fa59ee81e422993e" exitCode=0 Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.358682 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hm7ql" event={"ID":"8b6040a1-1df6-44da-ba23-4c7b1ccf17b1","Type":"ContainerDied","Data":"72214a8edd0c54f3823201969c4eb1d1b241f1f9c89ed676fa59ee81e422993e"} Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.380652 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-create-ffjjc" podStartSLOduration=2.380631391 podStartE2EDuration="2.380631391s" podCreationTimestamp="2026-02-19 15:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:28:03.375369372 +0000 UTC m=+1112.857399496" watchObservedRunningTime="2026-02-19 15:28:03.380631391 +0000 UTC m=+1112.862661515" Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.438187 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-a42d-account-create-update-l2kgw" podStartSLOduration=1.43817054 podStartE2EDuration="1.43817054s" podCreationTimestamp="2026-02-19 15:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:28:03.436998811 +0000 UTC m=+1112.919028935" watchObservedRunningTime="2026-02-19 15:28:03.43817054 +0000 UTC m=+1112.920200664" Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.449210 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cf99137-c194-466c-b92b-fbda63f4b3d5" path="/var/lib/kubelet/pods/0cf99137-c194-466c-b92b-fbda63f4b3d5/volumes" Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.621290 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-79a4-account-create-update-mrm9x" Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.679481 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4b22749-0497-48c2-b943-2c48aef05707-operator-scripts\") pod \"b4b22749-0497-48c2-b943-2c48aef05707\" (UID: \"b4b22749-0497-48c2-b943-2c48aef05707\") " Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.679589 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cpfn\" (UniqueName: \"kubernetes.io/projected/b4b22749-0497-48c2-b943-2c48aef05707-kube-api-access-6cpfn\") pod \"b4b22749-0497-48c2-b943-2c48aef05707\" (UID: \"b4b22749-0497-48c2-b943-2c48aef05707\") " Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.681080 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4b22749-0497-48c2-b943-2c48aef05707-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b4b22749-0497-48c2-b943-2c48aef05707" (UID: "b4b22749-0497-48c2-b943-2c48aef05707"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.685559 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4b22749-0497-48c2-b943-2c48aef05707-kube-api-access-6cpfn" (OuterVolumeSpecName: "kube-api-access-6cpfn") pod "b4b22749-0497-48c2-b943-2c48aef05707" (UID: "b4b22749-0497-48c2-b943-2c48aef05707"). InnerVolumeSpecName "kube-api-access-6cpfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.781734 4810 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4b22749-0497-48c2-b943-2c48aef05707-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:03 crc kubenswrapper[4810]: I0219 15:28:03.781987 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cpfn\" (UniqueName: \"kubernetes.io/projected/b4b22749-0497-48c2-b943-2c48aef05707-kube-api-access-6cpfn\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:04 crc kubenswrapper[4810]: I0219 15:28:04.374082 4810 generic.go:334] "Generic (PLEG): container finished" podID="7f791f64-69f8-448d-8370-aeef0db30071" containerID="e9bec1a534d25c6a14471d437a183afe877fd56061c47642147a9b763a2c6190" exitCode=0 Feb 19 15:28:04 crc kubenswrapper[4810]: I0219 15:28:04.374182 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-a42d-account-create-update-l2kgw" event={"ID":"7f791f64-69f8-448d-8370-aeef0db30071","Type":"ContainerDied","Data":"e9bec1a534d25c6a14471d437a183afe877fd56061c47642147a9b763a2c6190"} Feb 19 15:28:04 crc kubenswrapper[4810]: I0219 15:28:04.381268 4810 generic.go:334] "Generic (PLEG): container finished" podID="21eb5702-ca94-449c-839c-e3970593417d" containerID="ffa1569c8787f552547599568a5b882194ae4226f8c9d82766a9a36d606eb91a" exitCode=0 Feb 19 15:28:04 crc kubenswrapper[4810]: I0219 15:28:04.381366 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-ffjjc" event={"ID":"21eb5702-ca94-449c-839c-e3970593417d","Type":"ContainerDied","Data":"ffa1569c8787f552547599568a5b882194ae4226f8c9d82766a9a36d606eb91a"} Feb 19 15:28:04 crc kubenswrapper[4810]: I0219 15:28:04.384349 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-79a4-account-create-update-mrm9x" Feb 19 15:28:04 crc kubenswrapper[4810]: I0219 15:28:04.384438 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-79a4-account-create-update-mrm9x" event={"ID":"b4b22749-0497-48c2-b943-2c48aef05707","Type":"ContainerDied","Data":"0e9ae7399f107576a8fdd2096ba7d6ad90d3f962b000b58e6b5164ee30012fd1"} Feb 19 15:28:04 crc kubenswrapper[4810]: I0219 15:28:04.384482 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e9ae7399f107576a8fdd2096ba7d6ad90d3f962b000b58e6b5164ee30012fd1" Feb 19 15:28:04 crc kubenswrapper[4810]: I0219 15:28:04.821842 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-8fq2p" Feb 19 15:28:04 crc kubenswrapper[4810]: I0219 15:28:04.898731 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bca24a94-16a8-4b5b-9d99-bc98919feb21-operator-scripts\") pod \"bca24a94-16a8-4b5b-9d99-bc98919feb21\" (UID: \"bca24a94-16a8-4b5b-9d99-bc98919feb21\") " Feb 19 15:28:04 crc kubenswrapper[4810]: I0219 15:28:04.898781 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgx2j\" (UniqueName: \"kubernetes.io/projected/bca24a94-16a8-4b5b-9d99-bc98919feb21-kube-api-access-lgx2j\") pod \"bca24a94-16a8-4b5b-9d99-bc98919feb21\" (UID: \"bca24a94-16a8-4b5b-9d99-bc98919feb21\") " Feb 19 15:28:04 crc kubenswrapper[4810]: I0219 15:28:04.900535 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bca24a94-16a8-4b5b-9d99-bc98919feb21-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bca24a94-16a8-4b5b-9d99-bc98919feb21" (UID: "bca24a94-16a8-4b5b-9d99-bc98919feb21"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:04 crc kubenswrapper[4810]: I0219 15:28:04.904245 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bca24a94-16a8-4b5b-9d99-bc98919feb21-kube-api-access-lgx2j" (OuterVolumeSpecName: "kube-api-access-lgx2j") pod "bca24a94-16a8-4b5b-9d99-bc98919feb21" (UID: "bca24a94-16a8-4b5b-9d99-bc98919feb21"). InnerVolumeSpecName "kube-api-access-lgx2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:28:04 crc kubenswrapper[4810]: I0219 15:28:04.961550 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c263-account-create-update-wz7k6" Feb 19 15:28:04 crc kubenswrapper[4810]: I0219 15:28:04.966914 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-002d-account-create-update-6kk29" Feb 19 15:28:04 crc kubenswrapper[4810]: I0219 15:28:04.976230 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-hm7ql" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.000999 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d8e48ca-d504-48a9-9e92-97651cd15d28-operator-scripts\") pod \"9d8e48ca-d504-48a9-9e92-97651cd15d28\" (UID: \"9d8e48ca-d504-48a9-9e92-97651cd15d28\") " Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.001053 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjs86\" (UniqueName: \"kubernetes.io/projected/c141144d-36a6-4c0c-b764-d7453c101ea3-kube-api-access-fjs86\") pod \"c141144d-36a6-4c0c-b764-d7453c101ea3\" (UID: \"c141144d-36a6-4c0c-b764-d7453c101ea3\") " Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.001115 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b6040a1-1df6-44da-ba23-4c7b1ccf17b1-operator-scripts\") pod \"8b6040a1-1df6-44da-ba23-4c7b1ccf17b1\" (UID: \"8b6040a1-1df6-44da-ba23-4c7b1ccf17b1\") " Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.001163 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fk4fc\" (UniqueName: \"kubernetes.io/projected/8b6040a1-1df6-44da-ba23-4c7b1ccf17b1-kube-api-access-fk4fc\") pod \"8b6040a1-1df6-44da-ba23-4c7b1ccf17b1\" (UID: \"8b6040a1-1df6-44da-ba23-4c7b1ccf17b1\") " Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.001203 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzq74\" (UniqueName: \"kubernetes.io/projected/9d8e48ca-d504-48a9-9e92-97651cd15d28-kube-api-access-nzq74\") pod \"9d8e48ca-d504-48a9-9e92-97651cd15d28\" (UID: \"9d8e48ca-d504-48a9-9e92-97651cd15d28\") " Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.001320 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c141144d-36a6-4c0c-b764-d7453c101ea3-operator-scripts\") pod \"c141144d-36a6-4c0c-b764-d7453c101ea3\" (UID: \"c141144d-36a6-4c0c-b764-d7453c101ea3\") " Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.001778 4810 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bca24a94-16a8-4b5b-9d99-bc98919feb21-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.001801 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgx2j\" (UniqueName: \"kubernetes.io/projected/bca24a94-16a8-4b5b-9d99-bc98919feb21-kube-api-access-lgx2j\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.002287 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c141144d-36a6-4c0c-b764-d7453c101ea3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c141144d-36a6-4c0c-b764-d7453c101ea3" (UID: "c141144d-36a6-4c0c-b764-d7453c101ea3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.004419 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d8e48ca-d504-48a9-9e92-97651cd15d28-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9d8e48ca-d504-48a9-9e92-97651cd15d28" (UID: "9d8e48ca-d504-48a9-9e92-97651cd15d28"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.005736 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b6040a1-1df6-44da-ba23-4c7b1ccf17b1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8b6040a1-1df6-44da-ba23-4c7b1ccf17b1" (UID: "8b6040a1-1df6-44da-ba23-4c7b1ccf17b1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.011595 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c141144d-36a6-4c0c-b764-d7453c101ea3-kube-api-access-fjs86" (OuterVolumeSpecName: "kube-api-access-fjs86") pod "c141144d-36a6-4c0c-b764-d7453c101ea3" (UID: "c141144d-36a6-4c0c-b764-d7453c101ea3"). InnerVolumeSpecName "kube-api-access-fjs86". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.011726 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b6040a1-1df6-44da-ba23-4c7b1ccf17b1-kube-api-access-fk4fc" (OuterVolumeSpecName: "kube-api-access-fk4fc") pod "8b6040a1-1df6-44da-ba23-4c7b1ccf17b1" (UID: "8b6040a1-1df6-44da-ba23-4c7b1ccf17b1"). InnerVolumeSpecName "kube-api-access-fk4fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.011620 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d8e48ca-d504-48a9-9e92-97651cd15d28-kube-api-access-nzq74" (OuterVolumeSpecName: "kube-api-access-nzq74") pod "9d8e48ca-d504-48a9-9e92-97651cd15d28" (UID: "9d8e48ca-d504-48a9-9e92-97651cd15d28"). InnerVolumeSpecName "kube-api-access-nzq74". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.078732 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-kpf4t"] Feb 19 15:28:05 crc kubenswrapper[4810]: E0219 15:28:05.079230 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b6040a1-1df6-44da-ba23-4c7b1ccf17b1" containerName="mariadb-database-create" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.079254 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b6040a1-1df6-44da-ba23-4c7b1ccf17b1" containerName="mariadb-database-create" Feb 19 15:28:05 crc kubenswrapper[4810]: E0219 15:28:05.079270 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d8e48ca-d504-48a9-9e92-97651cd15d28" containerName="mariadb-account-create-update" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.079277 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d8e48ca-d504-48a9-9e92-97651cd15d28" containerName="mariadb-account-create-update" Feb 19 15:28:05 crc kubenswrapper[4810]: E0219 15:28:05.079295 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b22749-0497-48c2-b943-2c48aef05707" containerName="mariadb-account-create-update" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.079303 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b22749-0497-48c2-b943-2c48aef05707" containerName="mariadb-account-create-update" Feb 19 15:28:05 crc kubenswrapper[4810]: E0219 15:28:05.079347 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c141144d-36a6-4c0c-b764-d7453c101ea3" containerName="mariadb-account-create-update" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.079358 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c141144d-36a6-4c0c-b764-d7453c101ea3" containerName="mariadb-account-create-update" Feb 19 15:28:05 crc kubenswrapper[4810]: E0219 15:28:05.079374 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2222b6ca-79cd-48d7-b262-87e5cd4db6b1" containerName="mariadb-database-create" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.079383 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2222b6ca-79cd-48d7-b262-87e5cd4db6b1" containerName="mariadb-database-create" Feb 19 15:28:05 crc kubenswrapper[4810]: E0219 15:28:05.079396 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bca24a94-16a8-4b5b-9d99-bc98919feb21" containerName="mariadb-database-create" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.079403 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="bca24a94-16a8-4b5b-9d99-bc98919feb21" containerName="mariadb-database-create" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.079617 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b22749-0497-48c2-b943-2c48aef05707" containerName="mariadb-account-create-update" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.079644 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b6040a1-1df6-44da-ba23-4c7b1ccf17b1" containerName="mariadb-database-create" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.079656 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c141144d-36a6-4c0c-b764-d7453c101ea3" containerName="mariadb-account-create-update" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.079675 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d8e48ca-d504-48a9-9e92-97651cd15d28" 
containerName="mariadb-account-create-update" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.079687 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="2222b6ca-79cd-48d7-b262-87e5cd4db6b1" containerName="mariadb-database-create" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.079706 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="bca24a94-16a8-4b5b-9d99-bc98919feb21" containerName="mariadb-database-create" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.080436 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-kpf4t" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.082969 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.083262 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lrdct" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.098164 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-kpf4t"] Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.103135 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31093793-65b6-467c-8d5b-218e108fd330-config-data\") pod \"glance-db-sync-kpf4t\" (UID: \"31093793-65b6-467c-8d5b-218e108fd330\") " pod="openstack/glance-db-sync-kpf4t" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.103181 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5tqp\" (UniqueName: \"kubernetes.io/projected/31093793-65b6-467c-8d5b-218e108fd330-kube-api-access-h5tqp\") pod \"glance-db-sync-kpf4t\" (UID: \"31093793-65b6-467c-8d5b-218e108fd330\") " pod="openstack/glance-db-sync-kpf4t" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.103260 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/31093793-65b6-467c-8d5b-218e108fd330-db-sync-config-data\") pod \"glance-db-sync-kpf4t\" (UID: \"31093793-65b6-467c-8d5b-218e108fd330\") " pod="openstack/glance-db-sync-kpf4t" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.103292 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31093793-65b6-467c-8d5b-218e108fd330-combined-ca-bundle\") pod \"glance-db-sync-kpf4t\" (UID: \"31093793-65b6-467c-8d5b-218e108fd330\") " pod="openstack/glance-db-sync-kpf4t" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.103366 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/37e2af25-5b30-4fb9-801e-f4a84d665540-etc-swift\") pod \"swift-storage-0\" (UID: \"37e2af25-5b30-4fb9-801e-f4a84d665540\") " pod="openstack/swift-storage-0" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.103414 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjs86\" (UniqueName: \"kubernetes.io/projected/c141144d-36a6-4c0c-b764-d7453c101ea3-kube-api-access-fjs86\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.103425 4810 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/8b6040a1-1df6-44da-ba23-4c7b1ccf17b1-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.103434 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fk4fc\" (UniqueName: \"kubernetes.io/projected/8b6040a1-1df6-44da-ba23-4c7b1ccf17b1-kube-api-access-fk4fc\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.103444 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzq74\" (UniqueName: \"kubernetes.io/projected/9d8e48ca-d504-48a9-9e92-97651cd15d28-kube-api-access-nzq74\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.103452 4810 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c141144d-36a6-4c0c-b764-d7453c101ea3-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.103514 4810 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d8e48ca-d504-48a9-9e92-97651cd15d28-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.110027 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/37e2af25-5b30-4fb9-801e-f4a84d665540-etc-swift\") pod \"swift-storage-0\" (UID: \"37e2af25-5b30-4fb9-801e-f4a84d665540\") " pod="openstack/swift-storage-0" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.204678 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31093793-65b6-467c-8d5b-218e108fd330-config-data\") pod \"glance-db-sync-kpf4t\" (UID: \"31093793-65b6-467c-8d5b-218e108fd330\") " pod="openstack/glance-db-sync-kpf4t" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.205027 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5tqp\" (UniqueName: \"kubernetes.io/projected/31093793-65b6-467c-8d5b-218e108fd330-kube-api-access-h5tqp\") pod \"glance-db-sync-kpf4t\" (UID: \"31093793-65b6-467c-8d5b-218e108fd330\") " pod="openstack/glance-db-sync-kpf4t" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.205173 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/31093793-65b6-467c-8d5b-218e108fd330-db-sync-config-data\") pod \"glance-db-sync-kpf4t\" (UID: \"31093793-65b6-467c-8d5b-218e108fd330\") " pod="openstack/glance-db-sync-kpf4t" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.205259 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31093793-65b6-467c-8d5b-218e108fd330-combined-ca-bundle\") pod \"glance-db-sync-kpf4t\" (UID: \"31093793-65b6-467c-8d5b-218e108fd330\") " pod="openstack/glance-db-sync-kpf4t" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.207709 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31093793-65b6-467c-8d5b-218e108fd330-config-data\") pod \"glance-db-sync-kpf4t\" (UID: \"31093793-65b6-467c-8d5b-218e108fd330\") " pod="openstack/glance-db-sync-kpf4t" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.207782 4810 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/31093793-65b6-467c-8d5b-218e108fd330-db-sync-config-data\") pod \"glance-db-sync-kpf4t\" (UID: \"31093793-65b6-467c-8d5b-218e108fd330\") " pod="openstack/glance-db-sync-kpf4t" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.213543 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31093793-65b6-467c-8d5b-218e108fd330-combined-ca-bundle\") pod \"glance-db-sync-kpf4t\" (UID: \"31093793-65b6-467c-8d5b-218e108fd330\") " pod="openstack/glance-db-sync-kpf4t" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.220611 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5tqp\" (UniqueName: \"kubernetes.io/projected/31093793-65b6-467c-8d5b-218e108fd330-kube-api-access-h5tqp\") pod \"glance-db-sync-kpf4t\" (UID: \"31093793-65b6-467c-8d5b-218e108fd330\") " pod="openstack/glance-db-sync-kpf4t" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.345448 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-s5488" podUID="4a4fa57b-aa00-4866-b31e-df29f7f86480" containerName="ovn-controller" probeResult="failure" output=< Feb 19 15:28:05 crc kubenswrapper[4810]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 19 15:28:05 crc kubenswrapper[4810]: > Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.392012 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.402295 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-kpf4t" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.425231 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-8fq2p" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.425244 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8fq2p" event={"ID":"bca24a94-16a8-4b5b-9d99-bc98919feb21","Type":"ContainerDied","Data":"2f15ac378d0660eacd4f50163aac8b0feb80fb1849a53563f3a697096f3125d7"} Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.425309 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f15ac378d0660eacd4f50163aac8b0feb80fb1849a53563f3a697096f3125d7" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.427351 4810 generic.go:334] "Generic (PLEG): container finished" podID="4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c" containerID="781b07acf23d18cc10631b3a01aa0eb27d1e62e7e3cfc8db109ba3d58b915ff0" exitCode=0 Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.427394 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c","Type":"ContainerDied","Data":"781b07acf23d18cc10631b3a01aa0eb27d1e62e7e3cfc8db109ba3d58b915ff0"} Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.430715 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hm7ql" event={"ID":"8b6040a1-1df6-44da-ba23-4c7b1ccf17b1","Type":"ContainerDied","Data":"66d4f8540c8e3cd863482a260aa47693da07db5108ec65200fcc7ea0ffcf0e79"} Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.430749 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66d4f8540c8e3cd863482a260aa47693da07db5108ec65200fcc7ea0ffcf0e79" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.430805 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hm7ql" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.434134 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-002d-account-create-update-6kk29" event={"ID":"c141144d-36a6-4c0c-b764-d7453c101ea3","Type":"ContainerDied","Data":"0afa87d29255170cab02cb057e707fc9c3faa4ab5535334cc3ac3057731edf16"} Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.434181 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0afa87d29255170cab02cb057e707fc9c3faa4ab5535334cc3ac3057731edf16" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.434196 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-002d-account-create-update-6kk29" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.436134 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c263-account-create-update-wz7k6" event={"ID":"9d8e48ca-d504-48a9-9e92-97651cd15d28","Type":"ContainerDied","Data":"dc2b8768ce86d039d1a4f235de22e0f1357ab21c7bcf98cf5982179ca5a156ce"} Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.436161 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-c263-account-create-update-wz7k6" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.436204 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc2b8768ce86d039d1a4f235de22e0f1357ab21c7bcf98cf5982179ca5a156ce" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.437968 4810 generic.go:334] "Generic (PLEG): container finished" podID="2a3676ed-f06f-4dea-82a1-959716331113" containerID="d8e4585cb9bc557ddc8d8dc697ba266361d794bec388e11f6081c9ec077a1726" exitCode=0 Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.438081 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2a3676ed-f06f-4dea-82a1-959716331113","Type":"ContainerDied","Data":"d8e4585cb9bc557ddc8d8dc697ba266361d794bec388e11f6081c9ec077a1726"} Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.850255 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-ffjjc" Feb 19 15:28:05 crc kubenswrapper[4810]: I0219 15:28:05.931174 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-a42d-account-create-update-l2kgw" Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.024515 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21eb5702-ca94-449c-839c-e3970593417d-operator-scripts\") pod \"21eb5702-ca94-449c-839c-e3970593417d\" (UID: \"21eb5702-ca94-449c-839c-e3970593417d\") " Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.024557 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f791f64-69f8-448d-8370-aeef0db30071-operator-scripts\") pod \"7f791f64-69f8-448d-8370-aeef0db30071\" (UID: \"7f791f64-69f8-448d-8370-aeef0db30071\") " Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.024679 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-522qg\" (UniqueName: \"kubernetes.io/projected/7f791f64-69f8-448d-8370-aeef0db30071-kube-api-access-522qg\") pod \"7f791f64-69f8-448d-8370-aeef0db30071\" (UID: \"7f791f64-69f8-448d-8370-aeef0db30071\") " Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.024900 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2frb\" (UniqueName: \"kubernetes.io/projected/21eb5702-ca94-449c-839c-e3970593417d-kube-api-access-q2frb\") pod \"21eb5702-ca94-449c-839c-e3970593417d\" (UID: \"21eb5702-ca94-449c-839c-e3970593417d\") " Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.025669 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f791f64-69f8-448d-8370-aeef0db30071-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7f791f64-69f8-448d-8370-aeef0db30071" (UID: "7f791f64-69f8-448d-8370-aeef0db30071"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.025672 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21eb5702-ca94-449c-839c-e3970593417d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "21eb5702-ca94-449c-839c-e3970593417d" (UID: "21eb5702-ca94-449c-839c-e3970593417d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.026289 4810 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21eb5702-ca94-449c-839c-e3970593417d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.026313 4810 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f791f64-69f8-448d-8370-aeef0db30071-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.030105 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21eb5702-ca94-449c-839c-e3970593417d-kube-api-access-q2frb" (OuterVolumeSpecName: "kube-api-access-q2frb") pod "21eb5702-ca94-449c-839c-e3970593417d" (UID: "21eb5702-ca94-449c-839c-e3970593417d"). InnerVolumeSpecName "kube-api-access-q2frb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.051015 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f791f64-69f8-448d-8370-aeef0db30071-kube-api-access-522qg" (OuterVolumeSpecName: "kube-api-access-522qg") pod "7f791f64-69f8-448d-8370-aeef0db30071" (UID: "7f791f64-69f8-448d-8370-aeef0db30071"). InnerVolumeSpecName "kube-api-access-522qg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.128118 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-522qg\" (UniqueName: \"kubernetes.io/projected/7f791f64-69f8-448d-8370-aeef0db30071-kube-api-access-522qg\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.128162 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2frb\" (UniqueName: \"kubernetes.io/projected/21eb5702-ca94-449c-839c-e3970593417d-kube-api-access-q2frb\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:06 crc kubenswrapper[4810]: W0219 15:28:06.172726 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37e2af25_5b30_4fb9_801e_f4a84d665540.slice/crio-45d9895cbd2409d4877c5d09b01ea6adaa17628d68fb6990723440872a02d8ff WatchSource:0}: Error finding container 45d9895cbd2409d4877c5d09b01ea6adaa17628d68fb6990723440872a02d8ff: Status 404 returned error can't find the container with id 45d9895cbd2409d4877c5d09b01ea6adaa17628d68fb6990723440872a02d8ff Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.175680 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.446937 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-ffjjc" event={"ID":"21eb5702-ca94-449c-839c-e3970593417d","Type":"ContainerDied","Data":"a4e1ddd0bcf56dcdcf7a8d77529bbac7c3a7ecc8e18e47545f5e7f391f76a850"} Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.446977 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4e1ddd0bcf56dcdcf7a8d77529bbac7c3a7ecc8e18e47545f5e7f391f76a850" Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.447030 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-ffjjc" Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.451589 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2a3676ed-f06f-4dea-82a1-959716331113","Type":"ContainerStarted","Data":"dbc27193c644946d801635eac3f4ac8fd7e93e4ac9ea0ee7cb027f5cec7cc199"} Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.451791 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.455424 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c","Type":"ContainerStarted","Data":"a47a4ee84b2d0fae427bb4579e5143bbbc5bb37b1a0a8a5a2b3b47a263edc8d0"} Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.455615 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/notifications-rabbitmq-server-0" Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.459687 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37e2af25-5b30-4fb9-801e-f4a84d665540","Type":"ContainerStarted","Data":"45d9895cbd2409d4877c5d09b01ea6adaa17628d68fb6990723440872a02d8ff"} Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.460836 4810 generic.go:334] "Generic (PLEG): container finished" podID="00bcfb03-4357-4343-99a5-30dc7f25abe9" containerID="5f65c0deba7b3077c5501137f00e319288d66ec1245a0e431539e6d1d5d3d67c" exitCode=0 Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.460898 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"00bcfb03-4357-4343-99a5-30dc7f25abe9","Type":"ContainerDied","Data":"5f65c0deba7b3077c5501137f00e319288d66ec1245a0e431539e6d1d5d3d67c"} Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.463657 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-a42d-account-create-update-l2kgw" event={"ID":"7f791f64-69f8-448d-8370-aeef0db30071","Type":"ContainerDied","Data":"7a67cdf1df2baa9907f5e2da0518cbfe07eb77699db82e7ea4f435cb906f9c75"} Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.463699 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a67cdf1df2baa9907f5e2da0518cbfe07eb77699db82e7ea4f435cb906f9c75" Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.463718 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-a42d-account-create-update-l2kgw" Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.485576 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=52.121225893 podStartE2EDuration="1m1.485553593s" podCreationTimestamp="2026-02-19 15:27:05 +0000 UTC" firstStartedPulling="2026-02-19 15:27:20.81481788 +0000 UTC m=+1070.296848004" lastFinishedPulling="2026-02-19 15:27:30.17914558 +0000 UTC m=+1079.661175704" observedRunningTime="2026-02-19 15:28:06.473196411 +0000 UTC m=+1115.955226545" watchObservedRunningTime="2026-02-19 15:28:06.485553593 +0000 UTC m=+1115.967583717" Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.533393 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/notifications-rabbitmq-server-0" podStartSLOduration=52.766031208 podStartE2EDuration="1m2.533368634s" podCreationTimestamp="2026-02-19 15:27:04 +0000 UTC" firstStartedPulling="2026-02-19 15:27:20.804720103 +0000 UTC m=+1070.286750227" lastFinishedPulling="2026-02-19 15:27:30.572057529 +0000 UTC m=+1080.054087653" observedRunningTime="2026-02-19 15:28:06.524963028 +0000 UTC m=+1116.006993152" watchObservedRunningTime="2026-02-19 15:28:06.533368634 +0000 UTC m=+1116.015398798" Feb 19 15:28:06 crc kubenswrapper[4810]: I0219 15:28:06.869814 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-kpf4t"] Feb 19 15:28:06 crc kubenswrapper[4810]: W0219 15:28:06.882145 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31093793_65b6_467c_8d5b_218e108fd330.slice/crio-bf6ea40573eeae6956f4018d71eb8ba4737a0b7a57f0e9d98686d8f2e0c053a9 WatchSource:0}: Error finding container bf6ea40573eeae6956f4018d71eb8ba4737a0b7a57f0e9d98686d8f2e0c053a9: Status 404 returned error can't find the container with id bf6ea40573eeae6956f4018d71eb8ba4737a0b7a57f0e9d98686d8f2e0c053a9 Feb 19 15:28:07 crc kubenswrapper[4810]: I0219 15:28:07.478020 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-kpf4t" event={"ID":"31093793-65b6-467c-8d5b-218e108fd330","Type":"ContainerStarted","Data":"bf6ea40573eeae6956f4018d71eb8ba4737a0b7a57f0e9d98686d8f2e0c053a9"} Feb 19 15:28:07 crc kubenswrapper[4810]: I0219 15:28:07.479540 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37e2af25-5b30-4fb9-801e-f4a84d665540","Type":"ContainerStarted","Data":"1ba1d37d6ad8e9c154ffe9f1daa0f1bf53029b1342beeac2e90156b0bd5d9b19"} Feb 19 15:28:07 crc kubenswrapper[4810]: I0219 15:28:07.483007 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"00bcfb03-4357-4343-99a5-30dc7f25abe9","Type":"ContainerStarted","Data":"c5b200cb5f29f71ee660d52a8a35bdae1c6f7de01b68b55aa0a763c8a2cc371f"} Feb 19 15:28:07 crc kubenswrapper[4810]: I0219 15:28:07.483682 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 19 15:28:08 crc kubenswrapper[4810]: I0219 15:28:08.071345 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=54.310539915 podStartE2EDuration="1m4.07131166s" podCreationTimestamp="2026-02-19 15:27:04 +0000 UTC" firstStartedPulling="2026-02-19 15:27:20.811287514 +0000 UTC m=+1070.293317638" lastFinishedPulling="2026-02-19 15:27:30.572059249 +0000 UTC 
m=+1080.054089383" observedRunningTime="2026-02-19 15:28:07.533229509 +0000 UTC m=+1117.015259633" watchObservedRunningTime="2026-02-19 15:28:08.07131166 +0000 UTC m=+1117.553341784" Feb 19 15:28:08 crc kubenswrapper[4810]: I0219 15:28:08.071955 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-j78h8"] Feb 19 15:28:08 crc kubenswrapper[4810]: E0219 15:28:08.072266 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f791f64-69f8-448d-8370-aeef0db30071" containerName="mariadb-account-create-update" Feb 19 15:28:08 crc kubenswrapper[4810]: I0219 15:28:08.072281 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f791f64-69f8-448d-8370-aeef0db30071" containerName="mariadb-account-create-update" Feb 19 15:28:08 crc kubenswrapper[4810]: E0219 15:28:08.072311 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21eb5702-ca94-449c-839c-e3970593417d" containerName="mariadb-database-create" Feb 19 15:28:08 crc kubenswrapper[4810]: I0219 15:28:08.072318 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="21eb5702-ca94-449c-839c-e3970593417d" containerName="mariadb-database-create" Feb 19 15:28:08 crc kubenswrapper[4810]: I0219 15:28:08.072484 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f791f64-69f8-448d-8370-aeef0db30071" containerName="mariadb-account-create-update" Feb 19 15:28:08 crc kubenswrapper[4810]: I0219 15:28:08.072504 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="21eb5702-ca94-449c-839c-e3970593417d" containerName="mariadb-database-create" Feb 19 15:28:08 crc kubenswrapper[4810]: I0219 15:28:08.073070 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-j78h8" Feb 19 15:28:08 crc kubenswrapper[4810]: I0219 15:28:08.083252 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 19 15:28:08 crc kubenswrapper[4810]: I0219 15:28:08.087605 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-j78h8"] Feb 19 15:28:08 crc kubenswrapper[4810]: I0219 15:28:08.170357 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e12a1f8-d78c-41b0-b295-e5e661bf0820-operator-scripts\") pod \"root-account-create-update-j78h8\" (UID: \"8e12a1f8-d78c-41b0-b295-e5e661bf0820\") " pod="openstack/root-account-create-update-j78h8" Feb 19 15:28:08 crc kubenswrapper[4810]: I0219 15:28:08.170469 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzmnf\" (UniqueName: \"kubernetes.io/projected/8e12a1f8-d78c-41b0-b295-e5e661bf0820-kube-api-access-vzmnf\") pod \"root-account-create-update-j78h8\" (UID: \"8e12a1f8-d78c-41b0-b295-e5e661bf0820\") " pod="openstack/root-account-create-update-j78h8" Feb 19 15:28:08 crc kubenswrapper[4810]: I0219 15:28:08.272465 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e12a1f8-d78c-41b0-b295-e5e661bf0820-operator-scripts\") pod \"root-account-create-update-j78h8\" (UID: \"8e12a1f8-d78c-41b0-b295-e5e661bf0820\") " pod="openstack/root-account-create-update-j78h8" Feb 19 15:28:08 crc kubenswrapper[4810]: I0219 15:28:08.272557 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vzmnf\" (UniqueName: \"kubernetes.io/projected/8e12a1f8-d78c-41b0-b295-e5e661bf0820-kube-api-access-vzmnf\") pod \"root-account-create-update-j78h8\" (UID: \"8e12a1f8-d78c-41b0-b295-e5e661bf0820\") " pod="openstack/root-account-create-update-j78h8" Feb 19 15:28:08 crc kubenswrapper[4810]: I0219 15:28:08.273159 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e12a1f8-d78c-41b0-b295-e5e661bf0820-operator-scripts\") pod \"root-account-create-update-j78h8\" (UID: \"8e12a1f8-d78c-41b0-b295-e5e661bf0820\") " pod="openstack/root-account-create-update-j78h8" Feb 19 15:28:08 crc kubenswrapper[4810]: I0219 15:28:08.292693 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzmnf\" (UniqueName: \"kubernetes.io/projected/8e12a1f8-d78c-41b0-b295-e5e661bf0820-kube-api-access-vzmnf\") pod \"root-account-create-update-j78h8\" (UID: \"8e12a1f8-d78c-41b0-b295-e5e661bf0820\") " pod="openstack/root-account-create-update-j78h8" Feb 19 15:28:08 crc kubenswrapper[4810]: I0219 15:28:08.402768 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-j78h8" Feb 19 15:28:08 crc kubenswrapper[4810]: I0219 15:28:08.493418 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37e2af25-5b30-4fb9-801e-f4a84d665540","Type":"ContainerStarted","Data":"cd302b333f4e25ad0c36ebdbc56eef1b56c9908008ee49f7079b0bece263e84b"} Feb 19 15:28:08 crc kubenswrapper[4810]: I0219 15:28:08.493663 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37e2af25-5b30-4fb9-801e-f4a84d665540","Type":"ContainerStarted","Data":"9fd9aa9263382e1efda3414490d21caa5e02f1ef981b536f01a4b062b8959f11"} Feb 19 15:28:08 crc kubenswrapper[4810]: I0219 15:28:08.493674 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37e2af25-5b30-4fb9-801e-f4a84d665540","Type":"ContainerStarted","Data":"d50773999dfe3be05193a3f63c80d61aafc1b17203612313ef1633afce7cfe53"} Feb 19 15:28:08 crc kubenswrapper[4810]: I0219 15:28:08.838496 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-j78h8"] Feb 19 15:28:09 crc kubenswrapper[4810]: W0219 15:28:09.108558 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e12a1f8_d78c_41b0_b295_e5e661bf0820.slice/crio-ccc35fad754bea5f86e280f1f346423215e3e8c4c6dc61ec34321a2365fec438 WatchSource:0}: Error finding container ccc35fad754bea5f86e280f1f346423215e3e8c4c6dc61ec34321a2365fec438: Status 404 returned error can't find the container with id ccc35fad754bea5f86e280f1f346423215e3e8c4c6dc61ec34321a2365fec438 Feb 19 15:28:09 crc kubenswrapper[4810]: I0219 15:28:09.501559 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-j78h8" event={"ID":"8e12a1f8-d78c-41b0-b295-e5e661bf0820","Type":"ContainerStarted","Data":"f70ca73c865d94282b27c6f5f6e86e7e4679dda8ba68c283e08b2a6314c29261"} Feb 19 15:28:09 crc kubenswrapper[4810]: I0219 15:28:09.501902 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-j78h8" event={"ID":"8e12a1f8-d78c-41b0-b295-e5e661bf0820","Type":"ContainerStarted","Data":"ccc35fad754bea5f86e280f1f346423215e3e8c4c6dc61ec34321a2365fec438"} Feb 19 15:28:09 crc kubenswrapper[4810]: I0219 
15:28:09.506894 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37e2af25-5b30-4fb9-801e-f4a84d665540","Type":"ContainerStarted","Data":"660d66f469b3973c5e645c0ac3a97338ff5859a9678d2775d702a8f431f12f4e"} Feb 19 15:28:09 crc kubenswrapper[4810]: I0219 15:28:09.522813 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-j78h8" podStartSLOduration=1.522793799 podStartE2EDuration="1.522793799s" podCreationTimestamp="2026-02-19 15:28:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:28:09.516247669 +0000 UTC m=+1118.998277813" watchObservedRunningTime="2026-02-19 15:28:09.522793799 +0000 UTC m=+1119.004823923" Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.361608 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-5t6ds" Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.376468 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-s5488" podUID="4a4fa57b-aa00-4866-b31e-df29f7f86480" containerName="ovn-controller" probeResult="failure" output=< Feb 19 15:28:10 crc kubenswrapper[4810]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 19 15:28:10 crc kubenswrapper[4810]: > Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.380676 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-5t6ds" Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.520359 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37e2af25-5b30-4fb9-801e-f4a84d665540","Type":"ContainerStarted","Data":"e4fb6d2bda16502d024736976d935c4fe81f687c15140b0dd94266e3e3d9f390"} Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.520417 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37e2af25-5b30-4fb9-801e-f4a84d665540","Type":"ContainerStarted","Data":"2732d970ffbc02e85ff4de493b80613565b0be8ca47af7642d3c73044ca3fb1e"} Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.520432 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37e2af25-5b30-4fb9-801e-f4a84d665540","Type":"ContainerStarted","Data":"2341827275b751933bf24c3e78ff3b3ecc526884571fd989fcb9aec28efc66bc"} Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.524192 4810 generic.go:334] "Generic (PLEG): container finished" podID="8e12a1f8-d78c-41b0-b295-e5e661bf0820" containerID="f70ca73c865d94282b27c6f5f6e86e7e4679dda8ba68c283e08b2a6314c29261" exitCode=0 Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.524703 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-j78h8" event={"ID":"8e12a1f8-d78c-41b0-b295-e5e661bf0820","Type":"ContainerDied","Data":"f70ca73c865d94282b27c6f5f6e86e7e4679dda8ba68c283e08b2a6314c29261"} Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.596610 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-s5488-config-vj8mr"] Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.597736 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-s5488-config-vj8mr" Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.599728 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.608191 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s5488-config-vj8mr"] Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.724297 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-scripts\") pod \"ovn-controller-s5488-config-vj8mr\" (UID: \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\") " pod="openstack/ovn-controller-s5488-config-vj8mr" Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.724358 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-var-run\") pod \"ovn-controller-s5488-config-vj8mr\" (UID: \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\") " pod="openstack/ovn-controller-s5488-config-vj8mr" Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.724617 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-additional-scripts\") pod \"ovn-controller-s5488-config-vj8mr\" (UID: \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\") " pod="openstack/ovn-controller-s5488-config-vj8mr" Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.724676 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-var-run-ovn\") pod \"ovn-controller-s5488-config-vj8mr\" (UID: \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\") " pod="openstack/ovn-controller-s5488-config-vj8mr" Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.724797 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-var-log-ovn\") pod \"ovn-controller-s5488-config-vj8mr\" (UID: \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\") " pod="openstack/ovn-controller-s5488-config-vj8mr" Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.724985 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmcqb\" (UniqueName: \"kubernetes.io/projected/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-kube-api-access-qmcqb\") pod \"ovn-controller-s5488-config-vj8mr\" (UID: \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\") " pod="openstack/ovn-controller-s5488-config-vj8mr" Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.827237 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-scripts\") pod \"ovn-controller-s5488-config-vj8mr\" (UID: \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\") " pod="openstack/ovn-controller-s5488-config-vj8mr" Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.827777 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-var-run\") pod 
\"ovn-controller-s5488-config-vj8mr\" (UID: \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\") " pod="openstack/ovn-controller-s5488-config-vj8mr" Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.828061 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-additional-scripts\") pod \"ovn-controller-s5488-config-vj8mr\" (UID: \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\") " pod="openstack/ovn-controller-s5488-config-vj8mr" Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.828238 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-var-run-ovn\") pod \"ovn-controller-s5488-config-vj8mr\" (UID: \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\") " pod="openstack/ovn-controller-s5488-config-vj8mr" Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.828132 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-var-run\") pod \"ovn-controller-s5488-config-vj8mr\" (UID: \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\") " pod="openstack/ovn-controller-s5488-config-vj8mr" Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.828365 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-var-log-ovn\") pod \"ovn-controller-s5488-config-vj8mr\" (UID: \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\") " pod="openstack/ovn-controller-s5488-config-vj8mr" Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.828402 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-var-log-ovn\") pod \"ovn-controller-s5488-config-vj8mr\" (UID: \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\") " pod="openstack/ovn-controller-s5488-config-vj8mr" Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.828369 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-var-run-ovn\") pod \"ovn-controller-s5488-config-vj8mr\" (UID: \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\") " pod="openstack/ovn-controller-s5488-config-vj8mr" Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.828659 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmcqb\" (UniqueName: \"kubernetes.io/projected/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-kube-api-access-qmcqb\") pod \"ovn-controller-s5488-config-vj8mr\" (UID: \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\") " pod="openstack/ovn-controller-s5488-config-vj8mr" Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.841793 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-additional-scripts\") pod \"ovn-controller-s5488-config-vj8mr\" (UID: \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\") " pod="openstack/ovn-controller-s5488-config-vj8mr" Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.846871 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-scripts\") pod 
\"ovn-controller-s5488-config-vj8mr\" (UID: \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\") " pod="openstack/ovn-controller-s5488-config-vj8mr" Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.860519 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmcqb\" (UniqueName: \"kubernetes.io/projected/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-kube-api-access-qmcqb\") pod \"ovn-controller-s5488-config-vj8mr\" (UID: \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\") " pod="openstack/ovn-controller-s5488-config-vj8mr" Feb 19 15:28:10 crc kubenswrapper[4810]: I0219 15:28:10.923683 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s5488-config-vj8mr" Feb 19 15:28:11 crc kubenswrapper[4810]: I0219 15:28:11.480426 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s5488-config-vj8mr"] Feb 19 15:28:11 crc kubenswrapper[4810]: I0219 15:28:11.541516 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37e2af25-5b30-4fb9-801e-f4a84d665540","Type":"ContainerStarted","Data":"029a7644f1b3f3adfa640290641c929aa41a9c5077caf3bc1c5fe0762c760b0a"} Feb 19 15:28:11 crc kubenswrapper[4810]: I0219 15:28:11.541571 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37e2af25-5b30-4fb9-801e-f4a84d665540","Type":"ContainerStarted","Data":"5452ca22717c7fdc7d9e2a9180cd6e5790679477011844d93e17b4c181ed1493"} Feb 19 15:28:11 crc kubenswrapper[4810]: I0219 15:28:11.543693 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s5488-config-vj8mr" event={"ID":"cd94f6ba-17f4-407d-96ce-aafc0d390ef0","Type":"ContainerStarted","Data":"845fa0b026213cafa08bcde7439eb4aca0fbbbffe9ee97506b610230e0f93d27"} Feb 19 15:28:11 crc kubenswrapper[4810]: I0219 15:28:11.844937 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-j78h8" Feb 19 15:28:11 crc kubenswrapper[4810]: I0219 15:28:11.946299 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e12a1f8-d78c-41b0-b295-e5e661bf0820-operator-scripts\") pod \"8e12a1f8-d78c-41b0-b295-e5e661bf0820\" (UID: \"8e12a1f8-d78c-41b0-b295-e5e661bf0820\") " Feb 19 15:28:11 crc kubenswrapper[4810]: I0219 15:28:11.946457 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzmnf\" (UniqueName: \"kubernetes.io/projected/8e12a1f8-d78c-41b0-b295-e5e661bf0820-kube-api-access-vzmnf\") pod \"8e12a1f8-d78c-41b0-b295-e5e661bf0820\" (UID: \"8e12a1f8-d78c-41b0-b295-e5e661bf0820\") " Feb 19 15:28:11 crc kubenswrapper[4810]: I0219 15:28:11.947984 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e12a1f8-d78c-41b0-b295-e5e661bf0820-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8e12a1f8-d78c-41b0-b295-e5e661bf0820" (UID: "8e12a1f8-d78c-41b0-b295-e5e661bf0820"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:11 crc kubenswrapper[4810]: I0219 15:28:11.954101 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e12a1f8-d78c-41b0-b295-e5e661bf0820-kube-api-access-vzmnf" (OuterVolumeSpecName: "kube-api-access-vzmnf") pod "8e12a1f8-d78c-41b0-b295-e5e661bf0820" (UID: "8e12a1f8-d78c-41b0-b295-e5e661bf0820"). 
InnerVolumeSpecName "kube-api-access-vzmnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:28:12 crc kubenswrapper[4810]: I0219 15:28:12.048097 4810 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e12a1f8-d78c-41b0-b295-e5e661bf0820-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:12 crc kubenswrapper[4810]: I0219 15:28:12.048141 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzmnf\" (UniqueName: \"kubernetes.io/projected/8e12a1f8-d78c-41b0-b295-e5e661bf0820-kube-api-access-vzmnf\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:12 crc kubenswrapper[4810]: I0219 15:28:12.570806 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37e2af25-5b30-4fb9-801e-f4a84d665540","Type":"ContainerStarted","Data":"d7afe90f92f6c863fef61af81f424cfcdca8a085d68a7769e175b490d0c93e90"} Feb 19 15:28:12 crc kubenswrapper[4810]: I0219 15:28:12.570861 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37e2af25-5b30-4fb9-801e-f4a84d665540","Type":"ContainerStarted","Data":"c5da4be9591440c1f64482455b1dff1d9275f518bf8bdbfe449a495712984261"} Feb 19 15:28:12 crc kubenswrapper[4810]: I0219 15:28:12.570874 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37e2af25-5b30-4fb9-801e-f4a84d665540","Type":"ContainerStarted","Data":"5d1b1a39a5e1076c953861b63e89de6b9de74961d48072fedbe67b9154a5d5e7"} Feb 19 15:28:12 crc kubenswrapper[4810]: I0219 15:28:12.570885 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37e2af25-5b30-4fb9-801e-f4a84d665540","Type":"ContainerStarted","Data":"6ab9e7f3cd670e2b5593371b7f00c38d53866f044a217c5b17cd350777f09512"} Feb 19 15:28:12 crc kubenswrapper[4810]: I0219 15:28:12.570896 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37e2af25-5b30-4fb9-801e-f4a84d665540","Type":"ContainerStarted","Data":"c903e9c02663a0961112a44749a7b2bd76f40928f112da91a67b24fd0c29d9bc"} Feb 19 15:28:12 crc kubenswrapper[4810]: I0219 15:28:12.573759 4810 generic.go:334] "Generic (PLEG): container finished" podID="cd94f6ba-17f4-407d-96ce-aafc0d390ef0" containerID="c96da79c9ab27a3dc86e77ea8607bc39b965b2f04ca64ded9b1c4a74386d352e" exitCode=0 Feb 19 15:28:12 crc kubenswrapper[4810]: I0219 15:28:12.573917 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s5488-config-vj8mr" event={"ID":"cd94f6ba-17f4-407d-96ce-aafc0d390ef0","Type":"ContainerDied","Data":"c96da79c9ab27a3dc86e77ea8607bc39b965b2f04ca64ded9b1c4a74386d352e"} Feb 19 15:28:12 crc kubenswrapper[4810]: I0219 15:28:12.576930 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-j78h8" event={"ID":"8e12a1f8-d78c-41b0-b295-e5e661bf0820","Type":"ContainerDied","Data":"ccc35fad754bea5f86e280f1f346423215e3e8c4c6dc61ec34321a2365fec438"} Feb 19 15:28:12 crc kubenswrapper[4810]: I0219 15:28:12.576963 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccc35fad754bea5f86e280f1f346423215e3e8c4c6dc61ec34321a2365fec438" Feb 19 15:28:12 crc kubenswrapper[4810]: I0219 15:28:12.576985 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-j78h8" Feb 19 15:28:12 crc kubenswrapper[4810]: I0219 15:28:12.636247 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=35.765587287 podStartE2EDuration="40.636228501s" podCreationTimestamp="2026-02-19 15:27:32 +0000 UTC" firstStartedPulling="2026-02-19 15:28:06.17494336 +0000 UTC m=+1115.656973484" lastFinishedPulling="2026-02-19 15:28:11.045584584 +0000 UTC m=+1120.527614698" observedRunningTime="2026-02-19 15:28:12.624397501 +0000 UTC m=+1122.106427625" watchObservedRunningTime="2026-02-19 15:28:12.636228501 +0000 UTC m=+1122.118258625" Feb 19 15:28:12 crc kubenswrapper[4810]: I0219 15:28:12.916003 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6999bddfcf-fzf7g"] Feb 19 15:28:12 crc kubenswrapper[4810]: E0219 15:28:12.916403 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e12a1f8-d78c-41b0-b295-e5e661bf0820" containerName="mariadb-account-create-update" Feb 19 15:28:12 crc kubenswrapper[4810]: I0219 15:28:12.916418 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e12a1f8-d78c-41b0-b295-e5e661bf0820" containerName="mariadb-account-create-update" Feb 19 15:28:12 crc kubenswrapper[4810]: I0219 15:28:12.916630 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e12a1f8-d78c-41b0-b295-e5e661bf0820" containerName="mariadb-account-create-update" Feb 19 15:28:12 crc kubenswrapper[4810]: I0219 15:28:12.922983 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" Feb 19 15:28:12 crc kubenswrapper[4810]: I0219 15:28:12.924904 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 19 15:28:12 crc kubenswrapper[4810]: I0219 15:28:12.941053 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6999bddfcf-fzf7g"] Feb 19 15:28:13 crc kubenswrapper[4810]: I0219 15:28:13.073852 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-config\") pod \"dnsmasq-dns-6999bddfcf-fzf7g\" (UID: \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\") " pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" Feb 19 15:28:13 crc kubenswrapper[4810]: I0219 15:28:13.073924 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-ovsdbserver-nb\") pod \"dnsmasq-dns-6999bddfcf-fzf7g\" (UID: \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\") " pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" Feb 19 15:28:13 crc kubenswrapper[4810]: I0219 15:28:13.073952 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-ovsdbserver-sb\") pod \"dnsmasq-dns-6999bddfcf-fzf7g\" (UID: \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\") " pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" Feb 19 15:28:13 crc kubenswrapper[4810]: I0219 15:28:13.074127 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-dns-svc\") pod \"dnsmasq-dns-6999bddfcf-fzf7g\" (UID: 
\"2584fed3-16a4-489e-bf03-1c7461e9d3d8\") " pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" Feb 19 15:28:13 crc kubenswrapper[4810]: I0219 15:28:13.074292 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckj5k\" (UniqueName: \"kubernetes.io/projected/2584fed3-16a4-489e-bf03-1c7461e9d3d8-kube-api-access-ckj5k\") pod \"dnsmasq-dns-6999bddfcf-fzf7g\" (UID: \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\") " pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" Feb 19 15:28:13 crc kubenswrapper[4810]: I0219 15:28:13.074422 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-dns-swift-storage-0\") pod \"dnsmasq-dns-6999bddfcf-fzf7g\" (UID: \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\") " pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" Feb 19 15:28:13 crc kubenswrapper[4810]: I0219 15:28:13.176150 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-dns-svc\") pod \"dnsmasq-dns-6999bddfcf-fzf7g\" (UID: \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\") " pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" Feb 19 15:28:13 crc kubenswrapper[4810]: I0219 15:28:13.176230 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckj5k\" (UniqueName: \"kubernetes.io/projected/2584fed3-16a4-489e-bf03-1c7461e9d3d8-kube-api-access-ckj5k\") pod \"dnsmasq-dns-6999bddfcf-fzf7g\" (UID: \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\") " pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" Feb 19 15:28:13 crc kubenswrapper[4810]: I0219 15:28:13.176265 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-dns-swift-storage-0\") pod \"dnsmasq-dns-6999bddfcf-fzf7g\" (UID: \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\") " pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" Feb 19 15:28:13 crc kubenswrapper[4810]: I0219 15:28:13.176348 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-config\") pod \"dnsmasq-dns-6999bddfcf-fzf7g\" (UID: \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\") " pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" Feb 19 15:28:13 crc kubenswrapper[4810]: I0219 15:28:13.176386 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-ovsdbserver-nb\") pod \"dnsmasq-dns-6999bddfcf-fzf7g\" (UID: \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\") " pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" Feb 19 15:28:13 crc kubenswrapper[4810]: I0219 15:28:13.176409 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-ovsdbserver-sb\") pod \"dnsmasq-dns-6999bddfcf-fzf7g\" (UID: \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\") " pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" Feb 19 15:28:13 crc kubenswrapper[4810]: I0219 15:28:13.177313 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-ovsdbserver-sb\") pod \"dnsmasq-dns-6999bddfcf-fzf7g\" (UID: 
\"2584fed3-16a4-489e-bf03-1c7461e9d3d8\") " pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" Feb 19 15:28:13 crc kubenswrapper[4810]: I0219 15:28:13.177687 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-ovsdbserver-nb\") pod \"dnsmasq-dns-6999bddfcf-fzf7g\" (UID: \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\") " pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" Feb 19 15:28:13 crc kubenswrapper[4810]: I0219 15:28:13.177797 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-config\") pod \"dnsmasq-dns-6999bddfcf-fzf7g\" (UID: \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\") " pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" Feb 19 15:28:13 crc kubenswrapper[4810]: I0219 15:28:13.177901 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-dns-swift-storage-0\") pod \"dnsmasq-dns-6999bddfcf-fzf7g\" (UID: \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\") " pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" Feb 19 15:28:13 crc kubenswrapper[4810]: I0219 15:28:13.178306 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-dns-svc\") pod \"dnsmasq-dns-6999bddfcf-fzf7g\" (UID: \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\") " pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" Feb 19 15:28:13 crc kubenswrapper[4810]: I0219 15:28:13.203410 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckj5k\" (UniqueName: \"kubernetes.io/projected/2584fed3-16a4-489e-bf03-1c7461e9d3d8-kube-api-access-ckj5k\") pod \"dnsmasq-dns-6999bddfcf-fzf7g\" (UID: \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\") " pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" Feb 19 15:28:13 crc kubenswrapper[4810]: I0219 15:28:13.276732 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" Feb 19 15:28:13 crc kubenswrapper[4810]: I0219 15:28:13.304170 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:13 crc kubenswrapper[4810]: I0219 15:28:13.307708 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:13 crc kubenswrapper[4810]: I0219 15:28:13.592222 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:15 crc kubenswrapper[4810]: I0219 15:28:15.344806 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-s5488" Feb 19 15:28:15 crc kubenswrapper[4810]: I0219 15:28:15.888228 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/notifications-rabbitmq-server-0" podUID="4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.105:5671: connect: connection refused" Feb 19 15:28:15 crc kubenswrapper[4810]: I0219 15:28:15.938827 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 15:28:15 crc kubenswrapper[4810]: I0219 15:28:15.939073 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="a220bc57-3f31-4851-ad5c-9f61359f7de5" containerName="prometheus" containerID="cri-o://e279b42557dcc4bac021262be680408c24c0a74b26422b2a204f61141627de2f" gracePeriod=600 Feb 19 15:28:15 crc kubenswrapper[4810]: I0219 15:28:15.939456 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="a220bc57-3f31-4851-ad5c-9f61359f7de5" containerName="thanos-sidecar" containerID="cri-o://e5ac3906ff8a232fe91505ae472e019caa241a902c02da4ff118ba00b0e5d016" gracePeriod=600 Feb 19 15:28:15 crc kubenswrapper[4810]: I0219 15:28:15.939524 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="a220bc57-3f31-4851-ad5c-9f61359f7de5" containerName="config-reloader" containerID="cri-o://96840091b1cea0064312675928d8c3948ff53222c6323509a84e0412a49c9891" gracePeriod=600 Feb 19 15:28:16 crc kubenswrapper[4810]: I0219 15:28:16.134206 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="00bcfb03-4357-4343-99a5-30dc7f25abe9" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.106:5671: connect: connection refused" Feb 19 15:28:16 crc kubenswrapper[4810]: I0219 15:28:16.444834 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="2a3676ed-f06f-4dea-82a1-959716331113" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused" Feb 19 15:28:16 crc kubenswrapper[4810]: I0219 15:28:16.619242 4810 generic.go:334] "Generic (PLEG): container finished" podID="a220bc57-3f31-4851-ad5c-9f61359f7de5" containerID="e5ac3906ff8a232fe91505ae472e019caa241a902c02da4ff118ba00b0e5d016" exitCode=0 Feb 19 15:28:16 crc kubenswrapper[4810]: I0219 15:28:16.619289 4810 generic.go:334] "Generic (PLEG): container finished" podID="a220bc57-3f31-4851-ad5c-9f61359f7de5" containerID="96840091b1cea0064312675928d8c3948ff53222c6323509a84e0412a49c9891" exitCode=0 Feb 19 15:28:16 crc kubenswrapper[4810]: I0219 
15:28:16.619303 4810 generic.go:334] "Generic (PLEG): container finished" podID="a220bc57-3f31-4851-ad5c-9f61359f7de5" containerID="e279b42557dcc4bac021262be680408c24c0a74b26422b2a204f61141627de2f" exitCode=0 Feb 19 15:28:16 crc kubenswrapper[4810]: I0219 15:28:16.619350 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a220bc57-3f31-4851-ad5c-9f61359f7de5","Type":"ContainerDied","Data":"e5ac3906ff8a232fe91505ae472e019caa241a902c02da4ff118ba00b0e5d016"} Feb 19 15:28:16 crc kubenswrapper[4810]: I0219 15:28:16.619379 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a220bc57-3f31-4851-ad5c-9f61359f7de5","Type":"ContainerDied","Data":"96840091b1cea0064312675928d8c3948ff53222c6323509a84e0412a49c9891"} Feb 19 15:28:16 crc kubenswrapper[4810]: I0219 15:28:16.619396 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a220bc57-3f31-4851-ad5c-9f61359f7de5","Type":"ContainerDied","Data":"e279b42557dcc4bac021262be680408c24c0a74b26422b2a204f61141627de2f"} Feb 19 15:28:18 crc kubenswrapper[4810]: I0219 15:28:18.304172 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="a220bc57-3f31-4851-ad5c-9f61359f7de5" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.112:9090/-/ready\": dial tcp 10.217.0.112:9090: connect: connection refused" Feb 19 15:28:19 crc kubenswrapper[4810]: I0219 15:28:19.538211 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:28:19 crc kubenswrapper[4810]: I0219 15:28:19.538287 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:28:20 crc kubenswrapper[4810]: I0219 15:28:20.959383 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-s5488-config-vj8mr" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.009902 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-scripts\") pod \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\" (UID: \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\") " Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.010031 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-var-run\") pod \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\" (UID: \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\") " Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.010078 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmcqb\" (UniqueName: \"kubernetes.io/projected/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-kube-api-access-qmcqb\") pod \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\" (UID: \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\") " Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.010109 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-var-run-ovn\") pod \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\" (UID: \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\") " Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.010184 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-additional-scripts\") pod \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\" (UID: \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\") " Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.010230 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-var-log-ovn\") pod \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\" (UID: \"cd94f6ba-17f4-407d-96ce-aafc0d390ef0\") " Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.010618 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "cd94f6ba-17f4-407d-96ce-aafc0d390ef0" (UID: "cd94f6ba-17f4-407d-96ce-aafc0d390ef0"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.010646 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-var-run" (OuterVolumeSpecName: "var-run") pod "cd94f6ba-17f4-407d-96ce-aafc0d390ef0" (UID: "cd94f6ba-17f4-407d-96ce-aafc0d390ef0"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.011219 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "cd94f6ba-17f4-407d-96ce-aafc0d390ef0" (UID: "cd94f6ba-17f4-407d-96ce-aafc0d390ef0"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.011637 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "cd94f6ba-17f4-407d-96ce-aafc0d390ef0" (UID: "cd94f6ba-17f4-407d-96ce-aafc0d390ef0"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.012385 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-scripts" (OuterVolumeSpecName: "scripts") pod "cd94f6ba-17f4-407d-96ce-aafc0d390ef0" (UID: "cd94f6ba-17f4-407d-96ce-aafc0d390ef0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.013924 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-kube-api-access-qmcqb" (OuterVolumeSpecName: "kube-api-access-qmcqb") pod "cd94f6ba-17f4-407d-96ce-aafc0d390ef0" (UID: "cd94f6ba-17f4-407d-96ce-aafc0d390ef0"). InnerVolumeSpecName "kube-api-access-qmcqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.113157 4810 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.113203 4810 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.113214 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.113224 4810 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-var-run\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.113235 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmcqb\" (UniqueName: \"kubernetes.io/projected/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-kube-api-access-qmcqb\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.113246 4810 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cd94f6ba-17f4-407d-96ce-aafc0d390ef0-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.125438 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.214205 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/a220bc57-3f31-4851-ad5c-9f61359f7de5-prometheus-metric-storage-rulefiles-1\") pod \"a220bc57-3f31-4851-ad5c-9f61359f7de5\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.214273 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a220bc57-3f31-4851-ad5c-9f61359f7de5-web-config\") pod \"a220bc57-3f31-4851-ad5c-9f61359f7de5\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.214365 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a220bc57-3f31-4851-ad5c-9f61359f7de5-thanos-prometheus-http-client-file\") pod \"a220bc57-3f31-4851-ad5c-9f61359f7de5\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.214430 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a220bc57-3f31-4851-ad5c-9f61359f7de5-prometheus-metric-storage-rulefiles-0\") pod \"a220bc57-3f31-4851-ad5c-9f61359f7de5\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.214511 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a220bc57-3f31-4851-ad5c-9f61359f7de5-config\") pod \"a220bc57-3f31-4851-ad5c-9f61359f7de5\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.214653 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3b143f1-488b-49bf-8792-af0d760f341e\") pod \"a220bc57-3f31-4851-ad5c-9f61359f7de5\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.214718 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a220bc57-3f31-4851-ad5c-9f61359f7de5-config-out\") pod \"a220bc57-3f31-4851-ad5c-9f61359f7de5\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.214783 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ks5p\" (UniqueName: \"kubernetes.io/projected/a220bc57-3f31-4851-ad5c-9f61359f7de5-kube-api-access-8ks5p\") pod \"a220bc57-3f31-4851-ad5c-9f61359f7de5\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.214819 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a220bc57-3f31-4851-ad5c-9f61359f7de5-prometheus-metric-storage-rulefiles-2\") pod \"a220bc57-3f31-4851-ad5c-9f61359f7de5\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.214854 4810 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a220bc57-3f31-4851-ad5c-9f61359f7de5-tls-assets\") pod \"a220bc57-3f31-4851-ad5c-9f61359f7de5\" (UID: \"a220bc57-3f31-4851-ad5c-9f61359f7de5\") " Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.215171 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a220bc57-3f31-4851-ad5c-9f61359f7de5-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "a220bc57-3f31-4851-ad5c-9f61359f7de5" (UID: "a220bc57-3f31-4851-ad5c-9f61359f7de5"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.215347 4810 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/a220bc57-3f31-4851-ad5c-9f61359f7de5-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.216078 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a220bc57-3f31-4851-ad5c-9f61359f7de5-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "a220bc57-3f31-4851-ad5c-9f61359f7de5" (UID: "a220bc57-3f31-4851-ad5c-9f61359f7de5"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.217050 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a220bc57-3f31-4851-ad5c-9f61359f7de5-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "a220bc57-3f31-4851-ad5c-9f61359f7de5" (UID: "a220bc57-3f31-4851-ad5c-9f61359f7de5"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.217779 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a220bc57-3f31-4851-ad5c-9f61359f7de5-config" (OuterVolumeSpecName: "config") pod "a220bc57-3f31-4851-ad5c-9f61359f7de5" (UID: "a220bc57-3f31-4851-ad5c-9f61359f7de5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.221702 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a220bc57-3f31-4851-ad5c-9f61359f7de5-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "a220bc57-3f31-4851-ad5c-9f61359f7de5" (UID: "a220bc57-3f31-4851-ad5c-9f61359f7de5"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.222857 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a220bc57-3f31-4851-ad5c-9f61359f7de5-kube-api-access-8ks5p" (OuterVolumeSpecName: "kube-api-access-8ks5p") pod "a220bc57-3f31-4851-ad5c-9f61359f7de5" (UID: "a220bc57-3f31-4851-ad5c-9f61359f7de5"). InnerVolumeSpecName "kube-api-access-8ks5p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.222928 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a220bc57-3f31-4851-ad5c-9f61359f7de5-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "a220bc57-3f31-4851-ad5c-9f61359f7de5" (UID: "a220bc57-3f31-4851-ad5c-9f61359f7de5"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.223032 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a220bc57-3f31-4851-ad5c-9f61359f7de5-config-out" (OuterVolumeSpecName: "config-out") pod "a220bc57-3f31-4851-ad5c-9f61359f7de5" (UID: "a220bc57-3f31-4851-ad5c-9f61359f7de5"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.236074 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3b143f1-488b-49bf-8792-af0d760f341e" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "a220bc57-3f31-4851-ad5c-9f61359f7de5" (UID: "a220bc57-3f31-4851-ad5c-9f61359f7de5"). InnerVolumeSpecName "pvc-b3b143f1-488b-49bf-8792-af0d760f341e". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.249820 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a220bc57-3f31-4851-ad5c-9f61359f7de5-web-config" (OuterVolumeSpecName: "web-config") pod "a220bc57-3f31-4851-ad5c-9f61359f7de5" (UID: "a220bc57-3f31-4851-ad5c-9f61359f7de5"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.317693 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a220bc57-3f31-4851-ad5c-9f61359f7de5-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.317772 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-b3b143f1-488b-49bf-8792-af0d760f341e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3b143f1-488b-49bf-8792-af0d760f341e\") on node \"crc\" " Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.317793 4810 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a220bc57-3f31-4851-ad5c-9f61359f7de5-config-out\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.317815 4810 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a220bc57-3f31-4851-ad5c-9f61359f7de5-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.317832 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ks5p\" (UniqueName: \"kubernetes.io/projected/a220bc57-3f31-4851-ad5c-9f61359f7de5-kube-api-access-8ks5p\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.317845 4810 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a220bc57-3f31-4851-ad5c-9f61359f7de5-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.317855 4810 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a220bc57-3f31-4851-ad5c-9f61359f7de5-web-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.317867 4810 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a220bc57-3f31-4851-ad5c-9f61359f7de5-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.317878 4810 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a220bc57-3f31-4851-ad5c-9f61359f7de5-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.338259 4810 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.338415 4810 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-b3b143f1-488b-49bf-8792-af0d760f341e" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3b143f1-488b-49bf-8792-af0d760f341e") on node "crc" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.362585 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6999bddfcf-fzf7g"] Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.419209 4810 reconciler_common.go:293] "Volume detached for volume \"pvc-b3b143f1-488b-49bf-8792-af0d760f341e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3b143f1-488b-49bf-8792-af0d760f341e\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.663621 4810 generic.go:334] "Generic (PLEG): container finished" podID="2584fed3-16a4-489e-bf03-1c7461e9d3d8" containerID="821ae759a5ba32197a93051b435e6f01263ceab4bc6d3d77eccce527b39b8143" exitCode=0 Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.663684 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" event={"ID":"2584fed3-16a4-489e-bf03-1c7461e9d3d8","Type":"ContainerDied","Data":"821ae759a5ba32197a93051b435e6f01263ceab4bc6d3d77eccce527b39b8143"} Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.663708 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" event={"ID":"2584fed3-16a4-489e-bf03-1c7461e9d3d8","Type":"ContainerStarted","Data":"31b96436742b6e2f2b15881309695bfa289fff7a6145248f7da06333643cac1d"} Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.667001 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a220bc57-3f31-4851-ad5c-9f61359f7de5","Type":"ContainerDied","Data":"2e7a3617c56f7d96899fe573b10e0613d97d13afc38ab2fb9c62813a642860d3"} Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.667046 4810 scope.go:117] "RemoveContainer" containerID="e5ac3906ff8a232fe91505ae472e019caa241a902c02da4ff118ba00b0e5d016" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.667061 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.672289 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s5488-config-vj8mr" event={"ID":"cd94f6ba-17f4-407d-96ce-aafc0d390ef0","Type":"ContainerDied","Data":"845fa0b026213cafa08bcde7439eb4aca0fbbbffe9ee97506b610230e0f93d27"} Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.672345 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="845fa0b026213cafa08bcde7439eb4aca0fbbbffe9ee97506b610230e0f93d27" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.672414 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-s5488-config-vj8mr" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.675907 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-kpf4t" event={"ID":"31093793-65b6-467c-8d5b-218e108fd330","Type":"ContainerStarted","Data":"f13c00b75444d82ae151313db252a559d67eb3a9e93fc91fd59fa886fe8ada73"} Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.692589 4810 scope.go:117] "RemoveContainer" containerID="96840091b1cea0064312675928d8c3948ff53222c6323509a84e0412a49c9891" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.710096 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.719635 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.729094 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-kpf4t" podStartSLOduration=2.641325275 podStartE2EDuration="16.729076827s" podCreationTimestamp="2026-02-19 15:28:05 +0000 UTC" firstStartedPulling="2026-02-19 15:28:06.884839077 +0000 UTC m=+1116.366869201" lastFinishedPulling="2026-02-19 15:28:20.972590619 +0000 UTC m=+1130.454620753" observedRunningTime="2026-02-19 15:28:21.721735297 +0000 UTC m=+1131.203765431" watchObservedRunningTime="2026-02-19 15:28:21.729076827 +0000 UTC m=+1131.211106951" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.741645 4810 scope.go:117] "RemoveContainer" containerID="e279b42557dcc4bac021262be680408c24c0a74b26422b2a204f61141627de2f" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.743985 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 15:28:21 crc kubenswrapper[4810]: E0219 15:28:21.745064 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd94f6ba-17f4-407d-96ce-aafc0d390ef0" containerName="ovn-config" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.745087 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd94f6ba-17f4-407d-96ce-aafc0d390ef0" containerName="ovn-config" Feb 19 15:28:21 crc kubenswrapper[4810]: E0219 15:28:21.745122 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a220bc57-3f31-4851-ad5c-9f61359f7de5" containerName="init-config-reloader" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.745131 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a220bc57-3f31-4851-ad5c-9f61359f7de5" containerName="init-config-reloader" Feb 19 15:28:21 crc kubenswrapper[4810]: E0219 15:28:21.745146 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a220bc57-3f31-4851-ad5c-9f61359f7de5" containerName="thanos-sidecar" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.745154 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a220bc57-3f31-4851-ad5c-9f61359f7de5" containerName="thanos-sidecar" Feb 19 15:28:21 crc kubenswrapper[4810]: E0219 15:28:21.745176 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a220bc57-3f31-4851-ad5c-9f61359f7de5" containerName="config-reloader" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.745184 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a220bc57-3f31-4851-ad5c-9f61359f7de5" containerName="config-reloader" Feb 19 15:28:21 crc kubenswrapper[4810]: E0219 15:28:21.745200 4810 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="a220bc57-3f31-4851-ad5c-9f61359f7de5" containerName="prometheus" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.745209 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a220bc57-3f31-4851-ad5c-9f61359f7de5" containerName="prometheus" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.745421 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="a220bc57-3f31-4851-ad5c-9f61359f7de5" containerName="config-reloader" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.745436 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="a220bc57-3f31-4851-ad5c-9f61359f7de5" containerName="thanos-sidecar" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.745466 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd94f6ba-17f4-407d-96ce-aafc0d390ef0" containerName="ovn-config" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.745484 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="a220bc57-3f31-4851-ad5c-9f61359f7de5" containerName="prometheus" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.747470 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.751857 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.752129 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.752497 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-x7hn6" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.752659 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.752825 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.753158 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.754582 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.755417 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.762162 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.776324 4810 scope.go:117] "RemoveContainer" containerID="024e6cbe9539fc73096ba26de873a5cd3591fc2cd12d0be8dd23110aaa2f3ec6" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.779151 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.826127 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5c213a3a-78fd-4b42-bc1c-e09837eae684-tls-assets\") pod 
\"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.826565 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmlm6\" (UniqueName: \"kubernetes.io/projected/5c213a3a-78fd-4b42-bc1c-e09837eae684-kube-api-access-zmlm6\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.826673 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b3b143f1-488b-49bf-8792-af0d760f341e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3b143f1-488b-49bf-8792-af0d760f341e\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.826765 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/5c213a3a-78fd-4b42-bc1c-e09837eae684-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.826851 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.826930 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.827030 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.827200 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.827250 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: 
\"kubernetes.io/configmap/5c213a3a-78fd-4b42-bc1c-e09837eae684-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.827277 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.827298 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5c213a3a-78fd-4b42-bc1c-e09837eae684-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.827340 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-config\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.827563 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5c213a3a-78fd-4b42-bc1c-e09837eae684-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.928741 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-config\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.928836 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5c213a3a-78fd-4b42-bc1c-e09837eae684-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.928861 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5c213a3a-78fd-4b42-bc1c-e09837eae684-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.928881 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmlm6\" (UniqueName: \"kubernetes.io/projected/5c213a3a-78fd-4b42-bc1c-e09837eae684-kube-api-access-zmlm6\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.928907 4810 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-b3b143f1-488b-49bf-8792-af0d760f341e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3b143f1-488b-49bf-8792-af0d760f341e\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.928928 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/5c213a3a-78fd-4b42-bc1c-e09837eae684-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.928953 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.928974 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.929026 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.929057 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.929076 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/5c213a3a-78fd-4b42-bc1c-e09837eae684-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.929092 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.929107 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/5c213a3a-78fd-4b42-bc1c-e09837eae684-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.931030 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5c213a3a-78fd-4b42-bc1c-e09837eae684-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.931314 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/5c213a3a-78fd-4b42-bc1c-e09837eae684-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.931818 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/5c213a3a-78fd-4b42-bc1c-e09837eae684-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.935509 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5c213a3a-78fd-4b42-bc1c-e09837eae684-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.936405 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5c213a3a-78fd-4b42-bc1c-e09837eae684-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.936445 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.936540 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-config\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.936534 4810 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.936850 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b3b143f1-488b-49bf-8792-af0d760f341e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3b143f1-488b-49bf-8792-af0d760f341e\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e63bc62ea909687cb5abb0c5cf8da7008d795f1441aaff1987b707a42a388027/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.938381 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.944000 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.945608 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.945888 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.947654 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmlm6\" (UniqueName: \"kubernetes.io/projected/5c213a3a-78fd-4b42-bc1c-e09837eae684-kube-api-access-zmlm6\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:21 crc kubenswrapper[4810]: I0219 15:28:21.991985 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b3b143f1-488b-49bf-8792-af0d760f341e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3b143f1-488b-49bf-8792-af0d760f341e\") pod \"prometheus-metric-storage-0\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.065027 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.118918 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-s5488-config-vj8mr"] Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.139966 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-s5488-config-vj8mr"] Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.175427 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-s5488-config-lsj57"] Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.176752 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s5488-config-lsj57" Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.181500 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.186553 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s5488-config-lsj57"] Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.236114 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-scripts\") pod \"ovn-controller-s5488-config-lsj57\" (UID: \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\") " pod="openstack/ovn-controller-s5488-config-lsj57" Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.236162 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-var-run\") pod \"ovn-controller-s5488-config-lsj57\" (UID: \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\") " pod="openstack/ovn-controller-s5488-config-lsj57" Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.236249 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-var-run-ovn\") pod \"ovn-controller-s5488-config-lsj57\" (UID: \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\") " pod="openstack/ovn-controller-s5488-config-lsj57" Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.236316 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-additional-scripts\") pod \"ovn-controller-s5488-config-lsj57\" (UID: \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\") " pod="openstack/ovn-controller-s5488-config-lsj57" Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.236589 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-var-log-ovn\") pod \"ovn-controller-s5488-config-lsj57\" (UID: \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\") " pod="openstack/ovn-controller-s5488-config-lsj57" Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.236630 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbz2r\" (UniqueName: \"kubernetes.io/projected/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-kube-api-access-mbz2r\") pod \"ovn-controller-s5488-config-lsj57\" (UID: \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\") " 
pod="openstack/ovn-controller-s5488-config-lsj57" Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.338380 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-var-run-ovn\") pod \"ovn-controller-s5488-config-lsj57\" (UID: \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\") " pod="openstack/ovn-controller-s5488-config-lsj57" Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.338436 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-additional-scripts\") pod \"ovn-controller-s5488-config-lsj57\" (UID: \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\") " pod="openstack/ovn-controller-s5488-config-lsj57" Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.338458 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-var-log-ovn\") pod \"ovn-controller-s5488-config-lsj57\" (UID: \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\") " pod="openstack/ovn-controller-s5488-config-lsj57" Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.338487 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbz2r\" (UniqueName: \"kubernetes.io/projected/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-kube-api-access-mbz2r\") pod \"ovn-controller-s5488-config-lsj57\" (UID: \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\") " pod="openstack/ovn-controller-s5488-config-lsj57" Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.338527 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-scripts\") pod \"ovn-controller-s5488-config-lsj57\" (UID: \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\") " pod="openstack/ovn-controller-s5488-config-lsj57" Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.338547 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-var-run\") pod \"ovn-controller-s5488-config-lsj57\" (UID: \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\") " pod="openstack/ovn-controller-s5488-config-lsj57" Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.338832 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-var-run\") pod \"ovn-controller-s5488-config-lsj57\" (UID: \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\") " pod="openstack/ovn-controller-s5488-config-lsj57" Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.338878 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-var-run-ovn\") pod \"ovn-controller-s5488-config-lsj57\" (UID: \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\") " pod="openstack/ovn-controller-s5488-config-lsj57" Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.339567 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-additional-scripts\") pod \"ovn-controller-s5488-config-lsj57\" (UID: \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\") " 
pod="openstack/ovn-controller-s5488-config-lsj57" Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.339611 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-var-log-ovn\") pod \"ovn-controller-s5488-config-lsj57\" (UID: \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\") " pod="openstack/ovn-controller-s5488-config-lsj57" Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.341320 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-scripts\") pod \"ovn-controller-s5488-config-lsj57\" (UID: \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\") " pod="openstack/ovn-controller-s5488-config-lsj57" Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.355031 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbz2r\" (UniqueName: \"kubernetes.io/projected/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-kube-api-access-mbz2r\") pod \"ovn-controller-s5488-config-lsj57\" (UID: \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\") " pod="openstack/ovn-controller-s5488-config-lsj57" Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.507468 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s5488-config-lsj57" Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.572243 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 15:28:22 crc kubenswrapper[4810]: W0219 15:28:22.572637 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c213a3a_78fd_4b42_bc1c_e09837eae684.slice/crio-971051d9ba9c03f03793d16fa10f74ed254c90779c474f368dd54279d38af75a WatchSource:0}: Error finding container 971051d9ba9c03f03793d16fa10f74ed254c90779c474f368dd54279d38af75a: Status 404 returned error can't find the container with id 971051d9ba9c03f03793d16fa10f74ed254c90779c474f368dd54279d38af75a Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.697614 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5c213a3a-78fd-4b42-bc1c-e09837eae684","Type":"ContainerStarted","Data":"971051d9ba9c03f03793d16fa10f74ed254c90779c474f368dd54279d38af75a"} Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.702413 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" event={"ID":"2584fed3-16a4-489e-bf03-1c7461e9d3d8","Type":"ContainerStarted","Data":"66285c6ba52e055f610f9273f522d8c7987aa81e9fe271b980a52b4e8dbd8794"} Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.702626 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.724940 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" podStartSLOduration=10.724908962 podStartE2EDuration="10.724908962s" podCreationTimestamp="2026-02-19 15:28:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:28:22.720025762 +0000 UTC m=+1132.202055886" watchObservedRunningTime="2026-02-19 15:28:22.724908962 +0000 UTC m=+1132.206939086" Feb 19 15:28:22 crc kubenswrapper[4810]: I0219 15:28:22.980946 4810 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s5488-config-lsj57"] Feb 19 15:28:23 crc kubenswrapper[4810]: I0219 15:28:23.450360 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a220bc57-3f31-4851-ad5c-9f61359f7de5" path="/var/lib/kubelet/pods/a220bc57-3f31-4851-ad5c-9f61359f7de5/volumes" Feb 19 15:28:23 crc kubenswrapper[4810]: I0219 15:28:23.451434 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd94f6ba-17f4-407d-96ce-aafc0d390ef0" path="/var/lib/kubelet/pods/cd94f6ba-17f4-407d-96ce-aafc0d390ef0/volumes" Feb 19 15:28:23 crc kubenswrapper[4810]: I0219 15:28:23.712501 4810 generic.go:334] "Generic (PLEG): container finished" podID="89fe19f5-3c0f-4fa5-8eed-59bd7b76f900" containerID="930ffde39b4c3d7913e11cb429594ee8cd480971fa345e4d0d06f707706f3472" exitCode=0 Feb 19 15:28:23 crc kubenswrapper[4810]: I0219 15:28:23.712592 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s5488-config-lsj57" event={"ID":"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900","Type":"ContainerDied","Data":"930ffde39b4c3d7913e11cb429594ee8cd480971fa345e4d0d06f707706f3472"} Feb 19 15:28:23 crc kubenswrapper[4810]: I0219 15:28:23.712652 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s5488-config-lsj57" event={"ID":"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900","Type":"ContainerStarted","Data":"710b255712c252a63884f7d5e66f38e1321cdca52c846bde3a6059ddb76b7537"} Feb 19 15:28:25 crc kubenswrapper[4810]: I0219 15:28:25.053769 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s5488-config-lsj57" Feb 19 15:28:25 crc kubenswrapper[4810]: I0219 15:28:25.080164 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-additional-scripts\") pod \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\" (UID: \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\") " Feb 19 15:28:25 crc kubenswrapper[4810]: I0219 15:28:25.080241 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-var-run\") pod \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\" (UID: \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\") " Feb 19 15:28:25 crc kubenswrapper[4810]: I0219 15:28:25.080270 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-var-log-ovn\") pod \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\" (UID: \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\") " Feb 19 15:28:25 crc kubenswrapper[4810]: I0219 15:28:25.080346 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-scripts\") pod \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\" (UID: \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\") " Feb 19 15:28:25 crc kubenswrapper[4810]: I0219 15:28:25.080471 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbz2r\" (UniqueName: \"kubernetes.io/projected/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-kube-api-access-mbz2r\") pod \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\" (UID: \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\") " Feb 19 15:28:25 crc kubenswrapper[4810]: I0219 15:28:25.080509 4810 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-var-run-ovn\") pod \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\" (UID: \"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900\") " Feb 19 15:28:25 crc kubenswrapper[4810]: I0219 15:28:25.080734 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "89fe19f5-3c0f-4fa5-8eed-59bd7b76f900" (UID: "89fe19f5-3c0f-4fa5-8eed-59bd7b76f900"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:28:25 crc kubenswrapper[4810]: I0219 15:28:25.080797 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "89fe19f5-3c0f-4fa5-8eed-59bd7b76f900" (UID: "89fe19f5-3c0f-4fa5-8eed-59bd7b76f900"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:28:25 crc kubenswrapper[4810]: I0219 15:28:25.080941 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "89fe19f5-3c0f-4fa5-8eed-59bd7b76f900" (UID: "89fe19f5-3c0f-4fa5-8eed-59bd7b76f900"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:25 crc kubenswrapper[4810]: I0219 15:28:25.080987 4810 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:25 crc kubenswrapper[4810]: I0219 15:28:25.081752 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-scripts" (OuterVolumeSpecName: "scripts") pod "89fe19f5-3c0f-4fa5-8eed-59bd7b76f900" (UID: "89fe19f5-3c0f-4fa5-8eed-59bd7b76f900"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:25 crc kubenswrapper[4810]: I0219 15:28:25.081976 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-var-run" (OuterVolumeSpecName: "var-run") pod "89fe19f5-3c0f-4fa5-8eed-59bd7b76f900" (UID: "89fe19f5-3c0f-4fa5-8eed-59bd7b76f900"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:28:25 crc kubenswrapper[4810]: I0219 15:28:25.088289 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-kube-api-access-mbz2r" (OuterVolumeSpecName: "kube-api-access-mbz2r") pod "89fe19f5-3c0f-4fa5-8eed-59bd7b76f900" (UID: "89fe19f5-3c0f-4fa5-8eed-59bd7b76f900"). InnerVolumeSpecName "kube-api-access-mbz2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:28:25 crc kubenswrapper[4810]: I0219 15:28:25.182265 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbz2r\" (UniqueName: \"kubernetes.io/projected/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-kube-api-access-mbz2r\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:25 crc kubenswrapper[4810]: I0219 15:28:25.182309 4810 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:25 crc kubenswrapper[4810]: I0219 15:28:25.182338 4810 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:25 crc kubenswrapper[4810]: I0219 15:28:25.182353 4810 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-var-run\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:25 crc kubenswrapper[4810]: I0219 15:28:25.182366 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:25 crc kubenswrapper[4810]: I0219 15:28:25.752706 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5c213a3a-78fd-4b42-bc1c-e09837eae684","Type":"ContainerStarted","Data":"f004bfd72f8d26f3aaf22a2c6561639f20d7d6d9b961893f534ded590f3c40ec"} Feb 19 15:28:25 crc kubenswrapper[4810]: I0219 15:28:25.756507 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s5488-config-lsj57" event={"ID":"89fe19f5-3c0f-4fa5-8eed-59bd7b76f900","Type":"ContainerDied","Data":"710b255712c252a63884f7d5e66f38e1321cdca52c846bde3a6059ddb76b7537"} Feb 19 15:28:25 crc kubenswrapper[4810]: I0219 15:28:25.756718 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="710b255712c252a63884f7d5e66f38e1321cdca52c846bde3a6059ddb76b7537" Feb 19 15:28:25 crc kubenswrapper[4810]: I0219 15:28:25.756587 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-s5488-config-lsj57" Feb 19 15:28:25 crc kubenswrapper[4810]: I0219 15:28:25.885691 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/notifications-rabbitmq-server-0" Feb 19 15:28:26 crc kubenswrapper[4810]: I0219 15:28:26.133715 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 19 15:28:26 crc kubenswrapper[4810]: I0219 15:28:26.139389 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-s5488-config-lsj57"] Feb 19 15:28:26 crc kubenswrapper[4810]: I0219 15:28:26.152868 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-s5488-config-lsj57"] Feb 19 15:28:26 crc kubenswrapper[4810]: I0219 15:28:26.444551 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.448638 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89fe19f5-3c0f-4fa5-8eed-59bd7b76f900" path="/var/lib/kubelet/pods/89fe19f5-3c0f-4fa5-8eed-59bd7b76f900/volumes" Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.646642 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-9wtnf"] Feb 19 15:28:27 crc kubenswrapper[4810]: E0219 15:28:27.646980 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89fe19f5-3c0f-4fa5-8eed-59bd7b76f900" containerName="ovn-config" Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.646997 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="89fe19f5-3c0f-4fa5-8eed-59bd7b76f900" containerName="ovn-config" Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.647162 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="89fe19f5-3c0f-4fa5-8eed-59bd7b76f900" containerName="ovn-config" Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.647748 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-9wtnf" Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.669304 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-9wtnf"] Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.746734 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-k9zsz"] Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.748029 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-k9zsz" Feb 19 15:28:27 crc kubenswrapper[4810]: W0219 15:28:27.754268 4810 reflector.go:561] object-"openstack"/"keystone-keystone-dockercfg-j78zz": failed to list *v1.Secret: secrets "keystone-keystone-dockercfg-j78zz" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Feb 19 15:28:27 crc kubenswrapper[4810]: E0219 15:28:27.754550 4810 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"keystone-keystone-dockercfg-j78zz\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"keystone-keystone-dockercfg-j78zz\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 19 15:28:27 crc kubenswrapper[4810]: W0219 15:28:27.754611 4810 reflector.go:561] object-"openstack"/"keystone": failed to list *v1.Secret: secrets "keystone" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Feb 19 15:28:27 crc kubenswrapper[4810]: E0219 15:28:27.754638 4810 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"keystone\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"keystone\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 19 15:28:27 crc kubenswrapper[4810]: W0219 15:28:27.755107 4810 reflector.go:561] object-"openstack"/"keystone-scripts": failed to list *v1.Secret: secrets "keystone-scripts" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Feb 19 15:28:27 crc kubenswrapper[4810]: E0219 15:28:27.755132 4810 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"keystone-scripts\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"keystone-scripts\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 19 15:28:27 crc kubenswrapper[4810]: W0219 15:28:27.755634 4810 reflector.go:561] object-"openstack"/"keystone-config-data": failed to list *v1.Secret: secrets "keystone-config-data" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Feb 19 15:28:27 crc kubenswrapper[4810]: E0219 15:28:27.755660 4810 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"keystone-config-data\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"keystone-config-data\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.823522 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-k9zsz"] Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.830579 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6217aad-07e6-49b6-8e80-41e75cecaaf5-operator-scripts\") pod \"barbican-db-create-9wtnf\" (UID: \"a6217aad-07e6-49b6-8e80-41e75cecaaf5\") " pod="openstack/barbican-db-create-9wtnf" Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.830631 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxq5k\" (UniqueName: \"kubernetes.io/projected/a6217aad-07e6-49b6-8e80-41e75cecaaf5-kube-api-access-lxq5k\") pod \"barbican-db-create-9wtnf\" (UID: \"a6217aad-07e6-49b6-8e80-41e75cecaaf5\") " pod="openstack/barbican-db-create-9wtnf" Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.857913 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-8ce6-account-create-update-ztxw9"] Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.860373 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8ce6-account-create-update-ztxw9" Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.867441 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.868288 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-8ce6-account-create-update-ztxw9"] Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.932480 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxq5k\" (UniqueName: \"kubernetes.io/projected/a6217aad-07e6-49b6-8e80-41e75cecaaf5-kube-api-access-lxq5k\") pod \"barbican-db-create-9wtnf\" (UID: \"a6217aad-07e6-49b6-8e80-41e75cecaaf5\") " pod="openstack/barbican-db-create-9wtnf" Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.932952 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/082fc735-2850-452d-841a-0af9ed7ed171-config-data\") pod \"keystone-db-sync-k9zsz\" (UID: \"082fc735-2850-452d-841a-0af9ed7ed171\") " pod="openstack/keystone-db-sync-k9zsz" Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.933180 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/082fc735-2850-452d-841a-0af9ed7ed171-combined-ca-bundle\") pod \"keystone-db-sync-k9zsz\" (UID: \"082fc735-2850-452d-841a-0af9ed7ed171\") " pod="openstack/keystone-db-sync-k9zsz" Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.933210 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4qlb\" (UniqueName: \"kubernetes.io/projected/082fc735-2850-452d-841a-0af9ed7ed171-kube-api-access-t4qlb\") pod \"keystone-db-sync-k9zsz\" (UID: \"082fc735-2850-452d-841a-0af9ed7ed171\") " pod="openstack/keystone-db-sync-k9zsz" Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.933423 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6217aad-07e6-49b6-8e80-41e75cecaaf5-operator-scripts\") pod \"barbican-db-create-9wtnf\" (UID: \"a6217aad-07e6-49b6-8e80-41e75cecaaf5\") " pod="openstack/barbican-db-create-9wtnf" Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.933768 4810 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/neutron-db-create-ptwwk"] Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.934294 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6217aad-07e6-49b6-8e80-41e75cecaaf5-operator-scripts\") pod \"barbican-db-create-9wtnf\" (UID: \"a6217aad-07e6-49b6-8e80-41e75cecaaf5\") " pod="openstack/barbican-db-create-9wtnf" Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.934805 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-ptwwk" Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.957714 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-ptwwk"] Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.968835 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxq5k\" (UniqueName: \"kubernetes.io/projected/a6217aad-07e6-49b6-8e80-41e75cecaaf5-kube-api-access-lxq5k\") pod \"barbican-db-create-9wtnf\" (UID: \"a6217aad-07e6-49b6-8e80-41e75cecaaf5\") " pod="openstack/barbican-db-create-9wtnf" Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.975715 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-sd4lr"] Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.976933 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-sd4lr" Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.981156 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.981687 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-ngb4n" Feb 19 15:28:27 crc kubenswrapper[4810]: I0219 15:28:27.984430 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-sd4lr"] Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.035540 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqz6c\" (UniqueName: \"kubernetes.io/projected/5ac347c6-4f1b-4b05-87a0-9332dec2ba9d-kube-api-access-tqz6c\") pod \"barbican-8ce6-account-create-update-ztxw9\" (UID: \"5ac347c6-4f1b-4b05-87a0-9332dec2ba9d\") " pod="openstack/barbican-8ce6-account-create-update-ztxw9" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.035620 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/082fc735-2850-452d-841a-0af9ed7ed171-combined-ca-bundle\") pod \"keystone-db-sync-k9zsz\" (UID: \"082fc735-2850-452d-841a-0af9ed7ed171\") " pod="openstack/keystone-db-sync-k9zsz" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.035641 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4qlb\" (UniqueName: \"kubernetes.io/projected/082fc735-2850-452d-841a-0af9ed7ed171-kube-api-access-t4qlb\") pod \"keystone-db-sync-k9zsz\" (UID: \"082fc735-2850-452d-841a-0af9ed7ed171\") " pod="openstack/keystone-db-sync-k9zsz" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.035671 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ac347c6-4f1b-4b05-87a0-9332dec2ba9d-operator-scripts\") pod \"barbican-8ce6-account-create-update-ztxw9\" (UID: 
\"5ac347c6-4f1b-4b05-87a0-9332dec2ba9d\") " pod="openstack/barbican-8ce6-account-create-update-ztxw9" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.035836 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2c7665c-330a-45b8-b461-bd08b069b747-operator-scripts\") pod \"neutron-db-create-ptwwk\" (UID: \"c2c7665c-330a-45b8-b461-bd08b069b747\") " pod="openstack/neutron-db-create-ptwwk" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.035872 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z7f9\" (UniqueName: \"kubernetes.io/projected/c2c7665c-330a-45b8-b461-bd08b069b747-kube-api-access-8z7f9\") pod \"neutron-db-create-ptwwk\" (UID: \"c2c7665c-330a-45b8-b461-bd08b069b747\") " pod="openstack/neutron-db-create-ptwwk" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.035936 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/082fc735-2850-452d-841a-0af9ed7ed171-config-data\") pod \"keystone-db-sync-k9zsz\" (UID: \"082fc735-2850-452d-841a-0af9ed7ed171\") " pod="openstack/keystone-db-sync-k9zsz" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.041636 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/082fc735-2850-452d-841a-0af9ed7ed171-combined-ca-bundle\") pod \"keystone-db-sync-k9zsz\" (UID: \"082fc735-2850-452d-841a-0af9ed7ed171\") " pod="openstack/keystone-db-sync-k9zsz" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.068585 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4qlb\" (UniqueName: \"kubernetes.io/projected/082fc735-2850-452d-841a-0af9ed7ed171-kube-api-access-t4qlb\") pod \"keystone-db-sync-k9zsz\" (UID: \"082fc735-2850-452d-841a-0af9ed7ed171\") " pod="openstack/keystone-db-sync-k9zsz" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.076284 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dc04-account-create-update-dmf9z"] Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.077502 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dc04-account-create-update-dmf9z" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.079862 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.084386 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dc04-account-create-update-dmf9z"] Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.139687 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnkbf\" (UniqueName: \"kubernetes.io/projected/63eeb47c-9c4a-4e36-be24-61c126517600-kube-api-access-mnkbf\") pod \"watcher-db-sync-sd4lr\" (UID: \"63eeb47c-9c4a-4e36-be24-61c126517600\") " pod="openstack/watcher-db-sync-sd4lr" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.139740 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqz6c\" (UniqueName: \"kubernetes.io/projected/5ac347c6-4f1b-4b05-87a0-9332dec2ba9d-kube-api-access-tqz6c\") pod \"barbican-8ce6-account-create-update-ztxw9\" (UID: \"5ac347c6-4f1b-4b05-87a0-9332dec2ba9d\") " pod="openstack/barbican-8ce6-account-create-update-ztxw9" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.139803 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ac347c6-4f1b-4b05-87a0-9332dec2ba9d-operator-scripts\") pod \"barbican-8ce6-account-create-update-ztxw9\" (UID: \"5ac347c6-4f1b-4b05-87a0-9332dec2ba9d\") " pod="openstack/barbican-8ce6-account-create-update-ztxw9" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.139825 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2c7665c-330a-45b8-b461-bd08b069b747-operator-scripts\") pod \"neutron-db-create-ptwwk\" (UID: \"c2c7665c-330a-45b8-b461-bd08b069b747\") " pod="openstack/neutron-db-create-ptwwk" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.139850 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z7f9\" (UniqueName: \"kubernetes.io/projected/c2c7665c-330a-45b8-b461-bd08b069b747-kube-api-access-8z7f9\") pod \"neutron-db-create-ptwwk\" (UID: \"c2c7665c-330a-45b8-b461-bd08b069b747\") " pod="openstack/neutron-db-create-ptwwk" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.139875 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63eeb47c-9c4a-4e36-be24-61c126517600-combined-ca-bundle\") pod \"watcher-db-sync-sd4lr\" (UID: \"63eeb47c-9c4a-4e36-be24-61c126517600\") " pod="openstack/watcher-db-sync-sd4lr" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.139913 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/63eeb47c-9c4a-4e36-be24-61c126517600-db-sync-config-data\") pod \"watcher-db-sync-sd4lr\" (UID: \"63eeb47c-9c4a-4e36-be24-61c126517600\") " pod="openstack/watcher-db-sync-sd4lr" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.139951 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63eeb47c-9c4a-4e36-be24-61c126517600-config-data\") pod \"watcher-db-sync-sd4lr\" (UID: 
\"63eeb47c-9c4a-4e36-be24-61c126517600\") " pod="openstack/watcher-db-sync-sd4lr" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.141039 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2c7665c-330a-45b8-b461-bd08b069b747-operator-scripts\") pod \"neutron-db-create-ptwwk\" (UID: \"c2c7665c-330a-45b8-b461-bd08b069b747\") " pod="openstack/neutron-db-create-ptwwk" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.141674 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ac347c6-4f1b-4b05-87a0-9332dec2ba9d-operator-scripts\") pod \"barbican-8ce6-account-create-update-ztxw9\" (UID: \"5ac347c6-4f1b-4b05-87a0-9332dec2ba9d\") " pod="openstack/barbican-8ce6-account-create-update-ztxw9" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.147798 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-9cp2h"] Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.150839 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-9cp2h" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.156778 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-9cp2h"] Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.210870 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z7f9\" (UniqueName: \"kubernetes.io/projected/c2c7665c-330a-45b8-b461-bd08b069b747-kube-api-access-8z7f9\") pod \"neutron-db-create-ptwwk\" (UID: \"c2c7665c-330a-45b8-b461-bd08b069b747\") " pod="openstack/neutron-db-create-ptwwk" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.211559 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqz6c\" (UniqueName: \"kubernetes.io/projected/5ac347c6-4f1b-4b05-87a0-9332dec2ba9d-kube-api-access-tqz6c\") pod \"barbican-8ce6-account-create-update-ztxw9\" (UID: \"5ac347c6-4f1b-4b05-87a0-9332dec2ba9d\") " pod="openstack/barbican-8ce6-account-create-update-ztxw9" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.241054 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/700fd144-e077-4468-80a4-f131fdb9d67e-operator-scripts\") pod \"cinder-db-create-9cp2h\" (UID: \"700fd144-e077-4468-80a4-f131fdb9d67e\") " pod="openstack/cinder-db-create-9cp2h" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.241115 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzqqn\" (UniqueName: \"kubernetes.io/projected/05b15da5-701a-492a-b986-99b767d2876c-kube-api-access-vzqqn\") pod \"neutron-dc04-account-create-update-dmf9z\" (UID: \"05b15da5-701a-492a-b986-99b767d2876c\") " pod="openstack/neutron-dc04-account-create-update-dmf9z" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.241153 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psgfb\" (UniqueName: \"kubernetes.io/projected/700fd144-e077-4468-80a4-f131fdb9d67e-kube-api-access-psgfb\") pod \"cinder-db-create-9cp2h\" (UID: \"700fd144-e077-4468-80a4-f131fdb9d67e\") " pod="openstack/cinder-db-create-9cp2h" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.241193 4810 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63eeb47c-9c4a-4e36-be24-61c126517600-combined-ca-bundle\") pod \"watcher-db-sync-sd4lr\" (UID: \"63eeb47c-9c4a-4e36-be24-61c126517600\") " pod="openstack/watcher-db-sync-sd4lr" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.241224 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05b15da5-701a-492a-b986-99b767d2876c-operator-scripts\") pod \"neutron-dc04-account-create-update-dmf9z\" (UID: \"05b15da5-701a-492a-b986-99b767d2876c\") " pod="openstack/neutron-dc04-account-create-update-dmf9z" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.241243 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/63eeb47c-9c4a-4e36-be24-61c126517600-db-sync-config-data\") pod \"watcher-db-sync-sd4lr\" (UID: \"63eeb47c-9c4a-4e36-be24-61c126517600\") " pod="openstack/watcher-db-sync-sd4lr" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.241277 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63eeb47c-9c4a-4e36-be24-61c126517600-config-data\") pod \"watcher-db-sync-sd4lr\" (UID: \"63eeb47c-9c4a-4e36-be24-61c126517600\") " pod="openstack/watcher-db-sync-sd4lr" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.241312 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnkbf\" (UniqueName: \"kubernetes.io/projected/63eeb47c-9c4a-4e36-be24-61c126517600-kube-api-access-mnkbf\") pod \"watcher-db-sync-sd4lr\" (UID: \"63eeb47c-9c4a-4e36-be24-61c126517600\") " pod="openstack/watcher-db-sync-sd4lr" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.244961 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63eeb47c-9c4a-4e36-be24-61c126517600-combined-ca-bundle\") pod \"watcher-db-sync-sd4lr\" (UID: \"63eeb47c-9c4a-4e36-be24-61c126517600\") " pod="openstack/watcher-db-sync-sd4lr" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.248167 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-ptwwk" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.254711 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/63eeb47c-9c4a-4e36-be24-61c126517600-db-sync-config-data\") pod \"watcher-db-sync-sd4lr\" (UID: \"63eeb47c-9c4a-4e36-be24-61c126517600\") " pod="openstack/watcher-db-sync-sd4lr" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.256917 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63eeb47c-9c4a-4e36-be24-61c126517600-config-data\") pod \"watcher-db-sync-sd4lr\" (UID: \"63eeb47c-9c4a-4e36-be24-61c126517600\") " pod="openstack/watcher-db-sync-sd4lr" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.261934 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnkbf\" (UniqueName: \"kubernetes.io/projected/63eeb47c-9c4a-4e36-be24-61c126517600-kube-api-access-mnkbf\") pod \"watcher-db-sync-sd4lr\" (UID: \"63eeb47c-9c4a-4e36-be24-61c126517600\") " pod="openstack/watcher-db-sync-sd4lr" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.262299 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-9wtnf" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.277480 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.347857 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-sd4lr" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.348361 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psgfb\" (UniqueName: \"kubernetes.io/projected/700fd144-e077-4468-80a4-f131fdb9d67e-kube-api-access-psgfb\") pod \"cinder-db-create-9cp2h\" (UID: \"700fd144-e077-4468-80a4-f131fdb9d67e\") " pod="openstack/cinder-db-create-9cp2h" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.348454 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05b15da5-701a-492a-b986-99b767d2876c-operator-scripts\") pod \"neutron-dc04-account-create-update-dmf9z\" (UID: \"05b15da5-701a-492a-b986-99b767d2876c\") " pod="openstack/neutron-dc04-account-create-update-dmf9z" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.348550 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/700fd144-e077-4468-80a4-f131fdb9d67e-operator-scripts\") pod \"cinder-db-create-9cp2h\" (UID: \"700fd144-e077-4468-80a4-f131fdb9d67e\") " pod="openstack/cinder-db-create-9cp2h" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.348605 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzqqn\" (UniqueName: \"kubernetes.io/projected/05b15da5-701a-492a-b986-99b767d2876c-kube-api-access-vzqqn\") pod \"neutron-dc04-account-create-update-dmf9z\" (UID: \"05b15da5-701a-492a-b986-99b767d2876c\") " pod="openstack/neutron-dc04-account-create-update-dmf9z" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.349304 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/05b15da5-701a-492a-b986-99b767d2876c-operator-scripts\") pod \"neutron-dc04-account-create-update-dmf9z\" (UID: \"05b15da5-701a-492a-b986-99b767d2876c\") " pod="openstack/neutron-dc04-account-create-update-dmf9z" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.350213 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/700fd144-e077-4468-80a4-f131fdb9d67e-operator-scripts\") pod \"cinder-db-create-9cp2h\" (UID: \"700fd144-e077-4468-80a4-f131fdb9d67e\") " pod="openstack/cinder-db-create-9cp2h" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.353906 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d95ff5b97-flxcw"] Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.354127 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" podUID="788aae13-b274-4965-ac0c-8ac075c32567" containerName="dnsmasq-dns" containerID="cri-o://e901006a072b768e5b2fffb51122f3ae9427a4d8bce325b789cb52d5dc2df384" gracePeriod=10 Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.371662 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-43b3-account-create-update-6gqq5"] Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.373225 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-43b3-account-create-update-6gqq5" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.378935 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.378996 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzqqn\" (UniqueName: \"kubernetes.io/projected/05b15da5-701a-492a-b986-99b767d2876c-kube-api-access-vzqqn\") pod \"neutron-dc04-account-create-update-dmf9z\" (UID: \"05b15da5-701a-492a-b986-99b767d2876c\") " pod="openstack/neutron-dc04-account-create-update-dmf9z" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.408684 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psgfb\" (UniqueName: \"kubernetes.io/projected/700fd144-e077-4468-80a4-f131fdb9d67e-kube-api-access-psgfb\") pod \"cinder-db-create-9cp2h\" (UID: \"700fd144-e077-4468-80a4-f131fdb9d67e\") " pod="openstack/cinder-db-create-9cp2h" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.418120 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-43b3-account-create-update-6gqq5"] Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.429224 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dc04-account-create-update-dmf9z" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.454034 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25jng\" (UniqueName: \"kubernetes.io/projected/5770188c-7480-4529-8450-3d1a44cf50d6-kube-api-access-25jng\") pod \"cinder-43b3-account-create-update-6gqq5\" (UID: \"5770188c-7480-4529-8450-3d1a44cf50d6\") " pod="openstack/cinder-43b3-account-create-update-6gqq5" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.455765 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5770188c-7480-4529-8450-3d1a44cf50d6-operator-scripts\") pod \"cinder-43b3-account-create-update-6gqq5\" (UID: \"5770188c-7480-4529-8450-3d1a44cf50d6\") " pod="openstack/cinder-43b3-account-create-update-6gqq5" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.481261 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8ce6-account-create-update-ztxw9" Feb 19 15:28:28 crc kubenswrapper[4810]: E0219 15:28:28.545445 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod788aae13_b274_4965_ac0c_8ac075c32567.slice/crio-conmon-e901006a072b768e5b2fffb51122f3ae9427a4d8bce325b789cb52d5dc2df384.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod788aae13_b274_4965_ac0c_8ac075c32567.slice/crio-e901006a072b768e5b2fffb51122f3ae9427a4d8bce325b789cb52d5dc2df384.scope\": RecentStats: unable to find data in memory cache]" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.560311 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25jng\" (UniqueName: \"kubernetes.io/projected/5770188c-7480-4529-8450-3d1a44cf50d6-kube-api-access-25jng\") pod \"cinder-43b3-account-create-update-6gqq5\" (UID: \"5770188c-7480-4529-8450-3d1a44cf50d6\") " pod="openstack/cinder-43b3-account-create-update-6gqq5" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.560460 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5770188c-7480-4529-8450-3d1a44cf50d6-operator-scripts\") pod \"cinder-43b3-account-create-update-6gqq5\" (UID: \"5770188c-7480-4529-8450-3d1a44cf50d6\") " pod="openstack/cinder-43b3-account-create-update-6gqq5" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.561677 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5770188c-7480-4529-8450-3d1a44cf50d6-operator-scripts\") pod \"cinder-43b3-account-create-update-6gqq5\" (UID: \"5770188c-7480-4529-8450-3d1a44cf50d6\") " pod="openstack/cinder-43b3-account-create-update-6gqq5" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.576954 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25jng\" (UniqueName: \"kubernetes.io/projected/5770188c-7480-4529-8450-3d1a44cf50d6-kube-api-access-25jng\") pod \"cinder-43b3-account-create-update-6gqq5\" (UID: \"5770188c-7480-4529-8450-3d1a44cf50d6\") " pod="openstack/cinder-43b3-account-create-update-6gqq5" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.587079 4810 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.605656 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-9cp2h" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.660782 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.700143 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-43b3-account-create-update-6gqq5" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.730340 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 15:28:28 crc kubenswrapper[4810]: E0219 15:28:28.736957 4810 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.162:55748->38.102.83.162:41765: write tcp 38.102.83.162:55748->38.102.83.162:41765: write: broken pipe Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.740549 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/082fc735-2850-452d-841a-0af9ed7ed171-config-data\") pod \"keystone-db-sync-k9zsz\" (UID: \"082fc735-2850-452d-841a-0af9ed7ed171\") " pod="openstack/keystone-db-sync-k9zsz" Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.786456 4810 generic.go:334] "Generic (PLEG): container finished" podID="788aae13-b274-4965-ac0c-8ac075c32567" containerID="e901006a072b768e5b2fffb51122f3ae9427a4d8bce325b789cb52d5dc2df384" exitCode=0 Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.786499 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" event={"ID":"788aae13-b274-4965-ac0c-8ac075c32567","Type":"ContainerDied","Data":"e901006a072b768e5b2fffb51122f3ae9427a4d8bce325b789cb52d5dc2df384"} Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.829289 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-ptwwk"] Feb 19 15:28:28 crc kubenswrapper[4810]: I0219 15:28:28.878881 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-9wtnf"] Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.002838 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-sd4lr"] Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.139820 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-j78zz" Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.144078 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-k9zsz" Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.160424 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.172147 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dc04-account-create-update-dmf9z"] Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.201042 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-8ce6-account-create-update-ztxw9"] Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.273902 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/788aae13-b274-4965-ac0c-8ac075c32567-ovsdbserver-nb\") pod \"788aae13-b274-4965-ac0c-8ac075c32567\" (UID: \"788aae13-b274-4965-ac0c-8ac075c32567\") " Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.274291 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/788aae13-b274-4965-ac0c-8ac075c32567-config\") pod \"788aae13-b274-4965-ac0c-8ac075c32567\" (UID: \"788aae13-b274-4965-ac0c-8ac075c32567\") " Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.274364 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/788aae13-b274-4965-ac0c-8ac075c32567-dns-svc\") pod \"788aae13-b274-4965-ac0c-8ac075c32567\" (UID: \"788aae13-b274-4965-ac0c-8ac075c32567\") " Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.274401 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9d7lt\" (UniqueName: \"kubernetes.io/projected/788aae13-b274-4965-ac0c-8ac075c32567-kube-api-access-9d7lt\") pod \"788aae13-b274-4965-ac0c-8ac075c32567\" (UID: \"788aae13-b274-4965-ac0c-8ac075c32567\") " Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.274490 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/788aae13-b274-4965-ac0c-8ac075c32567-ovsdbserver-sb\") pod \"788aae13-b274-4965-ac0c-8ac075c32567\" (UID: \"788aae13-b274-4965-ac0c-8ac075c32567\") " Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.295990 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/788aae13-b274-4965-ac0c-8ac075c32567-kube-api-access-9d7lt" (OuterVolumeSpecName: "kube-api-access-9d7lt") pod "788aae13-b274-4965-ac0c-8ac075c32567" (UID: "788aae13-b274-4965-ac0c-8ac075c32567"). InnerVolumeSpecName "kube-api-access-9d7lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.330953 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-9cp2h"] Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.376207 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9d7lt\" (UniqueName: \"kubernetes.io/projected/788aae13-b274-4965-ac0c-8ac075c32567-kube-api-access-9d7lt\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.465126 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/788aae13-b274-4965-ac0c-8ac075c32567-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "788aae13-b274-4965-ac0c-8ac075c32567" (UID: "788aae13-b274-4965-ac0c-8ac075c32567"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.477626 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/788aae13-b274-4965-ac0c-8ac075c32567-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.494371 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/788aae13-b274-4965-ac0c-8ac075c32567-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "788aae13-b274-4965-ac0c-8ac075c32567" (UID: "788aae13-b274-4965-ac0c-8ac075c32567"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.496196 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-43b3-account-create-update-6gqq5"] Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.557660 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/788aae13-b274-4965-ac0c-8ac075c32567-config" (OuterVolumeSpecName: "config") pod "788aae13-b274-4965-ac0c-8ac075c32567" (UID: "788aae13-b274-4965-ac0c-8ac075c32567"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.579184 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/788aae13-b274-4965-ac0c-8ac075c32567-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.579214 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/788aae13-b274-4965-ac0c-8ac075c32567-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.596282 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-k9zsz"] Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.602919 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/788aae13-b274-4965-ac0c-8ac075c32567-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "788aae13-b274-4965-ac0c-8ac075c32567" (UID: "788aae13-b274-4965-ac0c-8ac075c32567"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:29 crc kubenswrapper[4810]: W0219 15:28:29.613636 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod082fc735_2850_452d_841a_0af9ed7ed171.slice/crio-866dbef12abd89bdc5dcdad8828a3677ff57d6665594483a4a2f0d6d9a3ec62c WatchSource:0}: Error finding container 866dbef12abd89bdc5dcdad8828a3677ff57d6665594483a4a2f0d6d9a3ec62c: Status 404 returned error can't find the container with id 866dbef12abd89bdc5dcdad8828a3677ff57d6665594483a4a2f0d6d9a3ec62c Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.680253 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/788aae13-b274-4965-ac0c-8ac075c32567-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.817703 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-sd4lr" event={"ID":"63eeb47c-9c4a-4e36-be24-61c126517600","Type":"ContainerStarted","Data":"490bfe3163d0e855ea419f13cf8a5bebf9306234f76375e3967a08a8c33723b7"} Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.822503 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ptwwk" event={"ID":"c2c7665c-330a-45b8-b461-bd08b069b747","Type":"ContainerStarted","Data":"f1368987940841c90c15fd62f94998dda89194fc11c576179de536979c6adc82"} Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.822541 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ptwwk" event={"ID":"c2c7665c-330a-45b8-b461-bd08b069b747","Type":"ContainerStarted","Data":"fbfaae03009c8948f491fee5a2a00dca9fa1a0a5f93189fe34e02d8b5cd0dc77"} Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.827112 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-43b3-account-create-update-6gqq5" event={"ID":"5770188c-7480-4529-8450-3d1a44cf50d6","Type":"ContainerStarted","Data":"a57eff1ff8b65e63b3a00f379e6a439481a3517ce81fa871e65130447c5a498b"} Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.829688 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dc04-account-create-update-dmf9z" event={"ID":"05b15da5-701a-492a-b986-99b767d2876c","Type":"ContainerStarted","Data":"e7fde16e8762a1bda91d4aa8dbb2846cb1cc1e9049ca68a05b5713d4c512a4b8"} Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.829720 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dc04-account-create-update-dmf9z" event={"ID":"05b15da5-701a-492a-b986-99b767d2876c","Type":"ContainerStarted","Data":"8e600340158d54f5ebb4c03dc507ad12f985664e178179ccd963dddb0bb69620"} Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.831703 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8ce6-account-create-update-ztxw9" event={"ID":"5ac347c6-4f1b-4b05-87a0-9332dec2ba9d","Type":"ContainerStarted","Data":"58aa0a5c9039d1f2e2e2fa3a8464a2460e5ec3924d291301dda66cb2085c37f2"} Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.831732 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8ce6-account-create-update-ztxw9" event={"ID":"5ac347c6-4f1b-4b05-87a0-9332dec2ba9d","Type":"ContainerStarted","Data":"3b7cd5c39591003c8d8b9a8e7e5c5a0c7eb4fe26dcbaf6fdb194cc285fd78923"} Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.835193 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-sync-k9zsz" event={"ID":"082fc735-2850-452d-841a-0af9ed7ed171","Type":"ContainerStarted","Data":"866dbef12abd89bdc5dcdad8828a3677ff57d6665594483a4a2f0d6d9a3ec62c"} Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.838165 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-ptwwk" podStartSLOduration=2.83814776 podStartE2EDuration="2.83814776s" podCreationTimestamp="2026-02-19 15:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:28:29.836408678 +0000 UTC m=+1139.318438802" watchObservedRunningTime="2026-02-19 15:28:29.83814776 +0000 UTC m=+1139.320177884" Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.840958 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" event={"ID":"788aae13-b274-4965-ac0c-8ac075c32567","Type":"ContainerDied","Data":"7cd4fcbb128b34896f354e231a1de0a145b80016e5c0592bfefbc4361bfadbaf"} Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.841420 4810 scope.go:117] "RemoveContainer" containerID="e901006a072b768e5b2fffb51122f3ae9427a4d8bce325b789cb52d5dc2df384" Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.841490 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d95ff5b97-flxcw" Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.846049 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-9cp2h" event={"ID":"700fd144-e077-4468-80a4-f131fdb9d67e","Type":"ContainerStarted","Data":"03a5a65957944d9b807e9bb77848eeeb10aa95967ed1b4ac444e66a5d8f6c6f2"} Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.854045 4810 generic.go:334] "Generic (PLEG): container finished" podID="a6217aad-07e6-49b6-8e80-41e75cecaaf5" containerID="317f50bb910bab31d9c1242a97f9988671bee73e88d7e795833b2626793ec0c6" exitCode=0 Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.854098 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-9wtnf" event={"ID":"a6217aad-07e6-49b6-8e80-41e75cecaaf5","Type":"ContainerDied","Data":"317f50bb910bab31d9c1242a97f9988671bee73e88d7e795833b2626793ec0c6"} Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.854123 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-9wtnf" event={"ID":"a6217aad-07e6-49b6-8e80-41e75cecaaf5","Type":"ContainerStarted","Data":"6e01ddf79bfd9efeeb1dcc46b06ded16ef7b0034b875cc74ad588200086084a8"} Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.869393 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dc04-account-create-update-dmf9z" podStartSLOduration=1.869377955 podStartE2EDuration="1.869377955s" podCreationTimestamp="2026-02-19 15:28:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:28:29.865453119 +0000 UTC m=+1139.347483243" watchObservedRunningTime="2026-02-19 15:28:29.869377955 +0000 UTC m=+1139.351408079" Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.873624 4810 scope.go:117] "RemoveContainer" containerID="28ca2a964c46491b21497e9a884496f03f8a4889da795bda4b42a303c67c82ef" Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.880615 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-8ce6-account-create-update-ztxw9" podStartSLOduration=2.880595389 podStartE2EDuration="2.880595389s" podCreationTimestamp="2026-02-19 15:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:28:29.880105287 +0000 UTC m=+1139.362135411" watchObservedRunningTime="2026-02-19 15:28:29.880595389 +0000 UTC m=+1139.362625513" Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.915272 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d95ff5b97-flxcw"] Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.929378 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d95ff5b97-flxcw"] Feb 19 15:28:29 crc kubenswrapper[4810]: I0219 15:28:29.951707 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-9cp2h" podStartSLOduration=1.95169003 podStartE2EDuration="1.95169003s" podCreationTimestamp="2026-02-19 15:28:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:28:29.936414436 +0000 UTC m=+1139.418444560" watchObservedRunningTime="2026-02-19 15:28:29.95169003 +0000 UTC m=+1139.433720144" Feb 19 15:28:30 crc kubenswrapper[4810]: I0219 15:28:30.864463 4810 generic.go:334] "Generic (PLEG): container finished" podID="31093793-65b6-467c-8d5b-218e108fd330" containerID="f13c00b75444d82ae151313db252a559d67eb3a9e93fc91fd59fa886fe8ada73" exitCode=0 Feb 19 15:28:30 crc kubenswrapper[4810]: I0219 15:28:30.864565 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-kpf4t" event={"ID":"31093793-65b6-467c-8d5b-218e108fd330","Type":"ContainerDied","Data":"f13c00b75444d82ae151313db252a559d67eb3a9e93fc91fd59fa886fe8ada73"} Feb 19 15:28:30 crc kubenswrapper[4810]: I0219 15:28:30.866673 4810 generic.go:334] "Generic (PLEG): container finished" podID="05b15da5-701a-492a-b986-99b767d2876c" containerID="e7fde16e8762a1bda91d4aa8dbb2846cb1cc1e9049ca68a05b5713d4c512a4b8" exitCode=0 Feb 19 15:28:30 crc kubenswrapper[4810]: I0219 15:28:30.866727 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dc04-account-create-update-dmf9z" event={"ID":"05b15da5-701a-492a-b986-99b767d2876c","Type":"ContainerDied","Data":"e7fde16e8762a1bda91d4aa8dbb2846cb1cc1e9049ca68a05b5713d4c512a4b8"} Feb 19 15:28:30 crc kubenswrapper[4810]: I0219 15:28:30.869311 4810 generic.go:334] "Generic (PLEG): container finished" podID="5ac347c6-4f1b-4b05-87a0-9332dec2ba9d" containerID="58aa0a5c9039d1f2e2e2fa3a8464a2460e5ec3924d291301dda66cb2085c37f2" exitCode=0 Feb 19 15:28:30 crc kubenswrapper[4810]: I0219 15:28:30.869390 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8ce6-account-create-update-ztxw9" event={"ID":"5ac347c6-4f1b-4b05-87a0-9332dec2ba9d","Type":"ContainerDied","Data":"58aa0a5c9039d1f2e2e2fa3a8464a2460e5ec3924d291301dda66cb2085c37f2"} Feb 19 15:28:30 crc kubenswrapper[4810]: I0219 15:28:30.871489 4810 generic.go:334] "Generic (PLEG): container finished" podID="700fd144-e077-4468-80a4-f131fdb9d67e" containerID="9f4b973758b7ab4df4daae42709ff686161b77974aebe961605bcdd7b7ee6895" exitCode=0 Feb 19 15:28:30 crc kubenswrapper[4810]: I0219 15:28:30.871529 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-9cp2h" 
event={"ID":"700fd144-e077-4468-80a4-f131fdb9d67e","Type":"ContainerDied","Data":"9f4b973758b7ab4df4daae42709ff686161b77974aebe961605bcdd7b7ee6895"} Feb 19 15:28:30 crc kubenswrapper[4810]: I0219 15:28:30.874252 4810 generic.go:334] "Generic (PLEG): container finished" podID="5770188c-7480-4529-8450-3d1a44cf50d6" containerID="235b187eb3bd473181cf1d8a9d02071a2d445e3841afa055bb60de833ffbcec1" exitCode=0 Feb 19 15:28:30 crc kubenswrapper[4810]: I0219 15:28:30.874292 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-43b3-account-create-update-6gqq5" event={"ID":"5770188c-7480-4529-8450-3d1a44cf50d6","Type":"ContainerDied","Data":"235b187eb3bd473181cf1d8a9d02071a2d445e3841afa055bb60de833ffbcec1"} Feb 19 15:28:30 crc kubenswrapper[4810]: I0219 15:28:30.880637 4810 generic.go:334] "Generic (PLEG): container finished" podID="c2c7665c-330a-45b8-b461-bd08b069b747" containerID="f1368987940841c90c15fd62f94998dda89194fc11c576179de536979c6adc82" exitCode=0 Feb 19 15:28:30 crc kubenswrapper[4810]: I0219 15:28:30.881136 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ptwwk" event={"ID":"c2c7665c-330a-45b8-b461-bd08b069b747","Type":"ContainerDied","Data":"f1368987940841c90c15fd62f94998dda89194fc11c576179de536979c6adc82"} Feb 19 15:28:31 crc kubenswrapper[4810]: I0219 15:28:31.229765 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-9wtnf" Feb 19 15:28:31 crc kubenswrapper[4810]: I0219 15:28:31.307628 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6217aad-07e6-49b6-8e80-41e75cecaaf5-operator-scripts\") pod \"a6217aad-07e6-49b6-8e80-41e75cecaaf5\" (UID: \"a6217aad-07e6-49b6-8e80-41e75cecaaf5\") " Feb 19 15:28:31 crc kubenswrapper[4810]: I0219 15:28:31.307820 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxq5k\" (UniqueName: \"kubernetes.io/projected/a6217aad-07e6-49b6-8e80-41e75cecaaf5-kube-api-access-lxq5k\") pod \"a6217aad-07e6-49b6-8e80-41e75cecaaf5\" (UID: \"a6217aad-07e6-49b6-8e80-41e75cecaaf5\") " Feb 19 15:28:31 crc kubenswrapper[4810]: I0219 15:28:31.308426 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6217aad-07e6-49b6-8e80-41e75cecaaf5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a6217aad-07e6-49b6-8e80-41e75cecaaf5" (UID: "a6217aad-07e6-49b6-8e80-41e75cecaaf5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:31 crc kubenswrapper[4810]: I0219 15:28:31.314313 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6217aad-07e6-49b6-8e80-41e75cecaaf5-kube-api-access-lxq5k" (OuterVolumeSpecName: "kube-api-access-lxq5k") pod "a6217aad-07e6-49b6-8e80-41e75cecaaf5" (UID: "a6217aad-07e6-49b6-8e80-41e75cecaaf5"). InnerVolumeSpecName "kube-api-access-lxq5k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:28:31 crc kubenswrapper[4810]: I0219 15:28:31.409693 4810 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6217aad-07e6-49b6-8e80-41e75cecaaf5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:31 crc kubenswrapper[4810]: I0219 15:28:31.409952 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxq5k\" (UniqueName: \"kubernetes.io/projected/a6217aad-07e6-49b6-8e80-41e75cecaaf5-kube-api-access-lxq5k\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:31 crc kubenswrapper[4810]: I0219 15:28:31.459468 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="788aae13-b274-4965-ac0c-8ac075c32567" path="/var/lib/kubelet/pods/788aae13-b274-4965-ac0c-8ac075c32567/volumes" Feb 19 15:28:31 crc kubenswrapper[4810]: I0219 15:28:31.895622 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-9wtnf" event={"ID":"a6217aad-07e6-49b6-8e80-41e75cecaaf5","Type":"ContainerDied","Data":"6e01ddf79bfd9efeeb1dcc46b06ded16ef7b0034b875cc74ad588200086084a8"} Feb 19 15:28:31 crc kubenswrapper[4810]: I0219 15:28:31.895663 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e01ddf79bfd9efeeb1dcc46b06ded16ef7b0034b875cc74ad588200086084a8" Feb 19 15:28:31 crc kubenswrapper[4810]: I0219 15:28:31.895675 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-9wtnf" Feb 19 15:28:31 crc kubenswrapper[4810]: I0219 15:28:31.899577 4810 generic.go:334] "Generic (PLEG): container finished" podID="5c213a3a-78fd-4b42-bc1c-e09837eae684" containerID="f004bfd72f8d26f3aaf22a2c6561639f20d7d6d9b961893f534ded590f3c40ec" exitCode=0 Feb 19 15:28:31 crc kubenswrapper[4810]: I0219 15:28:31.899717 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5c213a3a-78fd-4b42-bc1c-e09837eae684","Type":"ContainerDied","Data":"f004bfd72f8d26f3aaf22a2c6561639f20d7d6d9b961893f534ded590f3c40ec"} Feb 19 15:28:34 crc kubenswrapper[4810]: I0219 15:28:34.177619 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-9cp2h" Feb 19 15:28:34 crc kubenswrapper[4810]: I0219 15:28:34.190852 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dc04-account-create-update-dmf9z" Feb 19 15:28:34 crc kubenswrapper[4810]: I0219 15:28:34.259391 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/700fd144-e077-4468-80a4-f131fdb9d67e-operator-scripts\") pod \"700fd144-e077-4468-80a4-f131fdb9d67e\" (UID: \"700fd144-e077-4468-80a4-f131fdb9d67e\") " Feb 19 15:28:34 crc kubenswrapper[4810]: I0219 15:28:34.259434 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05b15da5-701a-492a-b986-99b767d2876c-operator-scripts\") pod \"05b15da5-701a-492a-b986-99b767d2876c\" (UID: \"05b15da5-701a-492a-b986-99b767d2876c\") " Feb 19 15:28:34 crc kubenswrapper[4810]: I0219 15:28:34.259631 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzqqn\" (UniqueName: \"kubernetes.io/projected/05b15da5-701a-492a-b986-99b767d2876c-kube-api-access-vzqqn\") pod \"05b15da5-701a-492a-b986-99b767d2876c\" (UID: \"05b15da5-701a-492a-b986-99b767d2876c\") " Feb 19 15:28:34 crc kubenswrapper[4810]: I0219 15:28:34.259696 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psgfb\" (UniqueName: \"kubernetes.io/projected/700fd144-e077-4468-80a4-f131fdb9d67e-kube-api-access-psgfb\") pod \"700fd144-e077-4468-80a4-f131fdb9d67e\" (UID: \"700fd144-e077-4468-80a4-f131fdb9d67e\") " Feb 19 15:28:34 crc kubenswrapper[4810]: I0219 15:28:34.260367 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/700fd144-e077-4468-80a4-f131fdb9d67e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "700fd144-e077-4468-80a4-f131fdb9d67e" (UID: "700fd144-e077-4468-80a4-f131fdb9d67e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:34 crc kubenswrapper[4810]: I0219 15:28:34.261209 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05b15da5-701a-492a-b986-99b767d2876c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "05b15da5-701a-492a-b986-99b767d2876c" (UID: "05b15da5-701a-492a-b986-99b767d2876c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:34 crc kubenswrapper[4810]: I0219 15:28:34.265466 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05b15da5-701a-492a-b986-99b767d2876c-kube-api-access-vzqqn" (OuterVolumeSpecName: "kube-api-access-vzqqn") pod "05b15da5-701a-492a-b986-99b767d2876c" (UID: "05b15da5-701a-492a-b986-99b767d2876c"). InnerVolumeSpecName "kube-api-access-vzqqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:28:34 crc kubenswrapper[4810]: I0219 15:28:34.266514 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/700fd144-e077-4468-80a4-f131fdb9d67e-kube-api-access-psgfb" (OuterVolumeSpecName: "kube-api-access-psgfb") pod "700fd144-e077-4468-80a4-f131fdb9d67e" (UID: "700fd144-e077-4468-80a4-f131fdb9d67e"). InnerVolumeSpecName "kube-api-access-psgfb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:28:34 crc kubenswrapper[4810]: I0219 15:28:34.361543 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzqqn\" (UniqueName: \"kubernetes.io/projected/05b15da5-701a-492a-b986-99b767d2876c-kube-api-access-vzqqn\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:34 crc kubenswrapper[4810]: I0219 15:28:34.361576 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psgfb\" (UniqueName: \"kubernetes.io/projected/700fd144-e077-4468-80a4-f131fdb9d67e-kube-api-access-psgfb\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:34 crc kubenswrapper[4810]: I0219 15:28:34.361586 4810 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/700fd144-e077-4468-80a4-f131fdb9d67e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:34 crc kubenswrapper[4810]: I0219 15:28:34.361595 4810 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05b15da5-701a-492a-b986-99b767d2876c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:34 crc kubenswrapper[4810]: I0219 15:28:34.939029 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dc04-account-create-update-dmf9z" event={"ID":"05b15da5-701a-492a-b986-99b767d2876c","Type":"ContainerDied","Data":"8e600340158d54f5ebb4c03dc507ad12f985664e178179ccd963dddb0bb69620"} Feb 19 15:28:34 crc kubenswrapper[4810]: I0219 15:28:34.939119 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e600340158d54f5ebb4c03dc507ad12f985664e178179ccd963dddb0bb69620" Feb 19 15:28:34 crc kubenswrapper[4810]: I0219 15:28:34.939597 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dc04-account-create-update-dmf9z" Feb 19 15:28:34 crc kubenswrapper[4810]: I0219 15:28:34.942295 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-9cp2h" event={"ID":"700fd144-e077-4468-80a4-f131fdb9d67e","Type":"ContainerDied","Data":"03a5a65957944d9b807e9bb77848eeeb10aa95967ed1b4ac444e66a5d8f6c6f2"} Feb 19 15:28:34 crc kubenswrapper[4810]: I0219 15:28:34.942376 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03a5a65957944d9b807e9bb77848eeeb10aa95967ed1b4ac444e66a5d8f6c6f2" Feb 19 15:28:34 crc kubenswrapper[4810]: I0219 15:28:34.942495 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-9cp2h" Feb 19 15:28:37 crc kubenswrapper[4810]: I0219 15:28:37.563048 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-8ce6-account-create-update-ztxw9" Feb 19 15:28:37 crc kubenswrapper[4810]: I0219 15:28:37.725012 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ac347c6-4f1b-4b05-87a0-9332dec2ba9d-operator-scripts\") pod \"5ac347c6-4f1b-4b05-87a0-9332dec2ba9d\" (UID: \"5ac347c6-4f1b-4b05-87a0-9332dec2ba9d\") " Feb 19 15:28:37 crc kubenswrapper[4810]: I0219 15:28:37.725113 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqz6c\" (UniqueName: \"kubernetes.io/projected/5ac347c6-4f1b-4b05-87a0-9332dec2ba9d-kube-api-access-tqz6c\") pod \"5ac347c6-4f1b-4b05-87a0-9332dec2ba9d\" (UID: \"5ac347c6-4f1b-4b05-87a0-9332dec2ba9d\") " Feb 19 15:28:37 crc kubenswrapper[4810]: I0219 15:28:37.725590 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ac347c6-4f1b-4b05-87a0-9332dec2ba9d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5ac347c6-4f1b-4b05-87a0-9332dec2ba9d" (UID: "5ac347c6-4f1b-4b05-87a0-9332dec2ba9d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:37 crc kubenswrapper[4810]: I0219 15:28:37.730564 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ac347c6-4f1b-4b05-87a0-9332dec2ba9d-kube-api-access-tqz6c" (OuterVolumeSpecName: "kube-api-access-tqz6c") pod "5ac347c6-4f1b-4b05-87a0-9332dec2ba9d" (UID: "5ac347c6-4f1b-4b05-87a0-9332dec2ba9d"). InnerVolumeSpecName "kube-api-access-tqz6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:28:37 crc kubenswrapper[4810]: I0219 15:28:37.826832 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqz6c\" (UniqueName: \"kubernetes.io/projected/5ac347c6-4f1b-4b05-87a0-9332dec2ba9d-kube-api-access-tqz6c\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:37 crc kubenswrapper[4810]: I0219 15:28:37.826867 4810 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ac347c6-4f1b-4b05-87a0-9332dec2ba9d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:37 crc kubenswrapper[4810]: I0219 15:28:37.970350 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8ce6-account-create-update-ztxw9" event={"ID":"5ac347c6-4f1b-4b05-87a0-9332dec2ba9d","Type":"ContainerDied","Data":"3b7cd5c39591003c8d8b9a8e7e5c5a0c7eb4fe26dcbaf6fdb194cc285fd78923"} Feb 19 15:28:37 crc kubenswrapper[4810]: I0219 15:28:37.970400 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b7cd5c39591003c8d8b9a8e7e5c5a0c7eb4fe26dcbaf6fdb194cc285fd78923" Feb 19 15:28:37 crc kubenswrapper[4810]: I0219 15:28:37.970447 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8ce6-account-create-update-ztxw9" Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.071487 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-kpf4t" Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.086722 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-43b3-account-create-update-6gqq5" Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.106333 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-ptwwk" Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.234835 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/31093793-65b6-467c-8d5b-218e108fd330-db-sync-config-data\") pod \"31093793-65b6-467c-8d5b-218e108fd330\" (UID: \"31093793-65b6-467c-8d5b-218e108fd330\") " Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.234892 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25jng\" (UniqueName: \"kubernetes.io/projected/5770188c-7480-4529-8450-3d1a44cf50d6-kube-api-access-25jng\") pod \"5770188c-7480-4529-8450-3d1a44cf50d6\" (UID: \"5770188c-7480-4529-8450-3d1a44cf50d6\") " Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.234924 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5770188c-7480-4529-8450-3d1a44cf50d6-operator-scripts\") pod \"5770188c-7480-4529-8450-3d1a44cf50d6\" (UID: \"5770188c-7480-4529-8450-3d1a44cf50d6\") " Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.234964 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5tqp\" (UniqueName: \"kubernetes.io/projected/31093793-65b6-467c-8d5b-218e108fd330-kube-api-access-h5tqp\") pod \"31093793-65b6-467c-8d5b-218e108fd330\" (UID: \"31093793-65b6-467c-8d5b-218e108fd330\") " Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.235004 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31093793-65b6-467c-8d5b-218e108fd330-config-data\") pod \"31093793-65b6-467c-8d5b-218e108fd330\" (UID: \"31093793-65b6-467c-8d5b-218e108fd330\") " Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.235023 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31093793-65b6-467c-8d5b-218e108fd330-combined-ca-bundle\") pod \"31093793-65b6-467c-8d5b-218e108fd330\" (UID: \"31093793-65b6-467c-8d5b-218e108fd330\") " Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.235089 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z7f9\" (UniqueName: \"kubernetes.io/projected/c2c7665c-330a-45b8-b461-bd08b069b747-kube-api-access-8z7f9\") pod \"c2c7665c-330a-45b8-b461-bd08b069b747\" (UID: \"c2c7665c-330a-45b8-b461-bd08b069b747\") " Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.235120 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2c7665c-330a-45b8-b461-bd08b069b747-operator-scripts\") pod \"c2c7665c-330a-45b8-b461-bd08b069b747\" (UID: \"c2c7665c-330a-45b8-b461-bd08b069b747\") " Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.236614 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2c7665c-330a-45b8-b461-bd08b069b747-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c2c7665c-330a-45b8-b461-bd08b069b747" (UID: "c2c7665c-330a-45b8-b461-bd08b069b747"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.240066 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31093793-65b6-467c-8d5b-218e108fd330-kube-api-access-h5tqp" (OuterVolumeSpecName: "kube-api-access-h5tqp") pod "31093793-65b6-467c-8d5b-218e108fd330" (UID: "31093793-65b6-467c-8d5b-218e108fd330"). InnerVolumeSpecName "kube-api-access-h5tqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.240500 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5770188c-7480-4529-8450-3d1a44cf50d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5770188c-7480-4529-8450-3d1a44cf50d6" (UID: "5770188c-7480-4529-8450-3d1a44cf50d6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.243861 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31093793-65b6-467c-8d5b-218e108fd330-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "31093793-65b6-467c-8d5b-218e108fd330" (UID: "31093793-65b6-467c-8d5b-218e108fd330"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.245923 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2c7665c-330a-45b8-b461-bd08b069b747-kube-api-access-8z7f9" (OuterVolumeSpecName: "kube-api-access-8z7f9") pod "c2c7665c-330a-45b8-b461-bd08b069b747" (UID: "c2c7665c-330a-45b8-b461-bd08b069b747"). InnerVolumeSpecName "kube-api-access-8z7f9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.252924 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5770188c-7480-4529-8450-3d1a44cf50d6-kube-api-access-25jng" (OuterVolumeSpecName: "kube-api-access-25jng") pod "5770188c-7480-4529-8450-3d1a44cf50d6" (UID: "5770188c-7480-4529-8450-3d1a44cf50d6"). InnerVolumeSpecName "kube-api-access-25jng". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.293801 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31093793-65b6-467c-8d5b-218e108fd330-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31093793-65b6-467c-8d5b-218e108fd330" (UID: "31093793-65b6-467c-8d5b-218e108fd330"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.330500 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31093793-65b6-467c-8d5b-218e108fd330-config-data" (OuterVolumeSpecName: "config-data") pod "31093793-65b6-467c-8d5b-218e108fd330" (UID: "31093793-65b6-467c-8d5b-218e108fd330"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.337671 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z7f9\" (UniqueName: \"kubernetes.io/projected/c2c7665c-330a-45b8-b461-bd08b069b747-kube-api-access-8z7f9\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.338520 4810 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2c7665c-330a-45b8-b461-bd08b069b747-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.338546 4810 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/31093793-65b6-467c-8d5b-218e108fd330-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.338560 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25jng\" (UniqueName: \"kubernetes.io/projected/5770188c-7480-4529-8450-3d1a44cf50d6-kube-api-access-25jng\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.338574 4810 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5770188c-7480-4529-8450-3d1a44cf50d6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.338586 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5tqp\" (UniqueName: \"kubernetes.io/projected/31093793-65b6-467c-8d5b-218e108fd330-kube-api-access-h5tqp\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.338598 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31093793-65b6-467c-8d5b-218e108fd330-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.338609 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31093793-65b6-467c-8d5b-218e108fd330-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.979386 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-sd4lr" event={"ID":"63eeb47c-9c4a-4e36-be24-61c126517600","Type":"ContainerStarted","Data":"7e83b0c5177b1183e58ad0498417fc1c3b6e142723e7482bda0235e4615b43f5"} Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.980911 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-43b3-account-create-update-6gqq5" event={"ID":"5770188c-7480-4529-8450-3d1a44cf50d6","Type":"ContainerDied","Data":"a57eff1ff8b65e63b3a00f379e6a439481a3517ce81fa871e65130447c5a498b"} Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.981311 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a57eff1ff8b65e63b3a00f379e6a439481a3517ce81fa871e65130447c5a498b" Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.980955 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-43b3-account-create-update-6gqq5" Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.984065 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ptwwk" event={"ID":"c2c7665c-330a-45b8-b461-bd08b069b747","Type":"ContainerDied","Data":"fbfaae03009c8948f491fee5a2a00dca9fa1a0a5f93189fe34e02d8b5cd0dc77"} Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.984186 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbfaae03009c8948f491fee5a2a00dca9fa1a0a5f93189fe34e02d8b5cd0dc77" Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.984343 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-ptwwk" Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.988614 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-kpf4t" event={"ID":"31093793-65b6-467c-8d5b-218e108fd330","Type":"ContainerDied","Data":"bf6ea40573eeae6956f4018d71eb8ba4737a0b7a57f0e9d98686d8f2e0c053a9"} Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.988648 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf6ea40573eeae6956f4018d71eb8ba4737a0b7a57f0e9d98686d8f2e0c053a9" Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.988732 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-kpf4t" Feb 19 15:28:38 crc kubenswrapper[4810]: I0219 15:28:38.999545 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-k9zsz" event={"ID":"082fc735-2850-452d-841a-0af9ed7ed171","Type":"ContainerStarted","Data":"22a00ed65eebcb7030f20de212b927e0556118314908589176cea5b5329504cb"} Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.002985 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5c213a3a-78fd-4b42-bc1c-e09837eae684","Type":"ContainerStarted","Data":"09171322701d01cc2b65792ae9294bf2facdbb2647bec3473b5a2f2c7595769d"} Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.006063 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-sd4lr" podStartSLOduration=2.975481553 podStartE2EDuration="12.006040143s" podCreationTimestamp="2026-02-19 15:28:27 +0000 UTC" firstStartedPulling="2026-02-19 15:28:29.05535626 +0000 UTC m=+1138.537386384" lastFinishedPulling="2026-02-19 15:28:38.08591484 +0000 UTC m=+1147.567944974" observedRunningTime="2026-02-19 15:28:38.998076338 +0000 UTC m=+1148.480106462" watchObservedRunningTime="2026-02-19 15:28:39.006040143 +0000 UTC m=+1148.488070287" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.020549 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-k9zsz" podStartSLOduration=3.566756766 podStartE2EDuration="12.020533308s" podCreationTimestamp="2026-02-19 15:28:27 +0000 UTC" firstStartedPulling="2026-02-19 15:28:29.615662345 +0000 UTC m=+1139.097692469" lastFinishedPulling="2026-02-19 15:28:38.069438887 +0000 UTC m=+1147.551469011" observedRunningTime="2026-02-19 15:28:39.017533095 +0000 UTC m=+1148.499563219" watchObservedRunningTime="2026-02-19 15:28:39.020533308 +0000 UTC m=+1148.502563432" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.518240 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589c97547-9nnhp"] Feb 19 15:28:39 crc 
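The two pod_startup_latency_tracker records above are internally consistent under one reading: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that end-to-end time minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A minimal sketch reconstructing the keystone-db-sync-k9zsz figures, which reproduce exactly under that assumption (kubelet uses the monotonic m=+... readings internally, so wall-clock reconstructions can drift by nanoseconds, as the watcher-db-sync figures do):

```go
// Reconstruct podStartE2EDuration and podStartSLOduration from the
// timestamps logged for openstack/keystone-db-sync-k9zsz above.
package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2026-02-19 15:28:27 +0000 UTC")             // podCreationTimestamp
	firstPull := mustParse("2026-02-19 15:28:29.615662345 +0000 UTC") // firstStartedPulling
	lastPull := mustParse("2026-02-19 15:28:38.069438887 +0000 UTC")  // lastFinishedPulling
	running := mustParse("2026-02-19 15:28:39.020533308 +0000 UTC")   // watchObservedRunningTime

	e2e := running.Sub(created)          // podStartE2EDuration = 12.020533308s
	slo := e2e - lastPull.Sub(firstPull) // podStartSLOduration = 3.566756766s
	fmt.Println(e2e, slo)
}
```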
kubenswrapper[4810]: E0219 15:28:39.518630 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="700fd144-e077-4468-80a4-f131fdb9d67e" containerName="mariadb-database-create" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.518651 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="700fd144-e077-4468-80a4-f131fdb9d67e" containerName="mariadb-database-create" Feb 19 15:28:39 crc kubenswrapper[4810]: E0219 15:28:39.518669 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="788aae13-b274-4965-ac0c-8ac075c32567" containerName="init" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.518675 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="788aae13-b274-4965-ac0c-8ac075c32567" containerName="init" Feb 19 15:28:39 crc kubenswrapper[4810]: E0219 15:28:39.518681 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6217aad-07e6-49b6-8e80-41e75cecaaf5" containerName="mariadb-database-create" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.518687 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6217aad-07e6-49b6-8e80-41e75cecaaf5" containerName="mariadb-database-create" Feb 19 15:28:39 crc kubenswrapper[4810]: E0219 15:28:39.518697 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05b15da5-701a-492a-b986-99b767d2876c" containerName="mariadb-account-create-update" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.518702 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="05b15da5-701a-492a-b986-99b767d2876c" containerName="mariadb-account-create-update" Feb 19 15:28:39 crc kubenswrapper[4810]: E0219 15:28:39.518715 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="788aae13-b274-4965-ac0c-8ac075c32567" containerName="dnsmasq-dns" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.518720 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="788aae13-b274-4965-ac0c-8ac075c32567" containerName="dnsmasq-dns" Feb 19 15:28:39 crc kubenswrapper[4810]: E0219 15:28:39.518734 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5770188c-7480-4529-8450-3d1a44cf50d6" containerName="mariadb-account-create-update" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.518740 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5770188c-7480-4529-8450-3d1a44cf50d6" containerName="mariadb-account-create-update" Feb 19 15:28:39 crc kubenswrapper[4810]: E0219 15:28:39.518746 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2c7665c-330a-45b8-b461-bd08b069b747" containerName="mariadb-database-create" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.518752 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2c7665c-330a-45b8-b461-bd08b069b747" containerName="mariadb-database-create" Feb 19 15:28:39 crc kubenswrapper[4810]: E0219 15:28:39.518763 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac347c6-4f1b-4b05-87a0-9332dec2ba9d" containerName="mariadb-account-create-update" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.518769 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ac347c6-4f1b-4b05-87a0-9332dec2ba9d" containerName="mariadb-account-create-update" Feb 19 15:28:39 crc kubenswrapper[4810]: E0219 15:28:39.518781 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31093793-65b6-467c-8d5b-218e108fd330" containerName="glance-db-sync" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.518786 4810 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="31093793-65b6-467c-8d5b-218e108fd330" containerName="glance-db-sync" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.518930 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="5770188c-7480-4529-8450-3d1a44cf50d6" containerName="mariadb-account-create-update" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.518956 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2c7665c-330a-45b8-b461-bd08b069b747" containerName="mariadb-database-create" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.518968 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6217aad-07e6-49b6-8e80-41e75cecaaf5" containerName="mariadb-database-create" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.518985 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="31093793-65b6-467c-8d5b-218e108fd330" containerName="glance-db-sync" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.518998 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="700fd144-e077-4468-80a4-f131fdb9d67e" containerName="mariadb-database-create" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.519016 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ac347c6-4f1b-4b05-87a0-9332dec2ba9d" containerName="mariadb-account-create-update" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.519031 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="788aae13-b274-4965-ac0c-8ac075c32567" containerName="dnsmasq-dns" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.519040 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="05b15da5-701a-492a-b986-99b767d2876c" containerName="mariadb-account-create-update" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.519877 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589c97547-9nnhp" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.554549 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589c97547-9nnhp"] Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.658544 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-dns-swift-storage-0\") pod \"dnsmasq-dns-589c97547-9nnhp\" (UID: \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\") " pod="openstack/dnsmasq-dns-589c97547-9nnhp" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.658637 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-ovsdbserver-nb\") pod \"dnsmasq-dns-589c97547-9nnhp\" (UID: \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\") " pod="openstack/dnsmasq-dns-589c97547-9nnhp" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.658668 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-ovsdbserver-sb\") pod \"dnsmasq-dns-589c97547-9nnhp\" (UID: \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\") " pod="openstack/dnsmasq-dns-589c97547-9nnhp" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.658684 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-config\") pod \"dnsmasq-dns-589c97547-9nnhp\" (UID: \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\") " pod="openstack/dnsmasq-dns-589c97547-9nnhp" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.658714 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-dns-svc\") pod \"dnsmasq-dns-589c97547-9nnhp\" (UID: \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\") " pod="openstack/dnsmasq-dns-589c97547-9nnhp" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.658763 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g84kv\" (UniqueName: \"kubernetes.io/projected/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-kube-api-access-g84kv\") pod \"dnsmasq-dns-589c97547-9nnhp\" (UID: \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\") " pod="openstack/dnsmasq-dns-589c97547-9nnhp" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.760440 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-ovsdbserver-nb\") pod \"dnsmasq-dns-589c97547-9nnhp\" (UID: \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\") " pod="openstack/dnsmasq-dns-589c97547-9nnhp" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.760486 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-ovsdbserver-sb\") pod \"dnsmasq-dns-589c97547-9nnhp\" (UID: \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\") " pod="openstack/dnsmasq-dns-589c97547-9nnhp" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.760507 4810 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-config\") pod \"dnsmasq-dns-589c97547-9nnhp\" (UID: \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\") " pod="openstack/dnsmasq-dns-589c97547-9nnhp" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.760541 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-dns-svc\") pod \"dnsmasq-dns-589c97547-9nnhp\" (UID: \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\") " pod="openstack/dnsmasq-dns-589c97547-9nnhp" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.760591 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g84kv\" (UniqueName: \"kubernetes.io/projected/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-kube-api-access-g84kv\") pod \"dnsmasq-dns-589c97547-9nnhp\" (UID: \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\") " pod="openstack/dnsmasq-dns-589c97547-9nnhp" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.760618 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-dns-swift-storage-0\") pod \"dnsmasq-dns-589c97547-9nnhp\" (UID: \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\") " pod="openstack/dnsmasq-dns-589c97547-9nnhp" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.761726 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-dns-swift-storage-0\") pod \"dnsmasq-dns-589c97547-9nnhp\" (UID: \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\") " pod="openstack/dnsmasq-dns-589c97547-9nnhp" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.762209 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-ovsdbserver-nb\") pod \"dnsmasq-dns-589c97547-9nnhp\" (UID: \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\") " pod="openstack/dnsmasq-dns-589c97547-9nnhp" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.762708 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-ovsdbserver-sb\") pod \"dnsmasq-dns-589c97547-9nnhp\" (UID: \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\") " pod="openstack/dnsmasq-dns-589c97547-9nnhp" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.763192 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-config\") pod \"dnsmasq-dns-589c97547-9nnhp\" (UID: \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\") " pod="openstack/dnsmasq-dns-589c97547-9nnhp" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.763709 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-dns-svc\") pod \"dnsmasq-dns-589c97547-9nnhp\" (UID: \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\") " pod="openstack/dnsmasq-dns-589c97547-9nnhp" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.783300 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g84kv\" (UniqueName: 
\"kubernetes.io/projected/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-kube-api-access-g84kv\") pod \"dnsmasq-dns-589c97547-9nnhp\" (UID: \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\") " pod="openstack/dnsmasq-dns-589c97547-9nnhp" Feb 19 15:28:39 crc kubenswrapper[4810]: I0219 15:28:39.837644 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589c97547-9nnhp" Feb 19 15:28:40 crc kubenswrapper[4810]: I0219 15:28:40.368036 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589c97547-9nnhp"] Feb 19 15:28:41 crc kubenswrapper[4810]: I0219 15:28:41.026382 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5c213a3a-78fd-4b42-bc1c-e09837eae684","Type":"ContainerStarted","Data":"1cf6b1e26b593f87dbae05831261b5448c34285d54ffc513dcc21ce3c137d57f"} Feb 19 15:28:41 crc kubenswrapper[4810]: I0219 15:28:41.028047 4810 generic.go:334] "Generic (PLEG): container finished" podID="7cc2bd68-d5b3-4416-8752-edf7ee85bf88" containerID="e8fb43b2b423ed4972c4fdcf61131fbebdb2452721ad8c709c2892ac5c467178" exitCode=0 Feb 19 15:28:41 crc kubenswrapper[4810]: I0219 15:28:41.028151 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589c97547-9nnhp" event={"ID":"7cc2bd68-d5b3-4416-8752-edf7ee85bf88","Type":"ContainerDied","Data":"e8fb43b2b423ed4972c4fdcf61131fbebdb2452721ad8c709c2892ac5c467178"} Feb 19 15:28:41 crc kubenswrapper[4810]: I0219 15:28:41.028368 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589c97547-9nnhp" event={"ID":"7cc2bd68-d5b3-4416-8752-edf7ee85bf88","Type":"ContainerStarted","Data":"6cdfc0e390a65c8dc0f4b9e7aa2f77b612821b90bf368ae80a8f0a6cf661f875"} Feb 19 15:28:42 crc kubenswrapper[4810]: I0219 15:28:42.039431 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5c213a3a-78fd-4b42-bc1c-e09837eae684","Type":"ContainerStarted","Data":"ab34c252f979f018755a4fd37fb43b24f2dc32b972b84fea55031898aade6962"} Feb 19 15:28:42 crc kubenswrapper[4810]: I0219 15:28:42.041541 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589c97547-9nnhp" event={"ID":"7cc2bd68-d5b3-4416-8752-edf7ee85bf88","Type":"ContainerStarted","Data":"b3c769533587d67c2cf2579dac0d7fe9ba6b6dac50efc5e3e91af785d9090a99"} Feb 19 15:28:42 crc kubenswrapper[4810]: I0219 15:28:42.041816 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-589c97547-9nnhp" Feb 19 15:28:42 crc kubenswrapper[4810]: I0219 15:28:42.065877 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:42 crc kubenswrapper[4810]: I0219 15:28:42.075768 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=21.075746813 podStartE2EDuration="21.075746813s" podCreationTimestamp="2026-02-19 15:28:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:28:42.065679237 +0000 UTC m=+1151.547709381" watchObservedRunningTime="2026-02-19 15:28:42.075746813 +0000 UTC m=+1151.557776947" Feb 19 15:28:42 crc kubenswrapper[4810]: I0219 15:28:42.092304 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-589c97547-9nnhp" podStartSLOduration=3.092284178 
podStartE2EDuration="3.092284178s" podCreationTimestamp="2026-02-19 15:28:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:28:42.083005131 +0000 UTC m=+1151.565035265" watchObservedRunningTime="2026-02-19 15:28:42.092284178 +0000 UTC m=+1151.574314312" Feb 19 15:28:45 crc kubenswrapper[4810]: I0219 15:28:45.083540 4810 generic.go:334] "Generic (PLEG): container finished" podID="63eeb47c-9c4a-4e36-be24-61c126517600" containerID="7e83b0c5177b1183e58ad0498417fc1c3b6e142723e7482bda0235e4615b43f5" exitCode=0 Feb 19 15:28:45 crc kubenswrapper[4810]: I0219 15:28:45.083716 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-sd4lr" event={"ID":"63eeb47c-9c4a-4e36-be24-61c126517600","Type":"ContainerDied","Data":"7e83b0c5177b1183e58ad0498417fc1c3b6e142723e7482bda0235e4615b43f5"} Feb 19 15:28:46 crc kubenswrapper[4810]: I0219 15:28:46.099937 4810 generic.go:334] "Generic (PLEG): container finished" podID="082fc735-2850-452d-841a-0af9ed7ed171" containerID="22a00ed65eebcb7030f20de212b927e0556118314908589176cea5b5329504cb" exitCode=0 Feb 19 15:28:46 crc kubenswrapper[4810]: I0219 15:28:46.100096 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-k9zsz" event={"ID":"082fc735-2850-452d-841a-0af9ed7ed171","Type":"ContainerDied","Data":"22a00ed65eebcb7030f20de212b927e0556118314908589176cea5b5329504cb"} Feb 19 15:28:46 crc kubenswrapper[4810]: I0219 15:28:46.540347 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-sd4lr" Feb 19 15:28:46 crc kubenswrapper[4810]: I0219 15:28:46.593729 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63eeb47c-9c4a-4e36-be24-61c126517600-combined-ca-bundle\") pod \"63eeb47c-9c4a-4e36-be24-61c126517600\" (UID: \"63eeb47c-9c4a-4e36-be24-61c126517600\") " Feb 19 15:28:46 crc kubenswrapper[4810]: I0219 15:28:46.593805 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63eeb47c-9c4a-4e36-be24-61c126517600-config-data\") pod \"63eeb47c-9c4a-4e36-be24-61c126517600\" (UID: \"63eeb47c-9c4a-4e36-be24-61c126517600\") " Feb 19 15:28:46 crc kubenswrapper[4810]: I0219 15:28:46.593900 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/63eeb47c-9c4a-4e36-be24-61c126517600-db-sync-config-data\") pod \"63eeb47c-9c4a-4e36-be24-61c126517600\" (UID: \"63eeb47c-9c4a-4e36-be24-61c126517600\") " Feb 19 15:28:46 crc kubenswrapper[4810]: I0219 15:28:46.593960 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnkbf\" (UniqueName: \"kubernetes.io/projected/63eeb47c-9c4a-4e36-be24-61c126517600-kube-api-access-mnkbf\") pod \"63eeb47c-9c4a-4e36-be24-61c126517600\" (UID: \"63eeb47c-9c4a-4e36-be24-61c126517600\") " Feb 19 15:28:46 crc kubenswrapper[4810]: I0219 15:28:46.599887 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63eeb47c-9c4a-4e36-be24-61c126517600-kube-api-access-mnkbf" (OuterVolumeSpecName: "kube-api-access-mnkbf") pod "63eeb47c-9c4a-4e36-be24-61c126517600" (UID: "63eeb47c-9c4a-4e36-be24-61c126517600"). InnerVolumeSpecName "kube-api-access-mnkbf". 
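The reconciler_common.go and operation_generator.go records that follow trace a recurring pattern: once a pod is gone from the desired state, each of its still-mounted volumes gets an UnmountVolume/TearDown, then a "Volume detached" confirmation. A schematic of that pattern, not kubelet's actual implementation:

```go
// Schematic desired-vs-actual volume reconcile loop mirroring the
// UnmountVolume -> TearDown -> "Volume detached" sequence logged below.
package main

import "fmt"

func reconcile(desired, actual map[string]bool, unmount func(string) error) {
	for vol := range actual {
		if desired[vol] {
			continue // still wanted; leave it mounted
		}
		// "operationExecutor.UnmountVolume started for volume ..."
		if err := unmount(vol); err != nil {
			fmt.Printf("UnmountVolume failed for %s: %v\n", vol, err)
			continue
		}
		delete(actual, vol)
		// "Volume detached for volume ... DevicePath \"\""
		fmt.Printf("Volume detached for volume %q\n", vol)
	}
}

func main() {
	actual := map[string]bool{"config-data": true, "combined-ca-bundle": true}
	reconcile(map[string]bool{}, actual, func(string) error { return nil })
}
```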
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:28:46 crc kubenswrapper[4810]: I0219 15:28:46.600698 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63eeb47c-9c4a-4e36-be24-61c126517600-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "63eeb47c-9c4a-4e36-be24-61c126517600" (UID: "63eeb47c-9c4a-4e36-be24-61c126517600"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:28:46 crc kubenswrapper[4810]: I0219 15:28:46.626874 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63eeb47c-9c4a-4e36-be24-61c126517600-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63eeb47c-9c4a-4e36-be24-61c126517600" (UID: "63eeb47c-9c4a-4e36-be24-61c126517600"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:28:46 crc kubenswrapper[4810]: I0219 15:28:46.658598 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63eeb47c-9c4a-4e36-be24-61c126517600-config-data" (OuterVolumeSpecName: "config-data") pod "63eeb47c-9c4a-4e36-be24-61c126517600" (UID: "63eeb47c-9c4a-4e36-be24-61c126517600"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:28:46 crc kubenswrapper[4810]: I0219 15:28:46.696472 4810 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/63eeb47c-9c4a-4e36-be24-61c126517600-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:46 crc kubenswrapper[4810]: I0219 15:28:46.696513 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnkbf\" (UniqueName: \"kubernetes.io/projected/63eeb47c-9c4a-4e36-be24-61c126517600-kube-api-access-mnkbf\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:46 crc kubenswrapper[4810]: I0219 15:28:46.696576 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63eeb47c-9c4a-4e36-be24-61c126517600-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:46 crc kubenswrapper[4810]: I0219 15:28:46.696590 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63eeb47c-9c4a-4e36-be24-61c126517600-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:47 crc kubenswrapper[4810]: I0219 15:28:47.115822 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-sd4lr" event={"ID":"63eeb47c-9c4a-4e36-be24-61c126517600","Type":"ContainerDied","Data":"490bfe3163d0e855ea419f13cf8a5bebf9306234f76375e3967a08a8c33723b7"} Feb 19 15:28:47 crc kubenswrapper[4810]: I0219 15:28:47.115911 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="490bfe3163d0e855ea419f13cf8a5bebf9306234f76375e3967a08a8c33723b7" Feb 19 15:28:47 crc kubenswrapper[4810]: I0219 15:28:47.115845 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-sd4lr" Feb 19 15:28:47 crc kubenswrapper[4810]: I0219 15:28:47.537354 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-k9zsz" Feb 19 15:28:47 crc kubenswrapper[4810]: I0219 15:28:47.612574 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/082fc735-2850-452d-841a-0af9ed7ed171-combined-ca-bundle\") pod \"082fc735-2850-452d-841a-0af9ed7ed171\" (UID: \"082fc735-2850-452d-841a-0af9ed7ed171\") " Feb 19 15:28:47 crc kubenswrapper[4810]: I0219 15:28:47.612681 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4qlb\" (UniqueName: \"kubernetes.io/projected/082fc735-2850-452d-841a-0af9ed7ed171-kube-api-access-t4qlb\") pod \"082fc735-2850-452d-841a-0af9ed7ed171\" (UID: \"082fc735-2850-452d-841a-0af9ed7ed171\") " Feb 19 15:28:47 crc kubenswrapper[4810]: I0219 15:28:47.612706 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/082fc735-2850-452d-841a-0af9ed7ed171-config-data\") pod \"082fc735-2850-452d-841a-0af9ed7ed171\" (UID: \"082fc735-2850-452d-841a-0af9ed7ed171\") " Feb 19 15:28:47 crc kubenswrapper[4810]: I0219 15:28:47.616753 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/082fc735-2850-452d-841a-0af9ed7ed171-kube-api-access-t4qlb" (OuterVolumeSpecName: "kube-api-access-t4qlb") pod "082fc735-2850-452d-841a-0af9ed7ed171" (UID: "082fc735-2850-452d-841a-0af9ed7ed171"). InnerVolumeSpecName "kube-api-access-t4qlb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:28:47 crc kubenswrapper[4810]: I0219 15:28:47.634175 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/082fc735-2850-452d-841a-0af9ed7ed171-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "082fc735-2850-452d-841a-0af9ed7ed171" (UID: "082fc735-2850-452d-841a-0af9ed7ed171"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:28:47 crc kubenswrapper[4810]: I0219 15:28:47.655990 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/082fc735-2850-452d-841a-0af9ed7ed171-config-data" (OuterVolumeSpecName: "config-data") pod "082fc735-2850-452d-841a-0af9ed7ed171" (UID: "082fc735-2850-452d-841a-0af9ed7ed171"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:28:47 crc kubenswrapper[4810]: I0219 15:28:47.717786 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/082fc735-2850-452d-841a-0af9ed7ed171-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:47 crc kubenswrapper[4810]: I0219 15:28:47.717812 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/082fc735-2850-452d-841a-0af9ed7ed171-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:47 crc kubenswrapper[4810]: I0219 15:28:47.717826 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4qlb\" (UniqueName: \"kubernetes.io/projected/082fc735-2850-452d-841a-0af9ed7ed171-kube-api-access-t4qlb\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.129843 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-k9zsz" event={"ID":"082fc735-2850-452d-841a-0af9ed7ed171","Type":"ContainerDied","Data":"866dbef12abd89bdc5dcdad8828a3677ff57d6665594483a4a2f0d6d9a3ec62c"} Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.129899 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="866dbef12abd89bdc5dcdad8828a3677ff57d6665594483a4a2f0d6d9a3ec62c" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.129961 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-k9zsz" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.386630 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589c97547-9nnhp"] Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.387376 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-589c97547-9nnhp" podUID="7cc2bd68-d5b3-4416-8752-edf7ee85bf88" containerName="dnsmasq-dns" containerID="cri-o://b3c769533587d67c2cf2579dac0d7fe9ba6b6dac50efc5e3e91af785d9090a99" gracePeriod=10 Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.388483 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-589c97547-9nnhp" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.425517 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-744c9b5447-bdbq7"] Feb 19 15:28:48 crc kubenswrapper[4810]: E0219 15:28:48.425919 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63eeb47c-9c4a-4e36-be24-61c126517600" containerName="watcher-db-sync" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.425936 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="63eeb47c-9c4a-4e36-be24-61c126517600" containerName="watcher-db-sync" Feb 19 15:28:48 crc kubenswrapper[4810]: E0219 15:28:48.425962 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="082fc735-2850-452d-841a-0af9ed7ed171" containerName="keystone-db-sync" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.425969 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="082fc735-2850-452d-841a-0af9ed7ed171" containerName="keystone-db-sync" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.426140 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="082fc735-2850-452d-841a-0af9ed7ed171" containerName="keystone-db-sync" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.426165 4810 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="63eeb47c-9c4a-4e36-be24-61c126517600" containerName="watcher-db-sync" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.427063 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744c9b5447-bdbq7" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.454137 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-2f6g5"] Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.455311 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2f6g5" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.457696 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.458251 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-j78zz" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.462070 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.462101 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.462349 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.464970 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-744c9b5447-bdbq7"] Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.527419 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2f6g5"] Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.545056 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-dns-swift-storage-0\") pod \"dnsmasq-dns-744c9b5447-bdbq7\" (UID: \"32e59d75-7087-41ab-8571-5e8830baeec0\") " pod="openstack/dnsmasq-dns-744c9b5447-bdbq7" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.545095 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-scripts\") pod \"keystone-bootstrap-2f6g5\" (UID: \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\") " pod="openstack/keystone-bootstrap-2f6g5" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.545130 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf6bv\" (UniqueName: \"kubernetes.io/projected/32e59d75-7087-41ab-8571-5e8830baeec0-kube-api-access-cf6bv\") pod \"dnsmasq-dns-744c9b5447-bdbq7\" (UID: \"32e59d75-7087-41ab-8571-5e8830baeec0\") " pod="openstack/dnsmasq-dns-744c9b5447-bdbq7" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.545145 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-combined-ca-bundle\") pod \"keystone-bootstrap-2f6g5\" (UID: \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\") " pod="openstack/keystone-bootstrap-2f6g5" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.545173 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-config\") pod \"dnsmasq-dns-744c9b5447-bdbq7\" (UID: \"32e59d75-7087-41ab-8571-5e8830baeec0\") " pod="openstack/dnsmasq-dns-744c9b5447-bdbq7" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.545193 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-config-data\") pod \"keystone-bootstrap-2f6g5\" (UID: \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\") " pod="openstack/keystone-bootstrap-2f6g5" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.545208 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlnqn\" (UniqueName: \"kubernetes.io/projected/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-kube-api-access-qlnqn\") pod \"keystone-bootstrap-2f6g5\" (UID: \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\") " pod="openstack/keystone-bootstrap-2f6g5" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.545244 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-dns-svc\") pod \"dnsmasq-dns-744c9b5447-bdbq7\" (UID: \"32e59d75-7087-41ab-8571-5e8830baeec0\") " pod="openstack/dnsmasq-dns-744c9b5447-bdbq7" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.545258 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-ovsdbserver-sb\") pod \"dnsmasq-dns-744c9b5447-bdbq7\" (UID: \"32e59d75-7087-41ab-8571-5e8830baeec0\") " pod="openstack/dnsmasq-dns-744c9b5447-bdbq7" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.545298 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-credential-keys\") pod \"keystone-bootstrap-2f6g5\" (UID: \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\") " pod="openstack/keystone-bootstrap-2f6g5" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.545315 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-fernet-keys\") pod \"keystone-bootstrap-2f6g5\" (UID: \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\") " pod="openstack/keystone-bootstrap-2f6g5" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.545357 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-ovsdbserver-nb\") pod \"dnsmasq-dns-744c9b5447-bdbq7\" (UID: \"32e59d75-7087-41ab-8571-5e8830baeec0\") " pod="openstack/dnsmasq-dns-744c9b5447-bdbq7" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.627210 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.628372 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.633674 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.633838 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-ngb4n" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.640273 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.641663 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.644081 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.646280 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-dns-svc\") pod \"dnsmasq-dns-744c9b5447-bdbq7\" (UID: \"32e59d75-7087-41ab-8571-5e8830baeec0\") " pod="openstack/dnsmasq-dns-744c9b5447-bdbq7" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.646318 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-ovsdbserver-sb\") pod \"dnsmasq-dns-744c9b5447-bdbq7\" (UID: \"32e59d75-7087-41ab-8571-5e8830baeec0\") " pod="openstack/dnsmasq-dns-744c9b5447-bdbq7" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.646411 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-credential-keys\") pod \"keystone-bootstrap-2f6g5\" (UID: \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\") " pod="openstack/keystone-bootstrap-2f6g5" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.646431 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-fernet-keys\") pod \"keystone-bootstrap-2f6g5\" (UID: \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\") " pod="openstack/keystone-bootstrap-2f6g5" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.646452 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-ovsdbserver-nb\") pod \"dnsmasq-dns-744c9b5447-bdbq7\" (UID: \"32e59d75-7087-41ab-8571-5e8830baeec0\") " pod="openstack/dnsmasq-dns-744c9b5447-bdbq7" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.646536 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-dns-swift-storage-0\") pod \"dnsmasq-dns-744c9b5447-bdbq7\" (UID: \"32e59d75-7087-41ab-8571-5e8830baeec0\") " pod="openstack/dnsmasq-dns-744c9b5447-bdbq7" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.646563 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-scripts\") pod \"keystone-bootstrap-2f6g5\" (UID: \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\") " 
pod="openstack/keystone-bootstrap-2f6g5" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.646592 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf6bv\" (UniqueName: \"kubernetes.io/projected/32e59d75-7087-41ab-8571-5e8830baeec0-kube-api-access-cf6bv\") pod \"dnsmasq-dns-744c9b5447-bdbq7\" (UID: \"32e59d75-7087-41ab-8571-5e8830baeec0\") " pod="openstack/dnsmasq-dns-744c9b5447-bdbq7" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.646610 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-combined-ca-bundle\") pod \"keystone-bootstrap-2f6g5\" (UID: \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\") " pod="openstack/keystone-bootstrap-2f6g5" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.646640 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-config\") pod \"dnsmasq-dns-744c9b5447-bdbq7\" (UID: \"32e59d75-7087-41ab-8571-5e8830baeec0\") " pod="openstack/dnsmasq-dns-744c9b5447-bdbq7" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.646665 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-config-data\") pod \"keystone-bootstrap-2f6g5\" (UID: \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\") " pod="openstack/keystone-bootstrap-2f6g5" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.646684 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlnqn\" (UniqueName: \"kubernetes.io/projected/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-kube-api-access-qlnqn\") pod \"keystone-bootstrap-2f6g5\" (UID: \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\") " pod="openstack/keystone-bootstrap-2f6g5" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.649954 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-dns-swift-storage-0\") pod \"dnsmasq-dns-744c9b5447-bdbq7\" (UID: \"32e59d75-7087-41ab-8571-5e8830baeec0\") " pod="openstack/dnsmasq-dns-744c9b5447-bdbq7" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.650852 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-dns-svc\") pod \"dnsmasq-dns-744c9b5447-bdbq7\" (UID: \"32e59d75-7087-41ab-8571-5e8830baeec0\") " pod="openstack/dnsmasq-dns-744c9b5447-bdbq7" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.651536 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-config\") pod \"dnsmasq-dns-744c9b5447-bdbq7\" (UID: \"32e59d75-7087-41ab-8571-5e8830baeec0\") " pod="openstack/dnsmasq-dns-744c9b5447-bdbq7" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.654668 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-ovsdbserver-sb\") pod \"dnsmasq-dns-744c9b5447-bdbq7\" (UID: \"32e59d75-7087-41ab-8571-5e8830baeec0\") " pod="openstack/dnsmasq-dns-744c9b5447-bdbq7" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.657495 4810 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-ovsdbserver-nb\") pod \"dnsmasq-dns-744c9b5447-bdbq7\" (UID: \"32e59d75-7087-41ab-8571-5e8830baeec0\") " pod="openstack/dnsmasq-dns-744c9b5447-bdbq7" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.664281 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-combined-ca-bundle\") pod \"keystone-bootstrap-2f6g5\" (UID: \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\") " pod="openstack/keystone-bootstrap-2f6g5" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.664352 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.665754 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.667406 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.671003 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-fernet-keys\") pod \"keystone-bootstrap-2f6g5\" (UID: \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\") " pod="openstack/keystone-bootstrap-2f6g5" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.679593 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-config-data\") pod \"keystone-bootstrap-2f6g5\" (UID: \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\") " pod="openstack/keystone-bootstrap-2f6g5" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.682022 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-scripts\") pod \"keystone-bootstrap-2f6g5\" (UID: \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\") " pod="openstack/keystone-bootstrap-2f6g5" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.683155 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlnqn\" (UniqueName: \"kubernetes.io/projected/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-kube-api-access-qlnqn\") pod \"keystone-bootstrap-2f6g5\" (UID: \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\") " pod="openstack/keystone-bootstrap-2f6g5" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.683919 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-credential-keys\") pod \"keystone-bootstrap-2f6g5\" (UID: \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\") " pod="openstack/keystone-bootstrap-2f6g5" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.687104 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.694037 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf6bv\" (UniqueName: \"kubernetes.io/projected/32e59d75-7087-41ab-8571-5e8830baeec0-kube-api-access-cf6bv\") pod \"dnsmasq-dns-744c9b5447-bdbq7\" (UID: \"32e59d75-7087-41ab-8571-5e8830baeec0\") " 
pod="openstack/dnsmasq-dns-744c9b5447-bdbq7" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.723200 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.741288 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-j989d"] Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.742588 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-j989d" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.749401 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc\") " pod="openstack/watcher-api-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.749450 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfnzr\" (UniqueName: \"kubernetes.io/projected/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-kube-api-access-mfnzr\") pod \"watcher-api-0\" (UID: \"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc\") " pod="openstack/watcher-api-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.749525 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-logs\") pod \"watcher-decision-engine-0\" (UID: \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.749546 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4be760b2-263c-4b89-8bdf-ecf98114a24f-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"4be760b2-263c-4b89-8bdf-ecf98114a24f\") " pod="openstack/watcher-applier-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.749566 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bwsx\" (UniqueName: \"kubernetes.io/projected/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-kube-api-access-2bwsx\") pod \"watcher-decision-engine-0\" (UID: \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.749587 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-logs\") pod \"watcher-api-0\" (UID: \"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc\") " pod="openstack/watcher-api-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.749614 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjt9b\" (UniqueName: \"kubernetes.io/projected/4be760b2-263c-4b89-8bdf-ecf98114a24f-kube-api-access-kjt9b\") pod \"watcher-applier-0\" (UID: \"4be760b2-263c-4b89-8bdf-ecf98114a24f\") " pod="openstack/watcher-applier-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.749651 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4be760b2-263c-4b89-8bdf-ecf98114a24f-logs\") pod \"watcher-applier-0\" (UID: 
\"4be760b2-263c-4b89-8bdf-ecf98114a24f\") " pod="openstack/watcher-applier-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.749682 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.749703 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-config-data\") pod \"watcher-decision-engine-0\" (UID: \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.749726 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc\") " pod="openstack/watcher-api-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.749756 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.749777 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-config-data\") pod \"watcher-api-0\" (UID: \"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc\") " pod="openstack/watcher-api-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.749798 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4be760b2-263c-4b89-8bdf-ecf98114a24f-config-data\") pod \"watcher-applier-0\" (UID: \"4be760b2-263c-4b89-8bdf-ecf98114a24f\") " pod="openstack/watcher-applier-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.750065 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744c9b5447-bdbq7" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.750811 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xcpvh" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.750946 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.751083 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.765865 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-j989d"] Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.778557 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-2f6g5" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.811666 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.859461 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-svmgl"] Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.866365 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-svmgl" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.873933 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.874206 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.874365 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nxm2z" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.878291 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.878348 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-config-data\") pod \"watcher-api-0\" (UID: \"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc\") " pod="openstack/watcher-api-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.878471 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4be760b2-263c-4b89-8bdf-ecf98114a24f-config-data\") pod \"watcher-applier-0\" (UID: \"4be760b2-263c-4b89-8bdf-ecf98114a24f\") " pod="openstack/watcher-applier-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.878592 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc\") " pod="openstack/watcher-api-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.878659 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4dd5dede-cf58-43c7-954e-b9b1d33ad8d1-config\") pod \"neutron-db-sync-j989d\" (UID: \"4dd5dede-cf58-43c7-954e-b9b1d33ad8d1\") " pod="openstack/neutron-db-sync-j989d" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.878681 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfnzr\" (UniqueName: \"kubernetes.io/projected/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-kube-api-access-mfnzr\") pod \"watcher-api-0\" (UID: \"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc\") " pod="openstack/watcher-api-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.878907 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd5dede-cf58-43c7-954e-b9b1d33ad8d1-combined-ca-bundle\") pod 
\"neutron-db-sync-j989d\" (UID: \"4dd5dede-cf58-43c7-954e-b9b1d33ad8d1\") " pod="openstack/neutron-db-sync-j989d" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.879218 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4be760b2-263c-4b89-8bdf-ecf98114a24f-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"4be760b2-263c-4b89-8bdf-ecf98114a24f\") " pod="openstack/watcher-applier-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.879235 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-logs\") pod \"watcher-decision-engine-0\" (UID: \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.879297 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bwsx\" (UniqueName: \"kubernetes.io/projected/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-kube-api-access-2bwsx\") pod \"watcher-decision-engine-0\" (UID: \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.879352 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-logs\") pod \"watcher-api-0\" (UID: \"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc\") " pod="openstack/watcher-api-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.879399 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjt9b\" (UniqueName: \"kubernetes.io/projected/4be760b2-263c-4b89-8bdf-ecf98114a24f-kube-api-access-kjt9b\") pod \"watcher-applier-0\" (UID: \"4be760b2-263c-4b89-8bdf-ecf98114a24f\") " pod="openstack/watcher-applier-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.879548 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sllpm\" (UniqueName: \"kubernetes.io/projected/4dd5dede-cf58-43c7-954e-b9b1d33ad8d1-kube-api-access-sllpm\") pod \"neutron-db-sync-j989d\" (UID: \"4dd5dede-cf58-43c7-954e-b9b1d33ad8d1\") " pod="openstack/neutron-db-sync-j989d" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.879941 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4be760b2-263c-4b89-8bdf-ecf98114a24f-logs\") pod \"watcher-applier-0\" (UID: \"4be760b2-263c-4b89-8bdf-ecf98114a24f\") " pod="openstack/watcher-applier-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.880043 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.880091 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-config-data\") pod \"watcher-decision-engine-0\" (UID: \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.880131 4810 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc\") " pod="openstack/watcher-api-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.883391 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-logs\") pod \"watcher-api-0\" (UID: \"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc\") " pod="openstack/watcher-api-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.890594 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-svmgl"] Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.890969 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-logs\") pod \"watcher-decision-engine-0\" (UID: \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.891371 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4be760b2-263c-4b89-8bdf-ecf98114a24f-logs\") pod \"watcher-applier-0\" (UID: \"4be760b2-263c-4b89-8bdf-ecf98114a24f\") " pod="openstack/watcher-applier-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.904280 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-849c785789-5xrh2"] Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.905590 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-849c785789-5xrh2" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.914610 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc\") " pod="openstack/watcher-api-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.914737 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.916501 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.917158 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.917265 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.917439 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-f4mj6" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.918397 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4be760b2-263c-4b89-8bdf-ecf98114a24f-config-data\") pod \"watcher-applier-0\" (UID: \"4be760b2-263c-4b89-8bdf-ecf98114a24f\") " 
pod="openstack/watcher-applier-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.921435 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-849c785789-5xrh2"] Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.953604 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4be760b2-263c-4b89-8bdf-ecf98114a24f-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"4be760b2-263c-4b89-8bdf-ecf98114a24f\") " pod="openstack/watcher-applier-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.963080 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc\") " pod="openstack/watcher-api-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.963970 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-config-data\") pod \"watcher-api-0\" (UID: \"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc\") " pod="openstack/watcher-api-0" Feb 19 15:28:48 crc kubenswrapper[4810]: I0219 15:28:48.965030 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-config-data\") pod \"watcher-decision-engine-0\" (UID: \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.007915 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.013778 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-etc-machine-id\") pod \"cinder-db-sync-svmgl\" (UID: \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\") " pod="openstack/cinder-db-sync-svmgl" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.013866 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqnf7\" (UniqueName: \"kubernetes.io/projected/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-kube-api-access-cqnf7\") pod \"cinder-db-sync-svmgl\" (UID: \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\") " pod="openstack/cinder-db-sync-svmgl" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.014032 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-scripts\") pod \"horizon-849c785789-5xrh2\" (UID: \"c39b1dd9-9e73-4cca-aea6-e228f1ba5942\") " pod="openstack/horizon-849c785789-5xrh2" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.014065 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcbv5\" (UniqueName: \"kubernetes.io/projected/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-kube-api-access-pcbv5\") pod \"horizon-849c785789-5xrh2\" (UID: 
\"c39b1dd9-9e73-4cca-aea6-e228f1ba5942\") " pod="openstack/horizon-849c785789-5xrh2" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.014084 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-combined-ca-bundle\") pod \"cinder-db-sync-svmgl\" (UID: \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\") " pod="openstack/cinder-db-sync-svmgl" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.014254 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4dd5dede-cf58-43c7-954e-b9b1d33ad8d1-config\") pod \"neutron-db-sync-j989d\" (UID: \"4dd5dede-cf58-43c7-954e-b9b1d33ad8d1\") " pod="openstack/neutron-db-sync-j989d" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.014290 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-config-data\") pod \"cinder-db-sync-svmgl\" (UID: \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\") " pod="openstack/cinder-db-sync-svmgl" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.014319 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-scripts\") pod \"cinder-db-sync-svmgl\" (UID: \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\") " pod="openstack/cinder-db-sync-svmgl" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.027434 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfnzr\" (UniqueName: \"kubernetes.io/projected/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-kube-api-access-mfnzr\") pod \"watcher-api-0\" (UID: \"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc\") " pod="openstack/watcher-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.033207 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd5dede-cf58-43c7-954e-b9b1d33ad8d1-combined-ca-bundle\") pod \"neutron-db-sync-j989d\" (UID: \"4dd5dede-cf58-43c7-954e-b9b1d33ad8d1\") " pod="openstack/neutron-db-sync-j989d" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.033300 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-db-sync-config-data\") pod \"cinder-db-sync-svmgl\" (UID: \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\") " pod="openstack/cinder-db-sync-svmgl" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.033360 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-horizon-secret-key\") pod \"horizon-849c785789-5xrh2\" (UID: \"c39b1dd9-9e73-4cca-aea6-e228f1ba5942\") " pod="openstack/horizon-849c785789-5xrh2" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.033433 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sllpm\" (UniqueName: \"kubernetes.io/projected/4dd5dede-cf58-43c7-954e-b9b1d33ad8d1-kube-api-access-sllpm\") pod \"neutron-db-sync-j989d\" (UID: \"4dd5dede-cf58-43c7-954e-b9b1d33ad8d1\") " pod="openstack/neutron-db-sync-j989d" Feb 19 15:28:49 crc 
kubenswrapper[4810]: I0219 15:28:49.033461 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-logs\") pod \"horizon-849c785789-5xrh2\" (UID: \"c39b1dd9-9e73-4cca-aea6-e228f1ba5942\") " pod="openstack/horizon-849c785789-5xrh2" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.033504 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-config-data\") pod \"horizon-849c785789-5xrh2\" (UID: \"c39b1dd9-9e73-4cca-aea6-e228f1ba5942\") " pod="openstack/horizon-849c785789-5xrh2" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.044164 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bwsx\" (UniqueName: \"kubernetes.io/projected/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-kube-api-access-2bwsx\") pod \"watcher-decision-engine-0\" (UID: \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.060306 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjt9b\" (UniqueName: \"kubernetes.io/projected/4be760b2-263c-4b89-8bdf-ecf98114a24f-kube-api-access-kjt9b\") pod \"watcher-applier-0\" (UID: \"4be760b2-263c-4b89-8bdf-ecf98114a24f\") " pod="openstack/watcher-applier-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.074398 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sllpm\" (UniqueName: \"kubernetes.io/projected/4dd5dede-cf58-43c7-954e-b9b1d33ad8d1-kube-api-access-sllpm\") pod \"neutron-db-sync-j989d\" (UID: \"4dd5dede-cf58-43c7-954e-b9b1d33ad8d1\") " pod="openstack/neutron-db-sync-j989d" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.077574 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.093966 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd5dede-cf58-43c7-954e-b9b1d33ad8d1-combined-ca-bundle\") pod \"neutron-db-sync-j989d\" (UID: \"4dd5dede-cf58-43c7-954e-b9b1d33ad8d1\") " pod="openstack/neutron-db-sync-j989d" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.120922 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4dd5dede-cf58-43c7-954e-b9b1d33ad8d1-config\") pod \"neutron-db-sync-j989d\" (UID: \"4dd5dede-cf58-43c7-954e-b9b1d33ad8d1\") " pod="openstack/neutron-db-sync-j989d" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.123746 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.124309 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.127371 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.128804 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.133219 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.133391 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.134050 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lrdct" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.134190 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.135937 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-logs\") pod \"horizon-849c785789-5xrh2\" (UID: \"c39b1dd9-9e73-4cca-aea6-e228f1ba5942\") " pod="openstack/horizon-849c785789-5xrh2" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.136001 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-config-data\") pod \"horizon-849c785789-5xrh2\" (UID: \"c39b1dd9-9e73-4cca-aea6-e228f1ba5942\") " pod="openstack/horizon-849c785789-5xrh2" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.136042 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-etc-machine-id\") pod \"cinder-db-sync-svmgl\" (UID: \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\") " pod="openstack/cinder-db-sync-svmgl" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.136069 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqnf7\" (UniqueName: \"kubernetes.io/projected/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-kube-api-access-cqnf7\") pod \"cinder-db-sync-svmgl\" (UID: \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\") " pod="openstack/cinder-db-sync-svmgl" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.136107 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-scripts\") pod \"horizon-849c785789-5xrh2\" (UID: \"c39b1dd9-9e73-4cca-aea6-e228f1ba5942\") " pod="openstack/horizon-849c785789-5xrh2" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.136124 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcbv5\" (UniqueName: \"kubernetes.io/projected/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-kube-api-access-pcbv5\") pod \"horizon-849c785789-5xrh2\" (UID: \"c39b1dd9-9e73-4cca-aea6-e228f1ba5942\") " pod="openstack/horizon-849c785789-5xrh2" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.136140 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-combined-ca-bundle\") pod \"cinder-db-sync-svmgl\" (UID: \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\") " pod="openstack/cinder-db-sync-svmgl" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.136173 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-config-data\") pod \"cinder-db-sync-svmgl\" (UID: \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\") " pod="openstack/cinder-db-sync-svmgl" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.136193 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-scripts\") pod \"cinder-db-sync-svmgl\" (UID: \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\") " pod="openstack/cinder-db-sync-svmgl" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.136244 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-db-sync-config-data\") pod \"cinder-db-sync-svmgl\" (UID: \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\") " pod="openstack/cinder-db-sync-svmgl" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.136265 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-horizon-secret-key\") pod \"horizon-849c785789-5xrh2\" (UID: \"c39b1dd9-9e73-4cca-aea6-e228f1ba5942\") " pod="openstack/horizon-849c785789-5xrh2" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.137638 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-scripts\") pod \"horizon-849c785789-5xrh2\" (UID: \"c39b1dd9-9e73-4cca-aea6-e228f1ba5942\") " pod="openstack/horizon-849c785789-5xrh2" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.137684 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-j989d" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.138416 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-config-data\") pod \"horizon-849c785789-5xrh2\" (UID: \"c39b1dd9-9e73-4cca-aea6-e228f1ba5942\") " pod="openstack/horizon-849c785789-5xrh2" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.138248 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-logs\") pod \"horizon-849c785789-5xrh2\" (UID: \"c39b1dd9-9e73-4cca-aea6-e228f1ba5942\") " pod="openstack/horizon-849c785789-5xrh2" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.143910 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-horizon-secret-key\") pod \"horizon-849c785789-5xrh2\" (UID: \"c39b1dd9-9e73-4cca-aea6-e228f1ba5942\") " pod="openstack/horizon-849c785789-5xrh2" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.148916 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-etc-machine-id\") pod \"cinder-db-sync-svmgl\" (UID: \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\") " pod="openstack/cinder-db-sync-svmgl" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.153937 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-combined-ca-bundle\") pod \"cinder-db-sync-svmgl\" (UID: \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\") " pod="openstack/cinder-db-sync-svmgl" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.154436 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-scripts\") pod \"cinder-db-sync-svmgl\" (UID: \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\") " pod="openstack/cinder-db-sync-svmgl" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.159252 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-config-data\") pod \"cinder-db-sync-svmgl\" (UID: \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\") " pod="openstack/cinder-db-sync-svmgl" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.180269 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.188819 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqnf7\" (UniqueName: \"kubernetes.io/projected/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-kube-api-access-cqnf7\") pod \"cinder-db-sync-svmgl\" (UID: \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\") " pod="openstack/cinder-db-sync-svmgl" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.202109 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcbv5\" (UniqueName: \"kubernetes.io/projected/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-kube-api-access-pcbv5\") pod \"horizon-849c785789-5xrh2\" (UID: \"c39b1dd9-9e73-4cca-aea6-e228f1ba5942\") " pod="openstack/horizon-849c785789-5xrh2" Feb 19 15:28:49 crc 
kubenswrapper[4810]: I0219 15:28:49.202234 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-db-sync-config-data\") pod \"cinder-db-sync-svmgl\" (UID: \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\") " pod="openstack/cinder-db-sync-svmgl" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.214433 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-744c9b5447-bdbq7"] Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.236197 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-849c785789-5xrh2" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.237193 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a0789f1-099a-4f95-9626-a5ad7da804bc-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") " pod="openstack/glance-default-external-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.237253 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a0789f1-099a-4f95-9626-a5ad7da804bc-logs\") pod \"glance-default-external-api-0\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") " pod="openstack/glance-default-external-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.237271 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a0789f1-099a-4f95-9626-a5ad7da804bc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") " pod="openstack/glance-default-external-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.237306 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a0789f1-099a-4f95-9626-a5ad7da804bc-config-data\") pod \"glance-default-external-api-0\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") " pod="openstack/glance-default-external-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.237364 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a0789f1-099a-4f95-9626-a5ad7da804bc-scripts\") pod \"glance-default-external-api-0\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") " pod="openstack/glance-default-external-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.237378 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f8d7\" (UniqueName: \"kubernetes.io/projected/7a0789f1-099a-4f95-9626-a5ad7da804bc-kube-api-access-8f8d7\") pod \"glance-default-external-api-0\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") " pod="openstack/glance-default-external-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.237421 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0789f1-099a-4f95-9626-a5ad7da804bc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") " pod="openstack/glance-default-external-api-0" Feb 19 15:28:49 crc 
kubenswrapper[4810]: I0219 15:28:49.237511 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") " pod="openstack/glance-default-external-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.248204 4810 generic.go:334] "Generic (PLEG): container finished" podID="7cc2bd68-d5b3-4416-8752-edf7ee85bf88" containerID="b3c769533587d67c2cf2579dac0d7fe9ba6b6dac50efc5e3e91af785d9090a99" exitCode=0 Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.248249 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589c97547-9nnhp" event={"ID":"7cc2bd68-d5b3-4416-8752-edf7ee85bf88","Type":"ContainerDied","Data":"b3c769533587d67c2cf2579dac0d7fe9ba6b6dac50efc5e3e91af785d9090a99"} Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.252563 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f4cfd6f6c-s7m64"] Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.253984 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.260261 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-7jdcp"] Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.263972 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7jdcp" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.267808 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qpnp7" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.268023 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.268213 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.268372 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-7jdcp"] Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.275809 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f4cfd6f6c-s7m64"] Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.319727 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-hmc6k"] Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.321800 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-hmc6k" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.323496 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.323669 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-mx2zj" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.327488 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-hmc6k"] Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.338198 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5fbd85f69f-5jhnw"] Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.338504 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a0789f1-099a-4f95-9626-a5ad7da804bc-config-data\") pod \"glance-default-external-api-0\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") " pod="openstack/glance-default-external-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.338535 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-dns-svc\") pod \"dnsmasq-dns-5f4cfd6f6c-s7m64\" (UID: \"18ca6546-69fd-492d-81c5-bb18c56b045d\") " pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.338555 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a0789f1-099a-4f95-9626-a5ad7da804bc-scripts\") pod \"glance-default-external-api-0\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") " pod="openstack/glance-default-external-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.338570 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f8d7\" (UniqueName: \"kubernetes.io/projected/7a0789f1-099a-4f95-9626-a5ad7da804bc-kube-api-access-8f8d7\") pod \"glance-default-external-api-0\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") " pod="openstack/glance-default-external-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.338589 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-ovsdbserver-nb\") pod \"dnsmasq-dns-5f4cfd6f6c-s7m64\" (UID: \"18ca6546-69fd-492d-81c5-bb18c56b045d\") " pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.338827 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btmxg\" (UniqueName: \"kubernetes.io/projected/2024a783-c3f9-4e57-b00f-52bec164e64e-kube-api-access-btmxg\") pod \"barbican-db-sync-hmc6k\" (UID: \"2024a783-c3f9-4e57-b00f-52bec164e64e\") " pod="openstack/barbican-db-sync-hmc6k" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.338851 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0789f1-099a-4f95-9626-a5ad7da804bc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") " pod="openstack/glance-default-external-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.338871 
4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2024a783-c3f9-4e57-b00f-52bec164e64e-combined-ca-bundle\") pod \"barbican-db-sync-hmc6k\" (UID: \"2024a783-c3f9-4e57-b00f-52bec164e64e\") " pod="openstack/barbican-db-sync-hmc6k" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.338922 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qksxk\" (UniqueName: \"kubernetes.io/projected/36fe6fdb-2970-4773-8184-a2d16b8ca89a-kube-api-access-qksxk\") pod \"placement-db-sync-7jdcp\" (UID: \"36fe6fdb-2970-4773-8184-a2d16b8ca89a\") " pod="openstack/placement-db-sync-7jdcp" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.338943 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36fe6fdb-2970-4773-8184-a2d16b8ca89a-logs\") pod \"placement-db-sync-7jdcp\" (UID: \"36fe6fdb-2970-4773-8184-a2d16b8ca89a\") " pod="openstack/placement-db-sync-7jdcp" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.339094 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36fe6fdb-2970-4773-8184-a2d16b8ca89a-scripts\") pod \"placement-db-sync-7jdcp\" (UID: \"36fe6fdb-2970-4773-8184-a2d16b8ca89a\") " pod="openstack/placement-db-sync-7jdcp" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.341012 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5fbd85f69f-5jhnw" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.342373 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nljdr\" (UniqueName: \"kubernetes.io/projected/18ca6546-69fd-492d-81c5-bb18c56b045d-kube-api-access-nljdr\") pod \"dnsmasq-dns-5f4cfd6f6c-s7m64\" (UID: \"18ca6546-69fd-492d-81c5-bb18c56b045d\") " pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.342429 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") " pod="openstack/glance-default-external-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.342449 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fe6fdb-2970-4773-8184-a2d16b8ca89a-combined-ca-bundle\") pod \"placement-db-sync-7jdcp\" (UID: \"36fe6fdb-2970-4773-8184-a2d16b8ca89a\") " pod="openstack/placement-db-sync-7jdcp" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.342481 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-ovsdbserver-sb\") pod \"dnsmasq-dns-5f4cfd6f6c-s7m64\" (UID: \"18ca6546-69fd-492d-81c5-bb18c56b045d\") " pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.342499 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/2024a783-c3f9-4e57-b00f-52bec164e64e-db-sync-config-data\") pod \"barbican-db-sync-hmc6k\" (UID: \"2024a783-c3f9-4e57-b00f-52bec164e64e\") " pod="openstack/barbican-db-sync-hmc6k" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.342557 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fe6fdb-2970-4773-8184-a2d16b8ca89a-config-data\") pod \"placement-db-sync-7jdcp\" (UID: \"36fe6fdb-2970-4773-8184-a2d16b8ca89a\") " pod="openstack/placement-db-sync-7jdcp" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.342577 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a0789f1-099a-4f95-9626-a5ad7da804bc-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") " pod="openstack/glance-default-external-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.342604 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-config\") pod \"dnsmasq-dns-5f4cfd6f6c-s7m64\" (UID: \"18ca6546-69fd-492d-81c5-bb18c56b045d\") " pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.342639 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a0789f1-099a-4f95-9626-a5ad7da804bc-logs\") pod \"glance-default-external-api-0\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") " pod="openstack/glance-default-external-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.342661 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a0789f1-099a-4f95-9626-a5ad7da804bc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") " pod="openstack/glance-default-external-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.342708 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-dns-swift-storage-0\") pod \"dnsmasq-dns-5f4cfd6f6c-s7m64\" (UID: \"18ca6546-69fd-492d-81c5-bb18c56b045d\") " pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.342914 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.343197 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a0789f1-099a-4f95-9626-a5ad7da804bc-config-data\") pod \"glance-default-external-api-0\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") " pod="openstack/glance-default-external-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.343506 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a0789f1-099a-4f95-9626-a5ad7da804bc-logs\") 
pod \"glance-default-external-api-0\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") " pod="openstack/glance-default-external-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.343734 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a0789f1-099a-4f95-9626-a5ad7da804bc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") " pod="openstack/glance-default-external-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.348927 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a0789f1-099a-4f95-9626-a5ad7da804bc-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") " pod="openstack/glance-default-external-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.355800 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a0789f1-099a-4f95-9626-a5ad7da804bc-scripts\") pod \"glance-default-external-api-0\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") " pod="openstack/glance-default-external-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.362130 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.369618 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0789f1-099a-4f95-9626-a5ad7da804bc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") " pod="openstack/glance-default-external-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.372407 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.375113 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.376676 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") " pod="openstack/glance-default-external-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.379381 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5fbd85f69f-5jhnw"] Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.380119 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.380318 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f8d7\" (UniqueName: \"kubernetes.io/projected/7a0789f1-099a-4f95-9626-a5ad7da804bc-kube-api-access-8f8d7\") pod \"glance-default-external-api-0\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") " pod="openstack/glance-default-external-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.396725 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.411374 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589c97547-9nnhp" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.423442 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:28:49 crc kubenswrapper[4810]: E0219 15:28:49.423826 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc2bd68-d5b3-4416-8752-edf7ee85bf88" containerName="dnsmasq-dns" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.423838 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc2bd68-d5b3-4416-8752-edf7ee85bf88" containerName="dnsmasq-dns" Feb 19 15:28:49 crc kubenswrapper[4810]: E0219 15:28:49.423873 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc2bd68-d5b3-4416-8752-edf7ee85bf88" containerName="init" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.423880 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc2bd68-d5b3-4416-8752-edf7ee85bf88" containerName="init" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.424028 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cc2bd68-d5b3-4416-8752-edf7ee85bf88" containerName="dnsmasq-dns" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.425904 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.428603 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.431210 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.432976 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.445112 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fe6fdb-2970-4773-8184-a2d16b8ca89a-combined-ca-bundle\") pod \"placement-db-sync-7jdcp\" (UID: \"36fe6fdb-2970-4773-8184-a2d16b8ca89a\") " pod="openstack/placement-db-sync-7jdcp" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.445163 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-ovsdbserver-sb\") pod \"dnsmasq-dns-5f4cfd6f6c-s7m64\" (UID: \"18ca6546-69fd-492d-81c5-bb18c56b045d\") " pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.445181 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2024a783-c3f9-4e57-b00f-52bec164e64e-db-sync-config-data\") pod \"barbican-db-sync-hmc6k\" (UID: \"2024a783-c3f9-4e57-b00f-52bec164e64e\") " pod="openstack/barbican-db-sync-hmc6k" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.445223 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fe6fdb-2970-4773-8184-a2d16b8ca89a-config-data\") pod \"placement-db-sync-7jdcp\" (UID: \"36fe6fdb-2970-4773-8184-a2d16b8ca89a\") " pod="openstack/placement-db-sync-7jdcp" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.445251 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-config\") pod \"dnsmasq-dns-5f4cfd6f6c-s7m64\" (UID: \"18ca6546-69fd-492d-81c5-bb18c56b045d\") " pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.445306 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-dns-swift-storage-0\") pod \"dnsmasq-dns-5f4cfd6f6c-s7m64\" (UID: \"18ca6546-69fd-492d-81c5-bb18c56b045d\") " pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.445354 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-dns-svc\") pod \"dnsmasq-dns-5f4cfd6f6c-s7m64\" (UID: \"18ca6546-69fd-492d-81c5-bb18c56b045d\") " pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.445381 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-ovsdbserver-nb\") pod \"dnsmasq-dns-5f4cfd6f6c-s7m64\" (UID: \"18ca6546-69fd-492d-81c5-bb18c56b045d\") " pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.445413 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btmxg\" (UniqueName: \"kubernetes.io/projected/2024a783-c3f9-4e57-b00f-52bec164e64e-kube-api-access-btmxg\") pod \"barbican-db-sync-hmc6k\" (UID: \"2024a783-c3f9-4e57-b00f-52bec164e64e\") " pod="openstack/barbican-db-sync-hmc6k" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.445447 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2024a783-c3f9-4e57-b00f-52bec164e64e-combined-ca-bundle\") pod \"barbican-db-sync-hmc6k\" (UID: \"2024a783-c3f9-4e57-b00f-52bec164e64e\") " pod="openstack/barbican-db-sync-hmc6k" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.445485 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qksxk\" (UniqueName: \"kubernetes.io/projected/36fe6fdb-2970-4773-8184-a2d16b8ca89a-kube-api-access-qksxk\") pod \"placement-db-sync-7jdcp\" (UID: \"36fe6fdb-2970-4773-8184-a2d16b8ca89a\") " pod="openstack/placement-db-sync-7jdcp" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.445513 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36fe6fdb-2970-4773-8184-a2d16b8ca89a-logs\") pod \"placement-db-sync-7jdcp\" (UID: \"36fe6fdb-2970-4773-8184-a2d16b8ca89a\") " pod="openstack/placement-db-sync-7jdcp" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.445545 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36fe6fdb-2970-4773-8184-a2d16b8ca89a-scripts\") pod \"placement-db-sync-7jdcp\" (UID: \"36fe6fdb-2970-4773-8184-a2d16b8ca89a\") " pod="openstack/placement-db-sync-7jdcp" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.445569 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nljdr\" (UniqueName: \"kubernetes.io/projected/18ca6546-69fd-492d-81c5-bb18c56b045d-kube-api-access-nljdr\") pod 
\"dnsmasq-dns-5f4cfd6f6c-s7m64\" (UID: \"18ca6546-69fd-492d-81c5-bb18c56b045d\") " pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.446582 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-config\") pod \"dnsmasq-dns-5f4cfd6f6c-s7m64\" (UID: \"18ca6546-69fd-492d-81c5-bb18c56b045d\") " pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.447035 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-ovsdbserver-sb\") pod \"dnsmasq-dns-5f4cfd6f6c-s7m64\" (UID: \"18ca6546-69fd-492d-81c5-bb18c56b045d\") " pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.447171 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-dns-swift-storage-0\") pod \"dnsmasq-dns-5f4cfd6f6c-s7m64\" (UID: \"18ca6546-69fd-492d-81c5-bb18c56b045d\") " pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.447668 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-dns-svc\") pod \"dnsmasq-dns-5f4cfd6f6c-s7m64\" (UID: \"18ca6546-69fd-492d-81c5-bb18c56b045d\") " pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.448150 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-ovsdbserver-nb\") pod \"dnsmasq-dns-5f4cfd6f6c-s7m64\" (UID: \"18ca6546-69fd-492d-81c5-bb18c56b045d\") " pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.449592 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36fe6fdb-2970-4773-8184-a2d16b8ca89a-logs\") pod \"placement-db-sync-7jdcp\" (UID: \"36fe6fdb-2970-4773-8184-a2d16b8ca89a\") " pod="openstack/placement-db-sync-7jdcp" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.453038 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2024a783-c3f9-4e57-b00f-52bec164e64e-combined-ca-bundle\") pod \"barbican-db-sync-hmc6k\" (UID: \"2024a783-c3f9-4e57-b00f-52bec164e64e\") " pod="openstack/barbican-db-sync-hmc6k" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.456617 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36fe6fdb-2970-4773-8184-a2d16b8ca89a-scripts\") pod \"placement-db-sync-7jdcp\" (UID: \"36fe6fdb-2970-4773-8184-a2d16b8ca89a\") " pod="openstack/placement-db-sync-7jdcp" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.467644 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fe6fdb-2970-4773-8184-a2d16b8ca89a-config-data\") pod \"placement-db-sync-7jdcp\" (UID: \"36fe6fdb-2970-4773-8184-a2d16b8ca89a\") " pod="openstack/placement-db-sync-7jdcp" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.474719 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2024a783-c3f9-4e57-b00f-52bec164e64e-db-sync-config-data\") pod \"barbican-db-sync-hmc6k\" (UID: \"2024a783-c3f9-4e57-b00f-52bec164e64e\") " pod="openstack/barbican-db-sync-hmc6k" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.500175 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fe6fdb-2970-4773-8184-a2d16b8ca89a-combined-ca-bundle\") pod \"placement-db-sync-7jdcp\" (UID: \"36fe6fdb-2970-4773-8184-a2d16b8ca89a\") " pod="openstack/placement-db-sync-7jdcp" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.510225 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btmxg\" (UniqueName: \"kubernetes.io/projected/2024a783-c3f9-4e57-b00f-52bec164e64e-kube-api-access-btmxg\") pod \"barbican-db-sync-hmc6k\" (UID: \"2024a783-c3f9-4e57-b00f-52bec164e64e\") " pod="openstack/barbican-db-sync-hmc6k" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.511694 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-svmgl" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.529222 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.538784 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.538846 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.552477 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qksxk\" (UniqueName: \"kubernetes.io/projected/36fe6fdb-2970-4773-8184-a2d16b8ca89a-kube-api-access-qksxk\") pod \"placement-db-sync-7jdcp\" (UID: \"36fe6fdb-2970-4773-8184-a2d16b8ca89a\") " pod="openstack/placement-db-sync-7jdcp" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.552543 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-dns-svc\") pod \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\" (UID: \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\") " Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.552634 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g84kv\" (UniqueName: \"kubernetes.io/projected/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-kube-api-access-g84kv\") pod \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\" (UID: \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\") " Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.552674 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-config\") pod \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\" (UID: 
\"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\") " Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.552718 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-dns-swift-storage-0\") pod \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\" (UID: \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\") " Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.552767 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-ovsdbserver-sb\") pod \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\" (UID: \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\") " Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.552789 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-ovsdbserver-nb\") pod \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\" (UID: \"7cc2bd68-d5b3-4416-8752-edf7ee85bf88\") " Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.553493 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8cd44d0-7395-44e1-9112-9e8bb4198b93-log-httpd\") pod \"ceilometer-0\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " pod="openstack/ceilometer-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.553560 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8cd44d0-7395-44e1-9112-9e8bb4198b93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " pod="openstack/ceilometer-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.553611 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8cd44d0-7395-44e1-9112-9e8bb4198b93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " pod="openstack/ceilometer-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.553656 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8cd44d0-7395-44e1-9112-9e8bb4198b93-scripts\") pod \"ceilometer-0\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " pod="openstack/ceilometer-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.553694 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2b86\" (UniqueName: \"kubernetes.io/projected/9c2f952c-2122-43d9-b006-6967fd2b9029-kube-api-access-n2b86\") pod \"glance-default-internal-api-0\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.553728 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/978c1383-cc82-4788-beea-b1e15b25eb1f-horizon-secret-key\") pod \"horizon-5fbd85f69f-5jhnw\" (UID: \"978c1383-cc82-4788-beea-b1e15b25eb1f\") " pod="openstack/horizon-5fbd85f69f-5jhnw" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.553747 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c2f952c-2122-43d9-b006-6967fd2b9029-logs\") pod \"glance-default-internal-api-0\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.553770 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c2f952c-2122-43d9-b006-6967fd2b9029-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.553797 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c2f952c-2122-43d9-b006-6967fd2b9029-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.553819 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/978c1383-cc82-4788-beea-b1e15b25eb1f-logs\") pod \"horizon-5fbd85f69f-5jhnw\" (UID: \"978c1383-cc82-4788-beea-b1e15b25eb1f\") " pod="openstack/horizon-5fbd85f69f-5jhnw" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.553838 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9c2f952c-2122-43d9-b006-6967fd2b9029-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.553867 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8cd44d0-7395-44e1-9112-9e8bb4198b93-run-httpd\") pod \"ceilometer-0\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " pod="openstack/ceilometer-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.553889 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/978c1383-cc82-4788-beea-b1e15b25eb1f-scripts\") pod \"horizon-5fbd85f69f-5jhnw\" (UID: \"978c1383-cc82-4788-beea-b1e15b25eb1f\") " pod="openstack/horizon-5fbd85f69f-5jhnw" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.553910 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd77w\" (UniqueName: \"kubernetes.io/projected/b8cd44d0-7395-44e1-9112-9e8bb4198b93-kube-api-access-jd77w\") pod \"ceilometer-0\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " pod="openstack/ceilometer-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.553950 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c2f952c-2122-43d9-b006-6967fd2b9029-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.553995 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.554042 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c2f952c-2122-43d9-b006-6967fd2b9029-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.554089 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/978c1383-cc82-4788-beea-b1e15b25eb1f-config-data\") pod \"horizon-5fbd85f69f-5jhnw\" (UID: \"978c1383-cc82-4788-beea-b1e15b25eb1f\") " pod="openstack/horizon-5fbd85f69f-5jhnw" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.554130 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4ql7\" (UniqueName: \"kubernetes.io/projected/978c1383-cc82-4788-beea-b1e15b25eb1f-kube-api-access-h4ql7\") pod \"horizon-5fbd85f69f-5jhnw\" (UID: \"978c1383-cc82-4788-beea-b1e15b25eb1f\") " pod="openstack/horizon-5fbd85f69f-5jhnw" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.554172 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8cd44d0-7395-44e1-9112-9e8bb4198b93-config-data\") pod \"ceilometer-0\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " pod="openstack/ceilometer-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.571089 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nljdr\" (UniqueName: \"kubernetes.io/projected/18ca6546-69fd-492d-81c5-bb18c56b045d-kube-api-access-nljdr\") pod \"dnsmasq-dns-5f4cfd6f6c-s7m64\" (UID: \"18ca6546-69fd-492d-81c5-bb18c56b045d\") " pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.610610 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-kube-api-access-g84kv" (OuterVolumeSpecName: "kube-api-access-g84kv") pod "7cc2bd68-d5b3-4416-8752-edf7ee85bf88" (UID: "7cc2bd68-d5b3-4416-8752-edf7ee85bf88"). InnerVolumeSpecName "kube-api-access-g84kv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.611383 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.616928 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-7jdcp" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.656051 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c2f952c-2122-43d9-b006-6967fd2b9029-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.656099 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/978c1383-cc82-4788-beea-b1e15b25eb1f-logs\") pod \"horizon-5fbd85f69f-5jhnw\" (UID: \"978c1383-cc82-4788-beea-b1e15b25eb1f\") " pod="openstack/horizon-5fbd85f69f-5jhnw" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.656118 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9c2f952c-2122-43d9-b006-6967fd2b9029-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.656164 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8cd44d0-7395-44e1-9112-9e8bb4198b93-run-httpd\") pod \"ceilometer-0\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " pod="openstack/ceilometer-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.656185 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/978c1383-cc82-4788-beea-b1e15b25eb1f-scripts\") pod \"horizon-5fbd85f69f-5jhnw\" (UID: \"978c1383-cc82-4788-beea-b1e15b25eb1f\") " pod="openstack/horizon-5fbd85f69f-5jhnw" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.656203 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd77w\" (UniqueName: \"kubernetes.io/projected/b8cd44d0-7395-44e1-9112-9e8bb4198b93-kube-api-access-jd77w\") pod \"ceilometer-0\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " pod="openstack/ceilometer-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.656255 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c2f952c-2122-43d9-b006-6967fd2b9029-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.656305 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.656392 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c2f952c-2122-43d9-b006-6967fd2b9029-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.656411 4810 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/978c1383-cc82-4788-beea-b1e15b25eb1f-config-data\") pod \"horizon-5fbd85f69f-5jhnw\" (UID: \"978c1383-cc82-4788-beea-b1e15b25eb1f\") " pod="openstack/horizon-5fbd85f69f-5jhnw" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.656429 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4ql7\" (UniqueName: \"kubernetes.io/projected/978c1383-cc82-4788-beea-b1e15b25eb1f-kube-api-access-h4ql7\") pod \"horizon-5fbd85f69f-5jhnw\" (UID: \"978c1383-cc82-4788-beea-b1e15b25eb1f\") " pod="openstack/horizon-5fbd85f69f-5jhnw" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.656442 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8cd44d0-7395-44e1-9112-9e8bb4198b93-config-data\") pod \"ceilometer-0\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " pod="openstack/ceilometer-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.656462 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8cd44d0-7395-44e1-9112-9e8bb4198b93-log-httpd\") pod \"ceilometer-0\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " pod="openstack/ceilometer-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.656483 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8cd44d0-7395-44e1-9112-9e8bb4198b93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " pod="openstack/ceilometer-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.656502 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8cd44d0-7395-44e1-9112-9e8bb4198b93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " pod="openstack/ceilometer-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.656539 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8cd44d0-7395-44e1-9112-9e8bb4198b93-scripts\") pod \"ceilometer-0\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " pod="openstack/ceilometer-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.656557 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2b86\" (UniqueName: \"kubernetes.io/projected/9c2f952c-2122-43d9-b006-6967fd2b9029-kube-api-access-n2b86\") pod \"glance-default-internal-api-0\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.656587 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/978c1383-cc82-4788-beea-b1e15b25eb1f-horizon-secret-key\") pod \"horizon-5fbd85f69f-5jhnw\" (UID: \"978c1383-cc82-4788-beea-b1e15b25eb1f\") " pod="openstack/horizon-5fbd85f69f-5jhnw" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.656602 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c2f952c-2122-43d9-b006-6967fd2b9029-logs\") pod \"glance-default-internal-api-0\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") " 
pod="openstack/glance-default-internal-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.656622 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c2f952c-2122-43d9-b006-6967fd2b9029-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.656672 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g84kv\" (UniqueName: \"kubernetes.io/projected/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-kube-api-access-g84kv\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.658115 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/978c1383-cc82-4788-beea-b1e15b25eb1f-scripts\") pod \"horizon-5fbd85f69f-5jhnw\" (UID: \"978c1383-cc82-4788-beea-b1e15b25eb1f\") " pod="openstack/horizon-5fbd85f69f-5jhnw" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.659172 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8cd44d0-7395-44e1-9112-9e8bb4198b93-log-httpd\") pod \"ceilometer-0\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " pod="openstack/ceilometer-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.662411 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/978c1383-cc82-4788-beea-b1e15b25eb1f-config-data\") pod \"horizon-5fbd85f69f-5jhnw\" (UID: \"978c1383-cc82-4788-beea-b1e15b25eb1f\") " pod="openstack/horizon-5fbd85f69f-5jhnw" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.665537 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.666998 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/978c1383-cc82-4788-beea-b1e15b25eb1f-logs\") pod \"horizon-5fbd85f69f-5jhnw\" (UID: \"978c1383-cc82-4788-beea-b1e15b25eb1f\") " pod="openstack/horizon-5fbd85f69f-5jhnw" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.670507 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9c2f952c-2122-43d9-b006-6967fd2b9029-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.674050 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8cd44d0-7395-44e1-9112-9e8bb4198b93-run-httpd\") pod \"ceilometer-0\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " pod="openstack/ceilometer-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.676622 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c2f952c-2122-43d9-b006-6967fd2b9029-logs\") pod \"glance-default-internal-api-0\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") " 
pod="openstack/glance-default-internal-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.676737 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-hmc6k" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.677735 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c2f952c-2122-43d9-b006-6967fd2b9029-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.716305 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/978c1383-cc82-4788-beea-b1e15b25eb1f-horizon-secret-key\") pod \"horizon-5fbd85f69f-5jhnw\" (UID: \"978c1383-cc82-4788-beea-b1e15b25eb1f\") " pod="openstack/horizon-5fbd85f69f-5jhnw" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.716800 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c2f952c-2122-43d9-b006-6967fd2b9029-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.718642 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c2f952c-2122-43d9-b006-6967fd2b9029-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.719025 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8cd44d0-7395-44e1-9112-9e8bb4198b93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " pod="openstack/ceilometer-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.719712 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c2f952c-2122-43d9-b006-6967fd2b9029-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.721869 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2b86\" (UniqueName: \"kubernetes.io/projected/9c2f952c-2122-43d9-b006-6967fd2b9029-kube-api-access-n2b86\") pod \"glance-default-internal-api-0\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.721905 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8cd44d0-7395-44e1-9112-9e8bb4198b93-scripts\") pod \"ceilometer-0\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " pod="openstack/ceilometer-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.723147 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8cd44d0-7395-44e1-9112-9e8bb4198b93-config-data\") pod \"ceilometer-0\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " pod="openstack/ceilometer-0" Feb 19 15:28:49 
crc kubenswrapper[4810]: I0219 15:28:49.737833 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8cd44d0-7395-44e1-9112-9e8bb4198b93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " pod="openstack/ceilometer-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.740458 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4ql7\" (UniqueName: \"kubernetes.io/projected/978c1383-cc82-4788-beea-b1e15b25eb1f-kube-api-access-h4ql7\") pod \"horizon-5fbd85f69f-5jhnw\" (UID: \"978c1383-cc82-4788-beea-b1e15b25eb1f\") " pod="openstack/horizon-5fbd85f69f-5jhnw" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.740698 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd77w\" (UniqueName: \"kubernetes.io/projected/b8cd44d0-7395-44e1-9112-9e8bb4198b93-kube-api-access-jd77w\") pod \"ceilometer-0\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " pod="openstack/ceilometer-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.781521 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.880732 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7cc2bd68-d5b3-4416-8752-edf7ee85bf88" (UID: "7cc2bd68-d5b3-4416-8752-edf7ee85bf88"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.951597 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7cc2bd68-d5b3-4416-8752-edf7ee85bf88" (UID: "7cc2bd68-d5b3-4416-8752-edf7ee85bf88"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.962389 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.962413 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.973919 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-config" (OuterVolumeSpecName: "config") pod "7cc2bd68-d5b3-4416-8752-edf7ee85bf88" (UID: "7cc2bd68-d5b3-4416-8752-edf7ee85bf88"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.974681 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7cc2bd68-d5b3-4416-8752-edf7ee85bf88" (UID: "7cc2bd68-d5b3-4416-8752-edf7ee85bf88"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.975290 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7cc2bd68-d5b3-4416-8752-edf7ee85bf88" (UID: "7cc2bd68-d5b3-4416-8752-edf7ee85bf88"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:49 crc kubenswrapper[4810]: I0219 15:28:49.994311 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5fbd85f69f-5jhnw" Feb 19 15:28:50 crc kubenswrapper[4810]: I0219 15:28:50.027288 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 15:28:50 crc kubenswrapper[4810]: I0219 15:28:50.064123 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:50 crc kubenswrapper[4810]: I0219 15:28:50.064150 4810 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:50 crc kubenswrapper[4810]: I0219 15:28:50.064161 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cc2bd68-d5b3-4416-8752-edf7ee85bf88-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:50 crc kubenswrapper[4810]: I0219 15:28:50.277533 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2f6g5"] Feb 19 15:28:50 crc kubenswrapper[4810]: I0219 15:28:50.277887 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-744c9b5447-bdbq7"] Feb 19 15:28:50 crc kubenswrapper[4810]: I0219 15:28:50.286786 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 15:28:50 crc kubenswrapper[4810]: I0219 15:28:50.292355 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Feb 19 15:28:50 crc kubenswrapper[4810]: I0219 15:28:50.313016 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744c9b5447-bdbq7" event={"ID":"32e59d75-7087-41ab-8571-5e8830baeec0","Type":"ContainerStarted","Data":"f5ecac10f7282bb7c7be67fe8fb7362bec2d603a414fe0b250f0f959893cb30e"} Feb 19 15:28:50 crc kubenswrapper[4810]: I0219 15:28:50.314591 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2f6g5" event={"ID":"5a2a904f-47f9-40da-bc5f-aba73c4c1c57","Type":"ContainerStarted","Data":"e83e19f0a88b6d8caed6cea2c56d5be687bd87f6e4bb330f1df96f263a9c3db4"} Feb 19 15:28:50 crc kubenswrapper[4810]: I0219 15:28:50.326848 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589c97547-9nnhp" event={"ID":"7cc2bd68-d5b3-4416-8752-edf7ee85bf88","Type":"ContainerDied","Data":"6cdfc0e390a65c8dc0f4b9e7aa2f77b612821b90bf368ae80a8f0a6cf661f875"} Feb 19 15:28:50 crc kubenswrapper[4810]: I0219 15:28:50.326899 4810 scope.go:117] "RemoveContainer" containerID="b3c769533587d67c2cf2579dac0d7fe9ba6b6dac50efc5e3e91af785d9090a99" Feb 19 15:28:50 crc kubenswrapper[4810]: I0219 15:28:50.326922 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589c97547-9nnhp" Feb 19 15:28:50 crc kubenswrapper[4810]: I0219 15:28:50.375846 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589c97547-9nnhp"] Feb 19 15:28:50 crc kubenswrapper[4810]: I0219 15:28:50.386692 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-589c97547-9nnhp"] Feb 19 15:28:50 crc kubenswrapper[4810]: I0219 15:28:50.422593 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 19 15:28:50 crc kubenswrapper[4810]: I0219 15:28:50.451486 4810 scope.go:117] "RemoveContainer" containerID="e8fb43b2b423ed4972c4fdcf61131fbebdb2452721ad8c709c2892ac5c467178" Feb 19 15:28:50 crc kubenswrapper[4810]: I0219 15:28:50.896746 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-j989d"] Feb 19 15:28:50 crc kubenswrapper[4810]: I0219 15:28:50.957633 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-849c785789-5xrh2"] Feb 19 15:28:50 crc kubenswrapper[4810]: I0219 15:28:50.975905 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-hmc6k"] Feb 19 15:28:50 crc kubenswrapper[4810]: I0219 15:28:50.999101 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.010682 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f4cfd6f6c-s7m64"] Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.019704 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-svmgl"] Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.031968 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-7jdcp"] Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.039698 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.280147 4810 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.297385 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.320077 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5fbd85f69f-5jhnw"] Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.345735 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5fbd85f69f-5jhnw"] Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.351259 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7a0789f1-099a-4f95-9626-a5ad7da804bc","Type":"ContainerStarted","Data":"123c419edcd4407399df0f1f1452e33c4f00d373cd040db3a818757999826cf9"} Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.359959 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-668f7d7fb5-l5kpq"] Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.361880 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hmc6k" event={"ID":"2024a783-c3f9-4e57-b00f-52bec164e64e","Type":"ContainerStarted","Data":"398a9dcb2e74ac56cd2827ea038790abeb16b5f6b3573a48b306842a166c3f44"} Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.361979 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-668f7d7fb5-l5kpq" Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.381383 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5","Type":"ContainerStarted","Data":"3750609201e07e960ce122b5fe6baad963df212daffe611a1c8ba29e4bf01f7a"} Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.396009 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" event={"ID":"18ca6546-69fd-492d-81c5-bb18c56b045d","Type":"ContainerStarted","Data":"3e6a529b000841e709c2e1d05c4d119a28c8f20c2ece39574181d8df78a6c626"} Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.409447 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-668f7d7fb5-l5kpq"] Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.482154 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cc2bd68-d5b3-4416-8752-edf7ee85bf88" path="/var/lib/kubelet/pods/7cc2bd68-d5b3-4416-8752-edf7ee85bf88/volumes" Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.482812 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2f6g5" event={"ID":"5a2a904f-47f9-40da-bc5f-aba73c4c1c57","Type":"ContainerStarted","Data":"e36236dacc44e9719f0a5616b325da89fd715c826d97ed6b2c660301840187d2"} Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.482842 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.482859 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"4be760b2-263c-4b89-8bdf-ecf98114a24f","Type":"ContainerStarted","Data":"69118f89362c4dfde4f05ac6e0fde30afdc3fb74ce3c8ffeaee8a6df7e8789a5"} Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.494192 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7jdcp" 
event={"ID":"36fe6fdb-2970-4773-8184-a2d16b8ca89a","Type":"ContainerStarted","Data":"63add7d4ad95ec66116fce5bf4ebf1368a0ffff3203d7f02a7456da5980c8ad7"} Feb 19 15:28:51 crc kubenswrapper[4810]: W0219 15:28:51.502002 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8cd44d0_7395_44e1_9112_9e8bb4198b93.slice/crio-89e242fd2781af0141617d648aab74ae173b61b87fc7bb8cdef6002ffaed43fc WatchSource:0}: Error finding container 89e242fd2781af0141617d648aab74ae173b61b87fc7bb8cdef6002ffaed43fc: Status 404 returned error can't find the container with id 89e242fd2781af0141617d648aab74ae173b61b87fc7bb8cdef6002ffaed43fc Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.512551 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.514810 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08eca88c-a4b4-461b-8568-ebbf54645272-logs\") pod \"horizon-668f7d7fb5-l5kpq\" (UID: \"08eca88c-a4b4-461b-8568-ebbf54645272\") " pod="openstack/horizon-668f7d7fb5-l5kpq" Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.514940 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb5qb\" (UniqueName: \"kubernetes.io/projected/08eca88c-a4b4-461b-8568-ebbf54645272-kube-api-access-wb5qb\") pod \"horizon-668f7d7fb5-l5kpq\" (UID: \"08eca88c-a4b4-461b-8568-ebbf54645272\") " pod="openstack/horizon-668f7d7fb5-l5kpq" Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.514967 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/08eca88c-a4b4-461b-8568-ebbf54645272-config-data\") pod \"horizon-668f7d7fb5-l5kpq\" (UID: \"08eca88c-a4b4-461b-8568-ebbf54645272\") " pod="openstack/horizon-668f7d7fb5-l5kpq" Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.514981 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/08eca88c-a4b4-461b-8568-ebbf54645272-horizon-secret-key\") pod \"horizon-668f7d7fb5-l5kpq\" (UID: \"08eca88c-a4b4-461b-8568-ebbf54645272\") " pod="openstack/horizon-668f7d7fb5-l5kpq" Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.514997 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08eca88c-a4b4-461b-8568-ebbf54645272-scripts\") pod \"horizon-668f7d7fb5-l5kpq\" (UID: \"08eca88c-a4b4-461b-8568-ebbf54645272\") " pod="openstack/horizon-668f7d7fb5-l5kpq" Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.524131 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.531559 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-849c785789-5xrh2" event={"ID":"c39b1dd9-9e73-4cca-aea6-e228f1ba5942","Type":"ContainerStarted","Data":"f7e61fd52ad6569907f8a84cbc32aef547486ffda35028466938ada8d5e3aa10"} Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.556573 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" 
event={"ID":"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc","Type":"ContainerStarted","Data":"c6f5d9b5a6b15c45dc5d760616281a4656e6d239a89c530245a55061c13bc709"} Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.556624 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc","Type":"ContainerStarted","Data":"5e6853b5e9878d2a03e52d891d8a223a4096ce551b9935e29c1d5c5f37ac41cb"} Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.556639 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc","Type":"ContainerStarted","Data":"1347fa88cd5df9f6d1e59a77fbeedca11e777e87af7e22e44f3db90ca1dd624c"} Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.557693 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.577069 4810 generic.go:334] "Generic (PLEG): container finished" podID="32e59d75-7087-41ab-8571-5e8830baeec0" containerID="d88c68d698d3972f0825f45fc0f2b6d882a24f69749acf74ad5f3d90f016e7f9" exitCode=0 Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.577184 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744c9b5447-bdbq7" event={"ID":"32e59d75-7087-41ab-8571-5e8830baeec0","Type":"ContainerDied","Data":"d88c68d698d3972f0825f45fc0f2b6d882a24f69749acf74ad5f3d90f016e7f9"} Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.585522 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.152:9322/\": dial tcp 10.217.0.152:9322: connect: connection refused" Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.595674 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-svmgl" event={"ID":"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5","Type":"ContainerStarted","Data":"ce811eb96390a8f8365be6dee0926b85f5365cb868957ed22165ebbf7d343712"} Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.601598 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-j989d" event={"ID":"4dd5dede-cf58-43c7-954e-b9b1d33ad8d1","Type":"ContainerStarted","Data":"af9963a31801a20460aedc2d93f7da81f8b9a7c2e7ec298ae21e257191169331"} Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.618233 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb5qb\" (UniqueName: \"kubernetes.io/projected/08eca88c-a4b4-461b-8568-ebbf54645272-kube-api-access-wb5qb\") pod \"horizon-668f7d7fb5-l5kpq\" (UID: \"08eca88c-a4b4-461b-8568-ebbf54645272\") " pod="openstack/horizon-668f7d7fb5-l5kpq" Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.618295 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/08eca88c-a4b4-461b-8568-ebbf54645272-config-data\") pod \"horizon-668f7d7fb5-l5kpq\" (UID: \"08eca88c-a4b4-461b-8568-ebbf54645272\") " pod="openstack/horizon-668f7d7fb5-l5kpq" Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.618316 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/08eca88c-a4b4-461b-8568-ebbf54645272-horizon-secret-key\") pod \"horizon-668f7d7fb5-l5kpq\" (UID: 
\"08eca88c-a4b4-461b-8568-ebbf54645272\") " pod="openstack/horizon-668f7d7fb5-l5kpq" Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.618357 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08eca88c-a4b4-461b-8568-ebbf54645272-scripts\") pod \"horizon-668f7d7fb5-l5kpq\" (UID: \"08eca88c-a4b4-461b-8568-ebbf54645272\") " pod="openstack/horizon-668f7d7fb5-l5kpq" Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.618512 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08eca88c-a4b4-461b-8568-ebbf54645272-logs\") pod \"horizon-668f7d7fb5-l5kpq\" (UID: \"08eca88c-a4b4-461b-8568-ebbf54645272\") " pod="openstack/horizon-668f7d7fb5-l5kpq" Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.620672 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/08eca88c-a4b4-461b-8568-ebbf54645272-config-data\") pod \"horizon-668f7d7fb5-l5kpq\" (UID: \"08eca88c-a4b4-461b-8568-ebbf54645272\") " pod="openstack/horizon-668f7d7fb5-l5kpq" Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.622104 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08eca88c-a4b4-461b-8568-ebbf54645272-logs\") pod \"horizon-668f7d7fb5-l5kpq\" (UID: \"08eca88c-a4b4-461b-8568-ebbf54645272\") " pod="openstack/horizon-668f7d7fb5-l5kpq" Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.624387 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08eca88c-a4b4-461b-8568-ebbf54645272-scripts\") pod \"horizon-668f7d7fb5-l5kpq\" (UID: \"08eca88c-a4b4-461b-8568-ebbf54645272\") " pod="openstack/horizon-668f7d7fb5-l5kpq" Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.629934 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/08eca88c-a4b4-461b-8568-ebbf54645272-horizon-secret-key\") pod \"horizon-668f7d7fb5-l5kpq\" (UID: \"08eca88c-a4b4-461b-8568-ebbf54645272\") " pod="openstack/horizon-668f7d7fb5-l5kpq" Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.645116 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.652952 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb5qb\" (UniqueName: \"kubernetes.io/projected/08eca88c-a4b4-461b-8568-ebbf54645272-kube-api-access-wb5qb\") pod \"horizon-668f7d7fb5-l5kpq\" (UID: \"08eca88c-a4b4-461b-8568-ebbf54645272\") " pod="openstack/horizon-668f7d7fb5-l5kpq" Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.722185 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-668f7d7fb5-l5kpq" Feb 19 15:28:51 crc kubenswrapper[4810]: I0219 15:28:51.925244 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-2f6g5" podStartSLOduration=3.925229479 podStartE2EDuration="3.925229479s" podCreationTimestamp="2026-02-19 15:28:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:28:51.924648905 +0000 UTC m=+1161.406679029" watchObservedRunningTime="2026-02-19 15:28:51.925229479 +0000 UTC m=+1161.407259603" Feb 19 15:28:52 crc kubenswrapper[4810]: I0219 15:28:52.043292 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=4.043270839 podStartE2EDuration="4.043270839s" podCreationTimestamp="2026-02-19 15:28:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:28:51.994006053 +0000 UTC m=+1161.476036177" watchObservedRunningTime="2026-02-19 15:28:52.043270839 +0000 UTC m=+1161.525300963" Feb 19 15:28:52 crc kubenswrapper[4810]: I0219 15:28:52.067471 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:52 crc kubenswrapper[4810]: I0219 15:28:52.080641 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:52 crc kubenswrapper[4810]: I0219 15:28:52.520050 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-668f7d7fb5-l5kpq"] Feb 19 15:28:52 crc kubenswrapper[4810]: I0219 15:28:52.618644 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-j989d" event={"ID":"4dd5dede-cf58-43c7-954e-b9b1d33ad8d1","Type":"ContainerStarted","Data":"da258067ce2c7912909dc7c937b6ad45df02bf5c8504937ad6d6f0ea0359724a"} Feb 19 15:28:52 crc kubenswrapper[4810]: I0219 15:28:52.625132 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9c2f952c-2122-43d9-b006-6967fd2b9029","Type":"ContainerStarted","Data":"84ac3613a017c9c6f49ab1cf6c99c11f9ee32460de5f526447f4c9de546b4b3b"} Feb 19 15:28:52 crc kubenswrapper[4810]: I0219 15:28:52.629551 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fbd85f69f-5jhnw" event={"ID":"978c1383-cc82-4788-beea-b1e15b25eb1f","Type":"ContainerStarted","Data":"a02086ae9fd1e62e23bb222b4fb6d0999b14d605294e6aa8f70c1665e61a1617"} Feb 19 15:28:52 crc kubenswrapper[4810]: I0219 15:28:52.636436 4810 generic.go:334] "Generic (PLEG): container finished" podID="18ca6546-69fd-492d-81c5-bb18c56b045d" containerID="7bc85bb988b4afd58694705b3dae68dca31213574b00f92c71f4c77c5edfdf98" exitCode=0 Feb 19 15:28:52 crc kubenswrapper[4810]: I0219 15:28:52.636519 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" event={"ID":"18ca6546-69fd-492d-81c5-bb18c56b045d","Type":"ContainerDied","Data":"7bc85bb988b4afd58694705b3dae68dca31213574b00f92c71f4c77c5edfdf98"} Feb 19 15:28:52 crc kubenswrapper[4810]: I0219 15:28:52.642122 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8cd44d0-7395-44e1-9112-9e8bb4198b93","Type":"ContainerStarted","Data":"89e242fd2781af0141617d648aab74ae173b61b87fc7bb8cdef6002ffaed43fc"} Feb 19 15:28:52 crc kubenswrapper[4810]: 
I0219 15:28:52.642706 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc" containerName="watcher-api-log" containerID="cri-o://5e6853b5e9878d2a03e52d891d8a223a4096ce551b9935e29c1d5c5f37ac41cb" gracePeriod=30 Feb 19 15:28:52 crc kubenswrapper[4810]: I0219 15:28:52.642831 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc" containerName="watcher-api" containerID="cri-o://c6f5d9b5a6b15c45dc5d760616281a4656e6d239a89c530245a55061c13bc709" gracePeriod=30 Feb 19 15:28:52 crc kubenswrapper[4810]: I0219 15:28:52.648995 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 19 15:28:52 crc kubenswrapper[4810]: I0219 15:28:52.676019 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-j989d" podStartSLOduration=4.675999348 podStartE2EDuration="4.675999348s" podCreationTimestamp="2026-02-19 15:28:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:28:52.635925017 +0000 UTC m=+1162.117955141" watchObservedRunningTime="2026-02-19 15:28:52.675999348 +0000 UTC m=+1162.158029472" Feb 19 15:28:53 crc kubenswrapper[4810]: I0219 15:28:53.654450 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7a0789f1-099a-4f95-9626-a5ad7da804bc","Type":"ContainerStarted","Data":"039aba0ed63f1dd6f66e34144e1de83684475b75135af6fd57bb70ce97ffc6b0"} Feb 19 15:28:53 crc kubenswrapper[4810]: I0219 15:28:53.656869 4810 generic.go:334] "Generic (PLEG): container finished" podID="2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc" containerID="5e6853b5e9878d2a03e52d891d8a223a4096ce551b9935e29c1d5c5f37ac41cb" exitCode=143 Feb 19 15:28:53 crc kubenswrapper[4810]: I0219 15:28:53.656950 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc","Type":"ContainerDied","Data":"5e6853b5e9878d2a03e52d891d8a223a4096ce551b9935e29c1d5c5f37ac41cb"} Feb 19 15:28:54 crc kubenswrapper[4810]: I0219 15:28:54.124593 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 19 15:28:54 crc kubenswrapper[4810]: W0219 15:28:54.613876 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08eca88c_a4b4_461b_8568_ebbf54645272.slice/crio-b936d9f6773d6dc856145fa11df03aba298425ab6b8a943cdd257e7a943da84e WatchSource:0}: Error finding container b936d9f6773d6dc856145fa11df03aba298425ab6b8a943cdd257e7a943da84e: Status 404 returned error can't find the container with id b936d9f6773d6dc856145fa11df03aba298425ab6b8a943cdd257e7a943da84e Feb 19 15:28:54 crc kubenswrapper[4810]: I0219 15:28:54.674282 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9c2f952c-2122-43d9-b006-6967fd2b9029","Type":"ContainerStarted","Data":"ba83a3830aa23a7f50a84bf64a23008ce3bd4119d839094ddf609dc429db2d93"} Feb 19 15:28:54 crc kubenswrapper[4810]: I0219 15:28:54.676436 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744c9b5447-bdbq7" 
event={"ID":"32e59d75-7087-41ab-8571-5e8830baeec0","Type":"ContainerDied","Data":"f5ecac10f7282bb7c7be67fe8fb7362bec2d603a414fe0b250f0f959893cb30e"} Feb 19 15:28:54 crc kubenswrapper[4810]: I0219 15:28:54.676458 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5ecac10f7282bb7c7be67fe8fb7362bec2d603a414fe0b250f0f959893cb30e" Feb 19 15:28:54 crc kubenswrapper[4810]: I0219 15:28:54.679586 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-668f7d7fb5-l5kpq" event={"ID":"08eca88c-a4b4-461b-8568-ebbf54645272","Type":"ContainerStarted","Data":"b936d9f6773d6dc856145fa11df03aba298425ab6b8a943cdd257e7a943da84e"} Feb 19 15:28:54 crc kubenswrapper[4810]: I0219 15:28:54.727693 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744c9b5447-bdbq7" Feb 19 15:28:54 crc kubenswrapper[4810]: I0219 15:28:54.801562 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-ovsdbserver-nb\") pod \"32e59d75-7087-41ab-8571-5e8830baeec0\" (UID: \"32e59d75-7087-41ab-8571-5e8830baeec0\") " Feb 19 15:28:54 crc kubenswrapper[4810]: I0219 15:28:54.801833 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-config\") pod \"32e59d75-7087-41ab-8571-5e8830baeec0\" (UID: \"32e59d75-7087-41ab-8571-5e8830baeec0\") " Feb 19 15:28:54 crc kubenswrapper[4810]: I0219 15:28:54.801859 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-dns-swift-storage-0\") pod \"32e59d75-7087-41ab-8571-5e8830baeec0\" (UID: \"32e59d75-7087-41ab-8571-5e8830baeec0\") " Feb 19 15:28:54 crc kubenswrapper[4810]: I0219 15:28:54.801894 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-dns-svc\") pod \"32e59d75-7087-41ab-8571-5e8830baeec0\" (UID: \"32e59d75-7087-41ab-8571-5e8830baeec0\") " Feb 19 15:28:54 crc kubenswrapper[4810]: I0219 15:28:54.801922 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-ovsdbserver-sb\") pod \"32e59d75-7087-41ab-8571-5e8830baeec0\" (UID: \"32e59d75-7087-41ab-8571-5e8830baeec0\") " Feb 19 15:28:54 crc kubenswrapper[4810]: I0219 15:28:54.802047 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cf6bv\" (UniqueName: \"kubernetes.io/projected/32e59d75-7087-41ab-8571-5e8830baeec0-kube-api-access-cf6bv\") pod \"32e59d75-7087-41ab-8571-5e8830baeec0\" (UID: \"32e59d75-7087-41ab-8571-5e8830baeec0\") " Feb 19 15:28:54 crc kubenswrapper[4810]: I0219 15:28:54.820278 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32e59d75-7087-41ab-8571-5e8830baeec0-kube-api-access-cf6bv" (OuterVolumeSpecName: "kube-api-access-cf6bv") pod "32e59d75-7087-41ab-8571-5e8830baeec0" (UID: "32e59d75-7087-41ab-8571-5e8830baeec0"). InnerVolumeSpecName "kube-api-access-cf6bv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:28:54 crc kubenswrapper[4810]: I0219 15:28:54.828410 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "32e59d75-7087-41ab-8571-5e8830baeec0" (UID: "32e59d75-7087-41ab-8571-5e8830baeec0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:54 crc kubenswrapper[4810]: I0219 15:28:54.835737 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "32e59d75-7087-41ab-8571-5e8830baeec0" (UID: "32e59d75-7087-41ab-8571-5e8830baeec0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:54 crc kubenswrapper[4810]: I0219 15:28:54.855711 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "32e59d75-7087-41ab-8571-5e8830baeec0" (UID: "32e59d75-7087-41ab-8571-5e8830baeec0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:54 crc kubenswrapper[4810]: I0219 15:28:54.859379 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "32e59d75-7087-41ab-8571-5e8830baeec0" (UID: "32e59d75-7087-41ab-8571-5e8830baeec0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:54 crc kubenswrapper[4810]: I0219 15:28:54.865992 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-config" (OuterVolumeSpecName: "config") pod "32e59d75-7087-41ab-8571-5e8830baeec0" (UID: "32e59d75-7087-41ab-8571-5e8830baeec0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:28:54 crc kubenswrapper[4810]: I0219 15:28:54.908699 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:54 crc kubenswrapper[4810]: I0219 15:28:54.908738 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cf6bv\" (UniqueName: \"kubernetes.io/projected/32e59d75-7087-41ab-8571-5e8830baeec0-kube-api-access-cf6bv\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:54 crc kubenswrapper[4810]: I0219 15:28:54.908749 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:54 crc kubenswrapper[4810]: I0219 15:28:54.908758 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:54 crc kubenswrapper[4810]: I0219 15:28:54.908766 4810 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:54 crc kubenswrapper[4810]: I0219 15:28:54.908774 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32e59d75-7087-41ab-8571-5e8830baeec0-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 15:28:55 crc kubenswrapper[4810]: I0219 15:28:55.454192 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Feb 19 15:28:55 crc kubenswrapper[4810]: I0219 15:28:55.702300 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-744c9b5447-bdbq7" Feb 19 15:28:55 crc kubenswrapper[4810]: I0219 15:28:55.750894 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-744c9b5447-bdbq7"] Feb 19 15:28:55 crc kubenswrapper[4810]: I0219 15:28:55.764351 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-744c9b5447-bdbq7"] Feb 19 15:28:56 crc kubenswrapper[4810]: I0219 15:28:56.432563 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.152:9322/\": read tcp 10.217.0.2:45822->10.217.0.152:9322: read: connection reset by peer" Feb 19 15:28:56 crc kubenswrapper[4810]: I0219 15:28:56.716978 4810 generic.go:334] "Generic (PLEG): container finished" podID="2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc" containerID="c6f5d9b5a6b15c45dc5d760616281a4656e6d239a89c530245a55061c13bc709" exitCode=0 Feb 19 15:28:56 crc kubenswrapper[4810]: I0219 15:28:56.717036 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc","Type":"ContainerDied","Data":"c6f5d9b5a6b15c45dc5d760616281a4656e6d239a89c530245a55061c13bc709"} Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.456775 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32e59d75-7087-41ab-8571-5e8830baeec0" path="/var/lib/kubelet/pods/32e59d75-7087-41ab-8571-5e8830baeec0/volumes" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.746301 4810 generic.go:334] "Generic (PLEG): container finished" podID="5a2a904f-47f9-40da-bc5f-aba73c4c1c57" containerID="e36236dacc44e9719f0a5616b325da89fd715c826d97ed6b2c660301840187d2" exitCode=0 Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.746353 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2f6g5" event={"ID":"5a2a904f-47f9-40da-bc5f-aba73c4c1c57","Type":"ContainerDied","Data":"e36236dacc44e9719f0a5616b325da89fd715c826d97ed6b2c660301840187d2"} Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.771118 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-849c785789-5xrh2"] Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.815943 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-869f57798-ngdtl"] Feb 19 15:28:57 crc kubenswrapper[4810]: E0219 15:28:57.816531 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32e59d75-7087-41ab-8571-5e8830baeec0" containerName="init" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.816551 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="32e59d75-7087-41ab-8571-5e8830baeec0" containerName="init" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.816816 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="32e59d75-7087-41ab-8571-5e8830baeec0" containerName="init" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.818036 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-869f57798-ngdtl" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.822792 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.825395 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-869f57798-ngdtl"] Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.877239 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/58c845f2-0069-4ee5-9d4b-b5871e078926-horizon-secret-key\") pod \"horizon-869f57798-ngdtl\" (UID: \"58c845f2-0069-4ee5-9d4b-b5871e078926\") " pod="openstack/horizon-869f57798-ngdtl" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.877294 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58c845f2-0069-4ee5-9d4b-b5871e078926-combined-ca-bundle\") pod \"horizon-869f57798-ngdtl\" (UID: \"58c845f2-0069-4ee5-9d4b-b5871e078926\") " pod="openstack/horizon-869f57798-ngdtl" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.877319 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58c845f2-0069-4ee5-9d4b-b5871e078926-logs\") pod \"horizon-869f57798-ngdtl\" (UID: \"58c845f2-0069-4ee5-9d4b-b5871e078926\") " pod="openstack/horizon-869f57798-ngdtl" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.877458 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58c845f2-0069-4ee5-9d4b-b5871e078926-config-data\") pod \"horizon-869f57798-ngdtl\" (UID: \"58c845f2-0069-4ee5-9d4b-b5871e078926\") " pod="openstack/horizon-869f57798-ngdtl" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.877503 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/58c845f2-0069-4ee5-9d4b-b5871e078926-scripts\") pod \"horizon-869f57798-ngdtl\" (UID: \"58c845f2-0069-4ee5-9d4b-b5871e078926\") " pod="openstack/horizon-869f57798-ngdtl" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.877543 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/58c845f2-0069-4ee5-9d4b-b5871e078926-horizon-tls-certs\") pod \"horizon-869f57798-ngdtl\" (UID: \"58c845f2-0069-4ee5-9d4b-b5871e078926\") " pod="openstack/horizon-869f57798-ngdtl" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.877634 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjb4j\" (UniqueName: \"kubernetes.io/projected/58c845f2-0069-4ee5-9d4b-b5871e078926-kube-api-access-zjb4j\") pod \"horizon-869f57798-ngdtl\" (UID: \"58c845f2-0069-4ee5-9d4b-b5871e078926\") " pod="openstack/horizon-869f57798-ngdtl" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.898715 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-668f7d7fb5-l5kpq"] Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.930191 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5f56498b8d-9gwmf"] Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.932958 4810 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5f56498b8d-9gwmf" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.953945 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5f56498b8d-9gwmf"] Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.982359 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/737d6629-747f-4d16-a545-d0070c20fe5d-horizon-secret-key\") pod \"horizon-5f56498b8d-9gwmf\" (UID: \"737d6629-747f-4d16-a545-d0070c20fe5d\") " pod="openstack/horizon-5f56498b8d-9gwmf" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.982422 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjb4j\" (UniqueName: \"kubernetes.io/projected/58c845f2-0069-4ee5-9d4b-b5871e078926-kube-api-access-zjb4j\") pod \"horizon-869f57798-ngdtl\" (UID: \"58c845f2-0069-4ee5-9d4b-b5871e078926\") " pod="openstack/horizon-869f57798-ngdtl" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.982493 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/58c845f2-0069-4ee5-9d4b-b5871e078926-horizon-secret-key\") pod \"horizon-869f57798-ngdtl\" (UID: \"58c845f2-0069-4ee5-9d4b-b5871e078926\") " pod="openstack/horizon-869f57798-ngdtl" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.982509 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58c845f2-0069-4ee5-9d4b-b5871e078926-combined-ca-bundle\") pod \"horizon-869f57798-ngdtl\" (UID: \"58c845f2-0069-4ee5-9d4b-b5871e078926\") " pod="openstack/horizon-869f57798-ngdtl" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.982526 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58c845f2-0069-4ee5-9d4b-b5871e078926-logs\") pod \"horizon-869f57798-ngdtl\" (UID: \"58c845f2-0069-4ee5-9d4b-b5871e078926\") " pod="openstack/horizon-869f57798-ngdtl" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.982566 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mnzp\" (UniqueName: \"kubernetes.io/projected/737d6629-747f-4d16-a545-d0070c20fe5d-kube-api-access-5mnzp\") pod \"horizon-5f56498b8d-9gwmf\" (UID: \"737d6629-747f-4d16-a545-d0070c20fe5d\") " pod="openstack/horizon-5f56498b8d-9gwmf" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.982585 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/737d6629-747f-4d16-a545-d0070c20fe5d-logs\") pod \"horizon-5f56498b8d-9gwmf\" (UID: \"737d6629-747f-4d16-a545-d0070c20fe5d\") " pod="openstack/horizon-5f56498b8d-9gwmf" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.982654 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/737d6629-747f-4d16-a545-d0070c20fe5d-horizon-tls-certs\") pod \"horizon-5f56498b8d-9gwmf\" (UID: \"737d6629-747f-4d16-a545-d0070c20fe5d\") " pod="openstack/horizon-5f56498b8d-9gwmf" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.982687 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737d6629-747f-4d16-a545-d0070c20fe5d-combined-ca-bundle\") pod \"horizon-5f56498b8d-9gwmf\" (UID: \"737d6629-747f-4d16-a545-d0070c20fe5d\") " pod="openstack/horizon-5f56498b8d-9gwmf" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.982707 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/737d6629-747f-4d16-a545-d0070c20fe5d-config-data\") pod \"horizon-5f56498b8d-9gwmf\" (UID: \"737d6629-747f-4d16-a545-d0070c20fe5d\") " pod="openstack/horizon-5f56498b8d-9gwmf" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.982736 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58c845f2-0069-4ee5-9d4b-b5871e078926-config-data\") pod \"horizon-869f57798-ngdtl\" (UID: \"58c845f2-0069-4ee5-9d4b-b5871e078926\") " pod="openstack/horizon-869f57798-ngdtl" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.982756 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/737d6629-747f-4d16-a545-d0070c20fe5d-scripts\") pod \"horizon-5f56498b8d-9gwmf\" (UID: \"737d6629-747f-4d16-a545-d0070c20fe5d\") " pod="openstack/horizon-5f56498b8d-9gwmf" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.982782 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/58c845f2-0069-4ee5-9d4b-b5871e078926-scripts\") pod \"horizon-869f57798-ngdtl\" (UID: \"58c845f2-0069-4ee5-9d4b-b5871e078926\") " pod="openstack/horizon-869f57798-ngdtl" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.982835 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/58c845f2-0069-4ee5-9d4b-b5871e078926-horizon-tls-certs\") pod \"horizon-869f57798-ngdtl\" (UID: \"58c845f2-0069-4ee5-9d4b-b5871e078926\") " pod="openstack/horizon-869f57798-ngdtl" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.987763 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58c845f2-0069-4ee5-9d4b-b5871e078926-logs\") pod \"horizon-869f57798-ngdtl\" (UID: \"58c845f2-0069-4ee5-9d4b-b5871e078926\") " pod="openstack/horizon-869f57798-ngdtl" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.988438 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/58c845f2-0069-4ee5-9d4b-b5871e078926-scripts\") pod \"horizon-869f57798-ngdtl\" (UID: \"58c845f2-0069-4ee5-9d4b-b5871e078926\") " pod="openstack/horizon-869f57798-ngdtl" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.988800 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58c845f2-0069-4ee5-9d4b-b5871e078926-config-data\") pod \"horizon-869f57798-ngdtl\" (UID: \"58c845f2-0069-4ee5-9d4b-b5871e078926\") " pod="openstack/horizon-869f57798-ngdtl" Feb 19 15:28:57 crc kubenswrapper[4810]: I0219 15:28:57.995922 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/58c845f2-0069-4ee5-9d4b-b5871e078926-horizon-tls-certs\") pod \"horizon-869f57798-ngdtl\" (UID: \"58c845f2-0069-4ee5-9d4b-b5871e078926\") " 
pod="openstack/horizon-869f57798-ngdtl" Feb 19 15:28:58 crc kubenswrapper[4810]: I0219 15:28:58.009482 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/58c845f2-0069-4ee5-9d4b-b5871e078926-horizon-secret-key\") pod \"horizon-869f57798-ngdtl\" (UID: \"58c845f2-0069-4ee5-9d4b-b5871e078926\") " pod="openstack/horizon-869f57798-ngdtl" Feb 19 15:28:58 crc kubenswrapper[4810]: I0219 15:28:58.010018 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58c845f2-0069-4ee5-9d4b-b5871e078926-combined-ca-bundle\") pod \"horizon-869f57798-ngdtl\" (UID: \"58c845f2-0069-4ee5-9d4b-b5871e078926\") " pod="openstack/horizon-869f57798-ngdtl" Feb 19 15:28:58 crc kubenswrapper[4810]: I0219 15:28:58.024892 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjb4j\" (UniqueName: \"kubernetes.io/projected/58c845f2-0069-4ee5-9d4b-b5871e078926-kube-api-access-zjb4j\") pod \"horizon-869f57798-ngdtl\" (UID: \"58c845f2-0069-4ee5-9d4b-b5871e078926\") " pod="openstack/horizon-869f57798-ngdtl" Feb 19 15:28:58 crc kubenswrapper[4810]: I0219 15:28:58.084247 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/737d6629-747f-4d16-a545-d0070c20fe5d-horizon-secret-key\") pod \"horizon-5f56498b8d-9gwmf\" (UID: \"737d6629-747f-4d16-a545-d0070c20fe5d\") " pod="openstack/horizon-5f56498b8d-9gwmf" Feb 19 15:28:58 crc kubenswrapper[4810]: I0219 15:28:58.084499 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mnzp\" (UniqueName: \"kubernetes.io/projected/737d6629-747f-4d16-a545-d0070c20fe5d-kube-api-access-5mnzp\") pod \"horizon-5f56498b8d-9gwmf\" (UID: \"737d6629-747f-4d16-a545-d0070c20fe5d\") " pod="openstack/horizon-5f56498b8d-9gwmf" Feb 19 15:28:58 crc kubenswrapper[4810]: I0219 15:28:58.084536 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/737d6629-747f-4d16-a545-d0070c20fe5d-logs\") pod \"horizon-5f56498b8d-9gwmf\" (UID: \"737d6629-747f-4d16-a545-d0070c20fe5d\") " pod="openstack/horizon-5f56498b8d-9gwmf" Feb 19 15:28:58 crc kubenswrapper[4810]: I0219 15:28:58.084566 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/737d6629-747f-4d16-a545-d0070c20fe5d-horizon-tls-certs\") pod \"horizon-5f56498b8d-9gwmf\" (UID: \"737d6629-747f-4d16-a545-d0070c20fe5d\") " pod="openstack/horizon-5f56498b8d-9gwmf" Feb 19 15:28:58 crc kubenswrapper[4810]: I0219 15:28:58.084602 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737d6629-747f-4d16-a545-d0070c20fe5d-combined-ca-bundle\") pod \"horizon-5f56498b8d-9gwmf\" (UID: \"737d6629-747f-4d16-a545-d0070c20fe5d\") " pod="openstack/horizon-5f56498b8d-9gwmf" Feb 19 15:28:58 crc kubenswrapper[4810]: I0219 15:28:58.084629 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/737d6629-747f-4d16-a545-d0070c20fe5d-config-data\") pod \"horizon-5f56498b8d-9gwmf\" (UID: \"737d6629-747f-4d16-a545-d0070c20fe5d\") " pod="openstack/horizon-5f56498b8d-9gwmf" Feb 19 15:28:58 crc kubenswrapper[4810]: I0219 15:28:58.084677 4810 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/737d6629-747f-4d16-a545-d0070c20fe5d-scripts\") pod \"horizon-5f56498b8d-9gwmf\" (UID: \"737d6629-747f-4d16-a545-d0070c20fe5d\") " pod="openstack/horizon-5f56498b8d-9gwmf" Feb 19 15:28:58 crc kubenswrapper[4810]: I0219 15:28:58.085827 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/737d6629-747f-4d16-a545-d0070c20fe5d-scripts\") pod \"horizon-5f56498b8d-9gwmf\" (UID: \"737d6629-747f-4d16-a545-d0070c20fe5d\") " pod="openstack/horizon-5f56498b8d-9gwmf" Feb 19 15:28:58 crc kubenswrapper[4810]: I0219 15:28:58.091741 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/737d6629-747f-4d16-a545-d0070c20fe5d-horizon-tls-certs\") pod \"horizon-5f56498b8d-9gwmf\" (UID: \"737d6629-747f-4d16-a545-d0070c20fe5d\") " pod="openstack/horizon-5f56498b8d-9gwmf" Feb 19 15:28:58 crc kubenswrapper[4810]: I0219 15:28:58.092455 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/737d6629-747f-4d16-a545-d0070c20fe5d-horizon-secret-key\") pod \"horizon-5f56498b8d-9gwmf\" (UID: \"737d6629-747f-4d16-a545-d0070c20fe5d\") " pod="openstack/horizon-5f56498b8d-9gwmf" Feb 19 15:28:58 crc kubenswrapper[4810]: I0219 15:28:58.092781 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/737d6629-747f-4d16-a545-d0070c20fe5d-logs\") pod \"horizon-5f56498b8d-9gwmf\" (UID: \"737d6629-747f-4d16-a545-d0070c20fe5d\") " pod="openstack/horizon-5f56498b8d-9gwmf" Feb 19 15:28:58 crc kubenswrapper[4810]: I0219 15:28:58.093016 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/737d6629-747f-4d16-a545-d0070c20fe5d-config-data\") pod \"horizon-5f56498b8d-9gwmf\" (UID: \"737d6629-747f-4d16-a545-d0070c20fe5d\") " pod="openstack/horizon-5f56498b8d-9gwmf" Feb 19 15:28:58 crc kubenswrapper[4810]: I0219 15:28:58.100048 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737d6629-747f-4d16-a545-d0070c20fe5d-combined-ca-bundle\") pod \"horizon-5f56498b8d-9gwmf\" (UID: \"737d6629-747f-4d16-a545-d0070c20fe5d\") " pod="openstack/horizon-5f56498b8d-9gwmf" Feb 19 15:28:58 crc kubenswrapper[4810]: I0219 15:28:58.105082 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mnzp\" (UniqueName: \"kubernetes.io/projected/737d6629-747f-4d16-a545-d0070c20fe5d-kube-api-access-5mnzp\") pod \"horizon-5f56498b8d-9gwmf\" (UID: \"737d6629-747f-4d16-a545-d0070c20fe5d\") " pod="openstack/horizon-5f56498b8d-9gwmf" Feb 19 15:28:58 crc kubenswrapper[4810]: I0219 15:28:58.161503 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-869f57798-ngdtl" Feb 19 15:28:58 crc kubenswrapper[4810]: I0219 15:28:58.271981 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5f56498b8d-9gwmf" Feb 19 15:28:59 crc kubenswrapper[4810]: I0219 15:28:59.125374 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.152:9322/\": dial tcp 10.217.0.152:9322: connect: connection refused" Feb 19 15:29:04 crc kubenswrapper[4810]: I0219 15:29:04.127580 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.152:9322/\": dial tcp 10.217.0.152:9322: connect: connection refused" Feb 19 15:29:04 crc kubenswrapper[4810]: I0219 15:29:04.128515 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 19 15:29:07 crc kubenswrapper[4810]: E0219 15:29:07.475512 4810 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.159:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest" Feb 19 15:29:07 crc kubenswrapper[4810]: E0219 15:29:07.475889 4810 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.159:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest" Feb 19 15:29:07 crc kubenswrapper[4810]: E0219 15:29:07.476050 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:38.102.83.159:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nc4h668h666h568hb4h5ddh659h67fhb5h687h599h678hd7h5cbh586h6bh74h546h68dhch97h574h55bh79h59fh96h9fh68bh656hf4h64h5b4q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jd77w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(b8cd44d0-7395-44e1-9112-9e8bb4198b93): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 15:29:09 crc kubenswrapper[4810]: I0219 15:29:09.125934 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.152:9322/\": dial tcp 10.217.0.152:9322: connect: connection refused" Feb 19 15:29:09 crc kubenswrapper[4810]: I0219 15:29:09.564102 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2f6g5" Feb 19 15:29:09 crc kubenswrapper[4810]: E0219 15:29:09.571311 4810 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.159:5001/podified-master-centos10/openstack-horizon:watcher_latest" Feb 19 15:29:09 crc kubenswrapper[4810]: E0219 15:29:09.571371 4810 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.159:5001/podified-master-centos10/openstack-horizon:watcher_latest" Feb 19 15:29:09 crc kubenswrapper[4810]: E0219 15:29:09.571487 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.159:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh59h668h67h5b5h5b9h575h578h548h644h56bhc4h557h674h98h74h7chc5h66h68h566h9dh5cch68fh577h8fh695h544h64ch5fh9h9fq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h4ql7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5fbd85f69f-5jhnw_openstack(978c1383-cc82-4788-beea-b1e15b25eb1f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 15:29:09 crc kubenswrapper[4810]: E0219 15:29:09.576618 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.159:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-5fbd85f69f-5jhnw" podUID="978c1383-cc82-4788-beea-b1e15b25eb1f" Feb 19 15:29:09 crc kubenswrapper[4810]: I0219 15:29:09.719170 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-scripts\") pod \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\" (UID: \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\") " Feb 19 15:29:09 crc kubenswrapper[4810]: I0219 15:29:09.719215 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-combined-ca-bundle\") pod \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\" (UID: \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\") " Feb 19 15:29:09 crc kubenswrapper[4810]: I0219 15:29:09.719239 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-credential-keys\") pod \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\" (UID: \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\") " Feb 19 15:29:09 crc 
kubenswrapper[4810]: I0219 15:29:09.719349 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-fernet-keys\") pod \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\" (UID: \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\") " Feb 19 15:29:09 crc kubenswrapper[4810]: I0219 15:29:09.719499 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-config-data\") pod \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\" (UID: \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\") " Feb 19 15:29:09 crc kubenswrapper[4810]: I0219 15:29:09.719524 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlnqn\" (UniqueName: \"kubernetes.io/projected/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-kube-api-access-qlnqn\") pod \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\" (UID: \"5a2a904f-47f9-40da-bc5f-aba73c4c1c57\") " Feb 19 15:29:09 crc kubenswrapper[4810]: I0219 15:29:09.728075 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-kube-api-access-qlnqn" (OuterVolumeSpecName: "kube-api-access-qlnqn") pod "5a2a904f-47f9-40da-bc5f-aba73c4c1c57" (UID: "5a2a904f-47f9-40da-bc5f-aba73c4c1c57"). InnerVolumeSpecName "kube-api-access-qlnqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:29:09 crc kubenswrapper[4810]: I0219 15:29:09.728455 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5a2a904f-47f9-40da-bc5f-aba73c4c1c57" (UID: "5a2a904f-47f9-40da-bc5f-aba73c4c1c57"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:09 crc kubenswrapper[4810]: I0219 15:29:09.728593 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5a2a904f-47f9-40da-bc5f-aba73c4c1c57" (UID: "5a2a904f-47f9-40da-bc5f-aba73c4c1c57"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:09 crc kubenswrapper[4810]: I0219 15:29:09.752582 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-scripts" (OuterVolumeSpecName: "scripts") pod "5a2a904f-47f9-40da-bc5f-aba73c4c1c57" (UID: "5a2a904f-47f9-40da-bc5f-aba73c4c1c57"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:09 crc kubenswrapper[4810]: I0219 15:29:09.758662 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a2a904f-47f9-40da-bc5f-aba73c4c1c57" (UID: "5a2a904f-47f9-40da-bc5f-aba73c4c1c57"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:09 crc kubenswrapper[4810]: I0219 15:29:09.761452 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-config-data" (OuterVolumeSpecName: "config-data") pod "5a2a904f-47f9-40da-bc5f-aba73c4c1c57" (UID: "5a2a904f-47f9-40da-bc5f-aba73c4c1c57"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:09 crc kubenswrapper[4810]: I0219 15:29:09.822118 4810 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:09 crc kubenswrapper[4810]: I0219 15:29:09.822143 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:09 crc kubenswrapper[4810]: I0219 15:29:09.822154 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlnqn\" (UniqueName: \"kubernetes.io/projected/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-kube-api-access-qlnqn\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:09 crc kubenswrapper[4810]: I0219 15:29:09.822164 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:09 crc kubenswrapper[4810]: I0219 15:29:09.822171 4810 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:09 crc kubenswrapper[4810]: I0219 15:29:09.822179 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a2a904f-47f9-40da-bc5f-aba73c4c1c57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:09 crc kubenswrapper[4810]: I0219 15:29:09.915123 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-2f6g5" Feb 19 15:29:09 crc kubenswrapper[4810]: I0219 15:29:09.915339 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2f6g5" event={"ID":"5a2a904f-47f9-40da-bc5f-aba73c4c1c57","Type":"ContainerDied","Data":"e83e19f0a88b6d8caed6cea2c56d5be687bd87f6e4bb330f1df96f263a9c3db4"} Feb 19 15:29:09 crc kubenswrapper[4810]: I0219 15:29:09.915367 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e83e19f0a88b6d8caed6cea2c56d5be687bd87f6e4bb330f1df96f263a9c3db4" Feb 19 15:29:10 crc kubenswrapper[4810]: I0219 15:29:10.647221 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-2f6g5"] Feb 19 15:29:10 crc kubenswrapper[4810]: I0219 15:29:10.654972 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-2f6g5"] Feb 19 15:29:10 crc kubenswrapper[4810]: I0219 15:29:10.747422 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-thnc7"] Feb 19 15:29:10 crc kubenswrapper[4810]: E0219 15:29:10.747769 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a2a904f-47f9-40da-bc5f-aba73c4c1c57" containerName="keystone-bootstrap" Feb 19 15:29:10 crc kubenswrapper[4810]: I0219 15:29:10.747786 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a2a904f-47f9-40da-bc5f-aba73c4c1c57" containerName="keystone-bootstrap" Feb 19 15:29:10 crc kubenswrapper[4810]: I0219 15:29:10.747996 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a2a904f-47f9-40da-bc5f-aba73c4c1c57" containerName="keystone-bootstrap" Feb 19 15:29:10 crc kubenswrapper[4810]: I0219 15:29:10.748712 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-thnc7" Feb 19 15:29:10 crc kubenswrapper[4810]: I0219 15:29:10.751581 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 15:29:10 crc kubenswrapper[4810]: I0219 15:29:10.751599 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 15:29:10 crc kubenswrapper[4810]: I0219 15:29:10.751659 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 19 15:29:10 crc kubenswrapper[4810]: I0219 15:29:10.751686 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 15:29:10 crc kubenswrapper[4810]: I0219 15:29:10.751773 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-j78zz" Feb 19 15:29:10 crc kubenswrapper[4810]: I0219 15:29:10.757764 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-thnc7"] Feb 19 15:29:10 crc kubenswrapper[4810]: I0219 15:29:10.948241 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-config-data\") pod \"keystone-bootstrap-thnc7\" (UID: \"92797675-ddf7-43cf-90af-0248cf097509\") " pod="openstack/keystone-bootstrap-thnc7" Feb 19 15:29:10 crc kubenswrapper[4810]: I0219 15:29:10.948628 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-scripts\") pod \"keystone-bootstrap-thnc7\" (UID: \"92797675-ddf7-43cf-90af-0248cf097509\") " pod="openstack/keystone-bootstrap-thnc7" Feb 19 15:29:10 crc kubenswrapper[4810]: I0219 15:29:10.948721 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-fernet-keys\") pod \"keystone-bootstrap-thnc7\" (UID: \"92797675-ddf7-43cf-90af-0248cf097509\") " pod="openstack/keystone-bootstrap-thnc7" Feb 19 15:29:10 crc kubenswrapper[4810]: I0219 15:29:10.948746 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxtzf\" (UniqueName: \"kubernetes.io/projected/92797675-ddf7-43cf-90af-0248cf097509-kube-api-access-gxtzf\") pod \"keystone-bootstrap-thnc7\" (UID: \"92797675-ddf7-43cf-90af-0248cf097509\") " pod="openstack/keystone-bootstrap-thnc7" Feb 19 15:29:10 crc kubenswrapper[4810]: I0219 15:29:10.949018 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-combined-ca-bundle\") pod \"keystone-bootstrap-thnc7\" (UID: \"92797675-ddf7-43cf-90af-0248cf097509\") " pod="openstack/keystone-bootstrap-thnc7" Feb 19 15:29:10 crc kubenswrapper[4810]: I0219 15:29:10.949164 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-credential-keys\") pod \"keystone-bootstrap-thnc7\" (UID: \"92797675-ddf7-43cf-90af-0248cf097509\") " pod="openstack/keystone-bootstrap-thnc7" Feb 19 15:29:11 crc kubenswrapper[4810]: I0219 15:29:11.050549 4810 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-fernet-keys\") pod \"keystone-bootstrap-thnc7\" (UID: \"92797675-ddf7-43cf-90af-0248cf097509\") " pod="openstack/keystone-bootstrap-thnc7" Feb 19 15:29:11 crc kubenswrapper[4810]: I0219 15:29:11.050591 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxtzf\" (UniqueName: \"kubernetes.io/projected/92797675-ddf7-43cf-90af-0248cf097509-kube-api-access-gxtzf\") pod \"keystone-bootstrap-thnc7\" (UID: \"92797675-ddf7-43cf-90af-0248cf097509\") " pod="openstack/keystone-bootstrap-thnc7" Feb 19 15:29:11 crc kubenswrapper[4810]: I0219 15:29:11.050662 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-combined-ca-bundle\") pod \"keystone-bootstrap-thnc7\" (UID: \"92797675-ddf7-43cf-90af-0248cf097509\") " pod="openstack/keystone-bootstrap-thnc7" Feb 19 15:29:11 crc kubenswrapper[4810]: I0219 15:29:11.050707 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-credential-keys\") pod \"keystone-bootstrap-thnc7\" (UID: \"92797675-ddf7-43cf-90af-0248cf097509\") " pod="openstack/keystone-bootstrap-thnc7" Feb 19 15:29:11 crc kubenswrapper[4810]: I0219 15:29:11.050733 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-config-data\") pod \"keystone-bootstrap-thnc7\" (UID: \"92797675-ddf7-43cf-90af-0248cf097509\") " pod="openstack/keystone-bootstrap-thnc7" Feb 19 15:29:11 crc kubenswrapper[4810]: I0219 15:29:11.050758 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-scripts\") pod \"keystone-bootstrap-thnc7\" (UID: \"92797675-ddf7-43cf-90af-0248cf097509\") " pod="openstack/keystone-bootstrap-thnc7" Feb 19 15:29:11 crc kubenswrapper[4810]: I0219 15:29:11.058053 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-credential-keys\") pod \"keystone-bootstrap-thnc7\" (UID: \"92797675-ddf7-43cf-90af-0248cf097509\") " pod="openstack/keystone-bootstrap-thnc7" Feb 19 15:29:11 crc kubenswrapper[4810]: I0219 15:29:11.058136 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-scripts\") pod \"keystone-bootstrap-thnc7\" (UID: \"92797675-ddf7-43cf-90af-0248cf097509\") " pod="openstack/keystone-bootstrap-thnc7" Feb 19 15:29:11 crc kubenswrapper[4810]: I0219 15:29:11.059523 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-config-data\") pod \"keystone-bootstrap-thnc7\" (UID: \"92797675-ddf7-43cf-90af-0248cf097509\") " pod="openstack/keystone-bootstrap-thnc7" Feb 19 15:29:11 crc kubenswrapper[4810]: I0219 15:29:11.059980 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-combined-ca-bundle\") pod \"keystone-bootstrap-thnc7\" (UID: 
\"92797675-ddf7-43cf-90af-0248cf097509\") " pod="openstack/keystone-bootstrap-thnc7" Feb 19 15:29:11 crc kubenswrapper[4810]: I0219 15:29:11.068069 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-fernet-keys\") pod \"keystone-bootstrap-thnc7\" (UID: \"92797675-ddf7-43cf-90af-0248cf097509\") " pod="openstack/keystone-bootstrap-thnc7" Feb 19 15:29:11 crc kubenswrapper[4810]: I0219 15:29:11.070758 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxtzf\" (UniqueName: \"kubernetes.io/projected/92797675-ddf7-43cf-90af-0248cf097509-kube-api-access-gxtzf\") pod \"keystone-bootstrap-thnc7\" (UID: \"92797675-ddf7-43cf-90af-0248cf097509\") " pod="openstack/keystone-bootstrap-thnc7" Feb 19 15:29:11 crc kubenswrapper[4810]: I0219 15:29:11.096909 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-thnc7" Feb 19 15:29:11 crc kubenswrapper[4810]: I0219 15:29:11.456517 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a2a904f-47f9-40da-bc5f-aba73c4c1c57" path="/var/lib/kubelet/pods/5a2a904f-47f9-40da-bc5f-aba73c4c1c57/volumes" Feb 19 15:29:18 crc kubenswrapper[4810]: E0219 15:29:18.933578 4810 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.159:5001/podified-master-centos10/openstack-barbican-api:watcher_latest" Feb 19 15:29:18 crc kubenswrapper[4810]: E0219 15:29:18.934144 4810 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.159:5001/podified-master-centos10/openstack-barbican-api:watcher_latest" Feb 19 15:29:18 crc kubenswrapper[4810]: E0219 15:29:18.934248 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:38.102.83.159:5001/podified-master-centos10/openstack-barbican-api:watcher_latest,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-btmxg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-hmc6k_openstack(2024a783-c3f9-4e57-b00f-52bec164e64e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 15:29:18 crc kubenswrapper[4810]: E0219 15:29:18.935397 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-hmc6k" podUID="2024a783-c3f9-4e57-b00f-52bec164e64e" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.010790 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc","Type":"ContainerDied","Data":"1347fa88cd5df9f6d1e59a77fbeedca11e777e87af7e22e44f3db90ca1dd624c"} Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.010849 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1347fa88cd5df9f6d1e59a77fbeedca11e777e87af7e22e44f3db90ca1dd624c" Feb 19 15:29:19 crc kubenswrapper[4810]: E0219 15:29:19.018590 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.159:5001/podified-master-centos10/openstack-barbican-api:watcher_latest\\\"\"" pod="openstack/barbican-db-sync-hmc6k" podUID="2024a783-c3f9-4e57-b00f-52bec164e64e" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.076587 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.086523 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5fbd85f69f-5jhnw" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.125686 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.152:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.217856 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-combined-ca-bundle\") pod \"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc\" (UID: \"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc\") " Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.217922 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-config-data\") pod \"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc\" (UID: \"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc\") " Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.217973 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/978c1383-cc82-4788-beea-b1e15b25eb1f-scripts\") pod \"978c1383-cc82-4788-beea-b1e15b25eb1f\" (UID: \"978c1383-cc82-4788-beea-b1e15b25eb1f\") " Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.218001 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-custom-prometheus-ca\") pod \"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc\" (UID: \"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc\") " Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.218031 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/978c1383-cc82-4788-beea-b1e15b25eb1f-logs\") pod \"978c1383-cc82-4788-beea-b1e15b25eb1f\" (UID: \"978c1383-cc82-4788-beea-b1e15b25eb1f\") " Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.218057 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/978c1383-cc82-4788-beea-b1e15b25eb1f-horizon-secret-key\") pod \"978c1383-cc82-4788-beea-b1e15b25eb1f\" (UID: \"978c1383-cc82-4788-beea-b1e15b25eb1f\") " Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.219200 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfnzr\" (UniqueName: \"kubernetes.io/projected/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-kube-api-access-mfnzr\") pod \"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc\" (UID: \"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc\") " Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.218527 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/978c1383-cc82-4788-beea-b1e15b25eb1f-logs" (OuterVolumeSpecName: "logs") pod "978c1383-cc82-4788-beea-b1e15b25eb1f" (UID: "978c1383-cc82-4788-beea-b1e15b25eb1f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.219142 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/978c1383-cc82-4788-beea-b1e15b25eb1f-scripts" (OuterVolumeSpecName: "scripts") pod "978c1383-cc82-4788-beea-b1e15b25eb1f" (UID: "978c1383-cc82-4788-beea-b1e15b25eb1f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.219220 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/978c1383-cc82-4788-beea-b1e15b25eb1f-config-data\") pod \"978c1383-cc82-4788-beea-b1e15b25eb1f\" (UID: \"978c1383-cc82-4788-beea-b1e15b25eb1f\") " Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.219613 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4ql7\" (UniqueName: \"kubernetes.io/projected/978c1383-cc82-4788-beea-b1e15b25eb1f-kube-api-access-h4ql7\") pod \"978c1383-cc82-4788-beea-b1e15b25eb1f\" (UID: \"978c1383-cc82-4788-beea-b1e15b25eb1f\") " Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.219651 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-logs\") pod \"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc\" (UID: \"2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc\") " Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.220434 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-logs" (OuterVolumeSpecName: "logs") pod "2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc" (UID: "2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.220690 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-logs\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.220703 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/978c1383-cc82-4788-beea-b1e15b25eb1f-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.220712 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/978c1383-cc82-4788-beea-b1e15b25eb1f-logs\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.221132 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/978c1383-cc82-4788-beea-b1e15b25eb1f-config-data" (OuterVolumeSpecName: "config-data") pod "978c1383-cc82-4788-beea-b1e15b25eb1f" (UID: "978c1383-cc82-4788-beea-b1e15b25eb1f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.221359 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/978c1383-cc82-4788-beea-b1e15b25eb1f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "978c1383-cc82-4788-beea-b1e15b25eb1f" (UID: "978c1383-cc82-4788-beea-b1e15b25eb1f"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.222532 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-kube-api-access-mfnzr" (OuterVolumeSpecName: "kube-api-access-mfnzr") pod "2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc" (UID: "2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc"). InnerVolumeSpecName "kube-api-access-mfnzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.225484 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/978c1383-cc82-4788-beea-b1e15b25eb1f-kube-api-access-h4ql7" (OuterVolumeSpecName: "kube-api-access-h4ql7") pod "978c1383-cc82-4788-beea-b1e15b25eb1f" (UID: "978c1383-cc82-4788-beea-b1e15b25eb1f"). InnerVolumeSpecName "kube-api-access-h4ql7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.243712 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc" (UID: "2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.248164 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc" (UID: "2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.266873 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-config-data" (OuterVolumeSpecName: "config-data") pod "2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc" (UID: "2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.325437 4810 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/978c1383-cc82-4788-beea-b1e15b25eb1f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.325466 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/978c1383-cc82-4788-beea-b1e15b25eb1f-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.325476 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfnzr\" (UniqueName: \"kubernetes.io/projected/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-kube-api-access-mfnzr\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.325486 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4ql7\" (UniqueName: \"kubernetes.io/projected/978c1383-cc82-4788-beea-b1e15b25eb1f-kube-api-access-h4ql7\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.325494 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.325502 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.325509 4810 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.538176 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.538239 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.538278 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.539029 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"37fe95e370faa9fca4a69499713730a8ba7e7939f57cd237ea9a505f9b09a6bf"} pod="openshift-machine-config-operator/machine-config-daemon-t499d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 15:29:19 crc kubenswrapper[4810]: I0219 15:29:19.539083 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t499d" 
podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" containerID="cri-o://37fe95e370faa9fca4a69499713730a8ba7e7939f57cd237ea9a505f9b09a6bf" gracePeriod=600 Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.033372 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fbd85f69f-5jhnw" event={"ID":"978c1383-cc82-4788-beea-b1e15b25eb1f","Type":"ContainerDied","Data":"a02086ae9fd1e62e23bb222b4fb6d0999b14d605294e6aa8f70c1665e61a1617"} Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.033460 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5fbd85f69f-5jhnw" Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.037452 4810 generic.go:334] "Generic (PLEG): container finished" podID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerID="37fe95e370faa9fca4a69499713730a8ba7e7939f57cd237ea9a505f9b09a6bf" exitCode=0 Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.037521 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerDied","Data":"37fe95e370faa9fca4a69499713730a8ba7e7939f57cd237ea9a505f9b09a6bf"} Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.037551 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.037611 4810 scope.go:117] "RemoveContainer" containerID="6c88e0127771a4aa28c6261d9a83da29a3f930023146271a9d942e738f8152ff" Feb 19 15:29:20 crc kubenswrapper[4810]: E0219 15:29:20.058903 4810 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.159:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Feb 19 15:29:20 crc kubenswrapper[4810]: E0219 15:29:20.058951 4810 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.159:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Feb 19 15:29:20 crc kubenswrapper[4810]: E0219 15:29:20.059070 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:38.102.83.159:5001/podified-master-centos10/openstack-cinder-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cqnf7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-svmgl_openstack(848dfe9d-05f4-4ba9-919e-23e9a7ae63d5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 15:29:20 crc kubenswrapper[4810]: E0219 15:29:20.062291 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-svmgl" podUID="848dfe9d-05f4-4ba9-919e-23e9a7ae63d5" Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.072233 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.082199 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.129995 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Feb 19 15:29:20 crc kubenswrapper[4810]: E0219 15:29:20.130414 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc" containerName="watcher-api-log" Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.130429 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc" 
containerName="watcher-api-log" Feb 19 15:29:20 crc kubenswrapper[4810]: E0219 15:29:20.130453 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc" containerName="watcher-api" Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.130459 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc" containerName="watcher-api" Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.130636 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc" containerName="watcher-api-log" Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.130657 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc" containerName="watcher-api" Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.131670 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.134606 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.137859 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5fbd85f69f-5jhnw"] Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.151248 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5fbd85f69f-5jhnw"] Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.161779 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.239048 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd961c7d-d551-4f5b-a08a-07d088947698-logs\") pod \"watcher-api-0\" (UID: \"cd961c7d-d551-4f5b-a08a-07d088947698\") " pod="openstack/watcher-api-0" Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.239104 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdn2x\" (UniqueName: \"kubernetes.io/projected/cd961c7d-d551-4f5b-a08a-07d088947698-kube-api-access-zdn2x\") pod \"watcher-api-0\" (UID: \"cd961c7d-d551-4f5b-a08a-07d088947698\") " pod="openstack/watcher-api-0" Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.239304 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd961c7d-d551-4f5b-a08a-07d088947698-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"cd961c7d-d551-4f5b-a08a-07d088947698\") " pod="openstack/watcher-api-0" Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.239425 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cd961c7d-d551-4f5b-a08a-07d088947698-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"cd961c7d-d551-4f5b-a08a-07d088947698\") " pod="openstack/watcher-api-0" Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.239573 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd961c7d-d551-4f5b-a08a-07d088947698-config-data\") pod \"watcher-api-0\" (UID: \"cd961c7d-d551-4f5b-a08a-07d088947698\") " pod="openstack/watcher-api-0" Feb 19 15:29:20 crc 
kubenswrapper[4810]: I0219 15:29:20.341146 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd961c7d-d551-4f5b-a08a-07d088947698-config-data\") pod \"watcher-api-0\" (UID: \"cd961c7d-d551-4f5b-a08a-07d088947698\") " pod="openstack/watcher-api-0" Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.341274 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd961c7d-d551-4f5b-a08a-07d088947698-logs\") pod \"watcher-api-0\" (UID: \"cd961c7d-d551-4f5b-a08a-07d088947698\") " pod="openstack/watcher-api-0" Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.341349 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdn2x\" (UniqueName: \"kubernetes.io/projected/cd961c7d-d551-4f5b-a08a-07d088947698-kube-api-access-zdn2x\") pod \"watcher-api-0\" (UID: \"cd961c7d-d551-4f5b-a08a-07d088947698\") " pod="openstack/watcher-api-0" Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.341418 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd961c7d-d551-4f5b-a08a-07d088947698-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"cd961c7d-d551-4f5b-a08a-07d088947698\") " pod="openstack/watcher-api-0" Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.341450 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cd961c7d-d551-4f5b-a08a-07d088947698-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"cd961c7d-d551-4f5b-a08a-07d088947698\") " pod="openstack/watcher-api-0" Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.342301 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd961c7d-d551-4f5b-a08a-07d088947698-logs\") pod \"watcher-api-0\" (UID: \"cd961c7d-d551-4f5b-a08a-07d088947698\") " pod="openstack/watcher-api-0" Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.348629 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cd961c7d-d551-4f5b-a08a-07d088947698-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"cd961c7d-d551-4f5b-a08a-07d088947698\") " pod="openstack/watcher-api-0" Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.348712 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd961c7d-d551-4f5b-a08a-07d088947698-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"cd961c7d-d551-4f5b-a08a-07d088947698\") " pod="openstack/watcher-api-0" Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.350498 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd961c7d-d551-4f5b-a08a-07d088947698-config-data\") pod \"watcher-api-0\" (UID: \"cd961c7d-d551-4f5b-a08a-07d088947698\") " pod="openstack/watcher-api-0" Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.360617 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdn2x\" (UniqueName: \"kubernetes.io/projected/cd961c7d-d551-4f5b-a08a-07d088947698-kube-api-access-zdn2x\") pod \"watcher-api-0\" (UID: \"cd961c7d-d551-4f5b-a08a-07d088947698\") " pod="openstack/watcher-api-0" Feb 19 15:29:20 crc 
kubenswrapper[4810]: I0219 15:29:20.449083 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 19 15:29:20 crc kubenswrapper[4810]: I0219 15:29:20.926174 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5f56498b8d-9gwmf"] Feb 19 15:29:20 crc kubenswrapper[4810]: W0219 15:29:20.957679 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod737d6629_747f_4d16_a545_d0070c20fe5d.slice/crio-da90b84310b3d5abeea30fa57f1618f55f0eafa5818785fd1f92d533a8fff76b WatchSource:0}: Error finding container da90b84310b3d5abeea30fa57f1618f55f0eafa5818785fd1f92d533a8fff76b: Status 404 returned error can't find the container with id da90b84310b3d5abeea30fa57f1618f55f0eafa5818785fd1f92d533a8fff76b Feb 19 15:29:21 crc kubenswrapper[4810]: I0219 15:29:21.055864 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" event={"ID":"18ca6546-69fd-492d-81c5-bb18c56b045d","Type":"ContainerStarted","Data":"acf279e70fd332fdcdd2bf83f0303bb99e19cd10482ff8ab44a134fa747add8b"} Feb 19 15:29:21 crc kubenswrapper[4810]: I0219 15:29:21.062814 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" Feb 19 15:29:21 crc kubenswrapper[4810]: I0219 15:29:21.063899 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f56498b8d-9gwmf" event={"ID":"737d6629-747f-4d16-a545-d0070c20fe5d","Type":"ContainerStarted","Data":"da90b84310b3d5abeea30fa57f1618f55f0eafa5818785fd1f92d533a8fff76b"} Feb 19 15:29:21 crc kubenswrapper[4810]: E0219 15:29:21.065311 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.159:5001/podified-master-centos10/openstack-cinder-api:watcher_latest\\\"\"" pod="openstack/cinder-db-sync-svmgl" podUID="848dfe9d-05f4-4ba9-919e-23e9a7ae63d5" Feb 19 15:29:21 crc kubenswrapper[4810]: I0219 15:29:21.106939 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" podStartSLOduration=33.10691847 podStartE2EDuration="33.10691847s" podCreationTimestamp="2026-02-19 15:28:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:29:21.077708313 +0000 UTC m=+1190.559738437" watchObservedRunningTime="2026-02-19 15:29:21.10691847 +0000 UTC m=+1190.588948594" Feb 19 15:29:21 crc kubenswrapper[4810]: I0219 15:29:21.126666 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-869f57798-ngdtl"] Feb 19 15:29:21 crc kubenswrapper[4810]: I0219 15:29:21.292676 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-thnc7"] Feb 19 15:29:21 crc kubenswrapper[4810]: I0219 15:29:21.401723 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 19 15:29:21 crc kubenswrapper[4810]: I0219 15:29:21.456151 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc" path="/var/lib/kubelet/pods/2760e5f1-9d54-4ae0-a36e-344cd4f2e2cc/volumes" Feb 19 15:29:21 crc kubenswrapper[4810]: I0219 15:29:21.457437 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="978c1383-cc82-4788-beea-b1e15b25eb1f" 
path="/var/lib/kubelet/pods/978c1383-cc82-4788-beea-b1e15b25eb1f/volumes" Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.077884 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-869f57798-ngdtl" event={"ID":"58c845f2-0069-4ee5-9d4b-b5871e078926","Type":"ContainerStarted","Data":"5ab1bae28a55f588686fefd9b6e6ee98c22d6657796a662570ae5cd62319bd13"} Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.078299 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-869f57798-ngdtl" event={"ID":"58c845f2-0069-4ee5-9d4b-b5871e078926","Type":"ContainerStarted","Data":"b7473c6c07a1c77d67ffe62af3e5c262ab61dca816caf8aab0acb14dc5b23ebd"} Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.078309 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-869f57798-ngdtl" event={"ID":"58c845f2-0069-4ee5-9d4b-b5871e078926","Type":"ContainerStarted","Data":"dd8cab14dea5221ed6bc57de9b6e6053cd08e7d2f18677d44feb73bc0f3396df"} Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.083534 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"4be760b2-263c-4b89-8bdf-ecf98114a24f","Type":"ContainerStarted","Data":"8c99f71d93423dbb260d03502181a976def39a613d97465a06298513d67bb0bf"} Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.091127 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"cd961c7d-d551-4f5b-a08a-07d088947698","Type":"ContainerStarted","Data":"aa9a5e6b8de561c312023bc0224fd25e600c4ac446d6ec5ee19e031c464523e1"} Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.091171 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"cd961c7d-d551-4f5b-a08a-07d088947698","Type":"ContainerStarted","Data":"4368969b33587354836e3cdb6c31f7bd645f510dc4e3c262614ef8da31c0eb92"} Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.091181 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"cd961c7d-d551-4f5b-a08a-07d088947698","Type":"ContainerStarted","Data":"3ca7fe1f4f8bad9a3a06d89d4141ff28e32b10f6d3445d4b3f2404b6e71c942f"} Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.092189 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.093977 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="cd961c7d-d551-4f5b-a08a-07d088947698" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.168:9322/\": dial tcp 10.217.0.168:9322: connect: connection refused" Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.098378 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7jdcp" event={"ID":"36fe6fdb-2970-4773-8184-a2d16b8ca89a","Type":"ContainerStarted","Data":"829a51aca23df8d8763078bcae4b4cba43b6c265996ab11fc55f6d42ce950516"} Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.100568 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-869f57798-ngdtl" podStartSLOduration=25.100550056 podStartE2EDuration="25.100550056s" podCreationTimestamp="2026-02-19 15:28:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:29:22.095615815 +0000 UTC m=+1191.577645929" watchObservedRunningTime="2026-02-19 
15:29:22.100550056 +0000 UTC m=+1191.582580170" Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.105553 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f56498b8d-9gwmf" event={"ID":"737d6629-747f-4d16-a545-d0070c20fe5d","Type":"ContainerStarted","Data":"93ddf2af467af361323be362f93ceed997e447e87fc63962e86036a1907feb9e"} Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.105605 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f56498b8d-9gwmf" event={"ID":"737d6629-747f-4d16-a545-d0070c20fe5d","Type":"ContainerStarted","Data":"f456c8628f413cbddb39a4ea21eda28481bc8ce0a9fad14f3d8e9c10f3206ddc"} Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.128566 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9c2f952c-2122-43d9-b006-6967fd2b9029","Type":"ContainerStarted","Data":"aacb41ee75296b30fe484f99ea7be7d89eedb8e0ad8d20c614e61233ead7a68d"} Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.128792 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9c2f952c-2122-43d9-b006-6967fd2b9029" containerName="glance-log" containerID="cri-o://ba83a3830aa23a7f50a84bf64a23008ce3bd4119d839094ddf609dc429db2d93" gracePeriod=30 Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.129081 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9c2f952c-2122-43d9-b006-6967fd2b9029" containerName="glance-httpd" containerID="cri-o://aacb41ee75296b30fe484f99ea7be7d89eedb8e0ad8d20c614e61233ead7a68d" gracePeriod=30 Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.150860 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5","Type":"ContainerStarted","Data":"f9469eea900aae9e94121b7851a3eacdafc431e0f15d0e8319f5a93a2616b851"} Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.153916 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=5.633308836 podStartE2EDuration="34.153897195s" podCreationTimestamp="2026-02-19 15:28:48 +0000 UTC" firstStartedPulling="2026-02-19 15:28:50.395331821 +0000 UTC m=+1159.877361945" lastFinishedPulling="2026-02-19 15:29:18.91592018 +0000 UTC m=+1188.397950304" observedRunningTime="2026-02-19 15:29:22.113584746 +0000 UTC m=+1191.595614870" watchObservedRunningTime="2026-02-19 15:29:22.153897195 +0000 UTC m=+1191.635927309" Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.158021 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8cd44d0-7395-44e1-9112-9e8bb4198b93","Type":"ContainerStarted","Data":"48bd8312dc5f2e91c1a4d6b015bb83960b232d7ff3a764add13cbac66bd0441f"} Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.182348 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=2.182315382 podStartE2EDuration="2.182315382s" podCreationTimestamp="2026-02-19 15:29:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:29:22.145399257 +0000 UTC m=+1191.627429381" watchObservedRunningTime="2026-02-19 15:29:22.182315382 +0000 UTC m=+1191.664345506" Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.187230 4810 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-7jdcp" podStartSLOduration=6.210674157 podStartE2EDuration="34.187220132s" podCreationTimestamp="2026-02-19 15:28:48 +0000 UTC" firstStartedPulling="2026-02-19 15:28:50.989576617 +0000 UTC m=+1160.471606741" lastFinishedPulling="2026-02-19 15:29:18.966122592 +0000 UTC m=+1188.448152716" observedRunningTime="2026-02-19 15:29:22.165473159 +0000 UTC m=+1191.647503283" watchObservedRunningTime="2026-02-19 15:29:22.187220132 +0000 UTC m=+1191.669250256" Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.190640 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7a0789f1-099a-4f95-9626-a5ad7da804bc" containerName="glance-log" containerID="cri-o://039aba0ed63f1dd6f66e34144e1de83684475b75135af6fd57bb70ce97ffc6b0" gracePeriod=30 Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.190667 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7a0789f1-099a-4f95-9626-a5ad7da804bc","Type":"ContainerStarted","Data":"1bdbd9fc9e86820ed204a84ba415c42f653c518a62ccc95617b0bc659f34c23c"} Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.190753 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7a0789f1-099a-4f95-9626-a5ad7da804bc" containerName="glance-httpd" containerID="cri-o://1bdbd9fc9e86820ed204a84ba415c42f653c518a62ccc95617b0bc659f34c23c" gracePeriod=30 Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.217230 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-668f7d7fb5-l5kpq" event={"ID":"08eca88c-a4b4-461b-8568-ebbf54645272","Type":"ContainerStarted","Data":"11bb73290ec186744ef4e88375d87c281860032daa137cc8bd4779ab70117f2b"} Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.217269 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-668f7d7fb5-l5kpq" event={"ID":"08eca88c-a4b4-461b-8568-ebbf54645272","Type":"ContainerStarted","Data":"cc512cef3a5d05465ad16d8e6be38874fe910640b0911d914b7146b795d387d4"} Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.217399 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-668f7d7fb5-l5kpq" podUID="08eca88c-a4b4-461b-8568-ebbf54645272" containerName="horizon-log" containerID="cri-o://cc512cef3a5d05465ad16d8e6be38874fe910640b0911d914b7146b795d387d4" gracePeriod=30 Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.217703 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-668f7d7fb5-l5kpq" podUID="08eca88c-a4b4-461b-8568-ebbf54645272" containerName="horizon" containerID="cri-o://11bb73290ec186744ef4e88375d87c281860032daa137cc8bd4779ab70117f2b" gracePeriod=30 Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.233116 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=33.233091678 podStartE2EDuration="33.233091678s" podCreationTimestamp="2026-02-19 15:28:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:29:22.229670084 +0000 UTC m=+1191.711700228" watchObservedRunningTime="2026-02-19 15:29:22.233091678 +0000 UTC m=+1191.715121802" Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.241950 
4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5f56498b8d-9gwmf" podStartSLOduration=25.241923154 podStartE2EDuration="25.241923154s" podCreationTimestamp="2026-02-19 15:28:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:29:22.202668981 +0000 UTC m=+1191.684699105" watchObservedRunningTime="2026-02-19 15:29:22.241923154 +0000 UTC m=+1191.723953298" Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.269898 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerStarted","Data":"2979159f3325af188cf73d374cfc4f7b1a64cb0be10361454a84d92914ce8075"} Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.290725 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-thnc7" event={"ID":"92797675-ddf7-43cf-90af-0248cf097509","Type":"ContainerStarted","Data":"aa561f23770b052d6b320e47499c0a8789e25a7a2367b69634f88f903c8d780a"} Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.290775 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-thnc7" event={"ID":"92797675-ddf7-43cf-90af-0248cf097509","Type":"ContainerStarted","Data":"29912d3ef9b6f31b1e73feb89f2d45be94dbb1cafdcd248d477030898edd801f"} Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.292914 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=34.292894095 podStartE2EDuration="34.292894095s" podCreationTimestamp="2026-02-19 15:28:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:29:22.256170814 +0000 UTC m=+1191.738200938" watchObservedRunningTime="2026-02-19 15:29:22.292894095 +0000 UTC m=+1191.774924219" Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.307383 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-849c785789-5xrh2" podUID="c39b1dd9-9e73-4cca-aea6-e228f1ba5942" containerName="horizon-log" containerID="cri-o://09ef2e9038e53c0d3694b2f3cb1b25543038065797746360e5b00060a157df5b" gracePeriod=30 Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.307567 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-849c785789-5xrh2" event={"ID":"c39b1dd9-9e73-4cca-aea6-e228f1ba5942","Type":"ContainerStarted","Data":"267e2a2aa6b67d8303b6404df33bfa4941b4f403604fb949cf5ec932e82ab1b7"} Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.307588 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-849c785789-5xrh2" event={"ID":"c39b1dd9-9e73-4cca-aea6-e228f1ba5942","Type":"ContainerStarted","Data":"09ef2e9038e53c0d3694b2f3cb1b25543038065797746360e5b00060a157df5b"} Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.307620 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-849c785789-5xrh2" podUID="c39b1dd9-9e73-4cca-aea6-e228f1ba5942" containerName="horizon" containerID="cri-o://267e2a2aa6b67d8303b6404df33bfa4941b4f403604fb949cf5ec932e82ab1b7" gracePeriod=30 Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.323344 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-668f7d7fb5-l5kpq" 
podStartSLOduration=5.524350402 podStartE2EDuration="31.323307491s" podCreationTimestamp="2026-02-19 15:28:51 +0000 UTC" firstStartedPulling="2026-02-19 15:28:54.625524709 +0000 UTC m=+1164.107554823" lastFinishedPulling="2026-02-19 15:29:20.424481788 +0000 UTC m=+1189.906511912" observedRunningTime="2026-02-19 15:29:22.277993399 +0000 UTC m=+1191.760023523" watchObservedRunningTime="2026-02-19 15:29:22.323307491 +0000 UTC m=+1191.805337615" Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.370347 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=6.420035095 podStartE2EDuration="34.370307624s" podCreationTimestamp="2026-02-19 15:28:48 +0000 UTC" firstStartedPulling="2026-02-19 15:28:50.966562404 +0000 UTC m=+1160.448592528" lastFinishedPulling="2026-02-19 15:29:18.916834933 +0000 UTC m=+1188.398865057" observedRunningTime="2026-02-19 15:29:22.292373682 +0000 UTC m=+1191.774403806" watchObservedRunningTime="2026-02-19 15:29:22.370307624 +0000 UTC m=+1191.852337748" Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.382472 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-thnc7" podStartSLOduration=12.382459112 podStartE2EDuration="12.382459112s" podCreationTimestamp="2026-02-19 15:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:29:22.306019287 +0000 UTC m=+1191.788049411" watchObservedRunningTime="2026-02-19 15:29:22.382459112 +0000 UTC m=+1191.864489236" Feb 19 15:29:22 crc kubenswrapper[4810]: I0219 15:29:22.402393 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-849c785789-5xrh2" podStartSLOduration=6.36844711 podStartE2EDuration="34.402377301s" podCreationTimestamp="2026-02-19 15:28:48 +0000 UTC" firstStartedPulling="2026-02-19 15:28:50.951545526 +0000 UTC m=+1160.433575650" lastFinishedPulling="2026-02-19 15:29:18.985475717 +0000 UTC m=+1188.467505841" observedRunningTime="2026-02-19 15:29:22.351003171 +0000 UTC m=+1191.833033295" watchObservedRunningTime="2026-02-19 15:29:22.402377301 +0000 UTC m=+1191.884407425" Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.268718 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.361055 4810 generic.go:334] "Generic (PLEG): container finished" podID="9c2f952c-2122-43d9-b006-6967fd2b9029" containerID="aacb41ee75296b30fe484f99ea7be7d89eedb8e0ad8d20c614e61233ead7a68d" exitCode=0
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.361084 4810 generic.go:334] "Generic (PLEG): container finished" podID="9c2f952c-2122-43d9-b006-6967fd2b9029" containerID="ba83a3830aa23a7f50a84bf64a23008ce3bd4119d839094ddf609dc429db2d93" exitCode=143
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.361150 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9c2f952c-2122-43d9-b006-6967fd2b9029","Type":"ContainerDied","Data":"aacb41ee75296b30fe484f99ea7be7d89eedb8e0ad8d20c614e61233ead7a68d"}
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.361177 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9c2f952c-2122-43d9-b006-6967fd2b9029","Type":"ContainerDied","Data":"ba83a3830aa23a7f50a84bf64a23008ce3bd4119d839094ddf609dc429db2d93"}
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.361186 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9c2f952c-2122-43d9-b006-6967fd2b9029","Type":"ContainerDied","Data":"84ac3613a017c9c6f49ab1cf6c99c11f9ee32460de5f526447f4c9de546b4b3b"}
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.361201 4810 scope.go:117] "RemoveContainer" containerID="aacb41ee75296b30fe484f99ea7be7d89eedb8e0ad8d20c614e61233ead7a68d"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.361414 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.368840 4810 generic.go:334] "Generic (PLEG): container finished" podID="7a0789f1-099a-4f95-9626-a5ad7da804bc" containerID="1bdbd9fc9e86820ed204a84ba415c42f653c518a62ccc95617b0bc659f34c23c" exitCode=0
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.369094 4810 generic.go:334] "Generic (PLEG): container finished" podID="7a0789f1-099a-4f95-9626-a5ad7da804bc" containerID="039aba0ed63f1dd6f66e34144e1de83684475b75135af6fd57bb70ce97ffc6b0" exitCode=143
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.369404 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7a0789f1-099a-4f95-9626-a5ad7da804bc","Type":"ContainerDied","Data":"1bdbd9fc9e86820ed204a84ba415c42f653c518a62ccc95617b0bc659f34c23c"}
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.369450 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7a0789f1-099a-4f95-9626-a5ad7da804bc","Type":"ContainerDied","Data":"039aba0ed63f1dd6f66e34144e1de83684475b75135af6fd57bb70ce97ffc6b0"}
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.416826 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c2f952c-2122-43d9-b006-6967fd2b9029-scripts\") pod \"9c2f952c-2122-43d9-b006-6967fd2b9029\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") "
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.417058 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9c2f952c-2122-43d9-b006-6967fd2b9029-httpd-run\") pod \"9c2f952c-2122-43d9-b006-6967fd2b9029\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") "
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.417141 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c2f952c-2122-43d9-b006-6967fd2b9029-logs\") pod \"9c2f952c-2122-43d9-b006-6967fd2b9029\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") "
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.417253 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2b86\" (UniqueName: \"kubernetes.io/projected/9c2f952c-2122-43d9-b006-6967fd2b9029-kube-api-access-n2b86\") pod \"9c2f952c-2122-43d9-b006-6967fd2b9029\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") "
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.417354 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c2f952c-2122-43d9-b006-6967fd2b9029-combined-ca-bundle\") pod \"9c2f952c-2122-43d9-b006-6967fd2b9029\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") "
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.417437 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c2f952c-2122-43d9-b006-6967fd2b9029-internal-tls-certs\") pod \"9c2f952c-2122-43d9-b006-6967fd2b9029\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") "
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.417613 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c2f952c-2122-43d9-b006-6967fd2b9029-config-data\") pod \"9c2f952c-2122-43d9-b006-6967fd2b9029\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") "
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.417691 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"9c2f952c-2122-43d9-b006-6967fd2b9029\" (UID: \"9c2f952c-2122-43d9-b006-6967fd2b9029\") "
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.419693 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c2f952c-2122-43d9-b006-6967fd2b9029-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9c2f952c-2122-43d9-b006-6967fd2b9029" (UID: "9c2f952c-2122-43d9-b006-6967fd2b9029"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.419775 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c2f952c-2122-43d9-b006-6967fd2b9029-logs" (OuterVolumeSpecName: "logs") pod "9c2f952c-2122-43d9-b006-6967fd2b9029" (UID: "9c2f952c-2122-43d9-b006-6967fd2b9029"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.422827 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "9c2f952c-2122-43d9-b006-6967fd2b9029" (UID: "9c2f952c-2122-43d9-b006-6967fd2b9029"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.431592 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c2f952c-2122-43d9-b006-6967fd2b9029-scripts" (OuterVolumeSpecName: "scripts") pod "9c2f952c-2122-43d9-b006-6967fd2b9029" (UID: "9c2f952c-2122-43d9-b006-6967fd2b9029"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.447386 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c2f952c-2122-43d9-b006-6967fd2b9029-kube-api-access-n2b86" (OuterVolumeSpecName: "kube-api-access-n2b86") pod "9c2f952c-2122-43d9-b006-6967fd2b9029" (UID: "9c2f952c-2122-43d9-b006-6967fd2b9029"). InnerVolumeSpecName "kube-api-access-n2b86". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.485769 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c2f952c-2122-43d9-b006-6967fd2b9029-config-data" (OuterVolumeSpecName: "config-data") pod "9c2f952c-2122-43d9-b006-6967fd2b9029" (UID: "9c2f952c-2122-43d9-b006-6967fd2b9029"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.495485 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c2f952c-2122-43d9-b006-6967fd2b9029-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9c2f952c-2122-43d9-b006-6967fd2b9029" (UID: "9c2f952c-2122-43d9-b006-6967fd2b9029"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.506315 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c2f952c-2122-43d9-b006-6967fd2b9029-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c2f952c-2122-43d9-b006-6967fd2b9029" (UID: "9c2f952c-2122-43d9-b006-6967fd2b9029"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.520384 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2b86\" (UniqueName: \"kubernetes.io/projected/9c2f952c-2122-43d9-b006-6967fd2b9029-kube-api-access-n2b86\") on node \"crc\" DevicePath \"\""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.522993 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c2f952c-2122-43d9-b006-6967fd2b9029-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.523088 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c2f952c-2122-43d9-b006-6967fd2b9029-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.523158 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c2f952c-2122-43d9-b006-6967fd2b9029-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.523236 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.523298 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c2f952c-2122-43d9-b006-6967fd2b9029-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.523381 4810 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9c2f952c-2122-43d9-b006-6967fd2b9029-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.523467 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c2f952c-2122-43d9-b006-6967fd2b9029-logs\") on node \"crc\" DevicePath \"\""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.546631 4810 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.575710 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.634566 4810 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.639727 4810 scope.go:117] "RemoveContainer" containerID="ba83a3830aa23a7f50a84bf64a23008ce3bd4119d839094ddf609dc429db2d93"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.702459 4810 scope.go:117] "RemoveContainer" containerID="aacb41ee75296b30fe484f99ea7be7d89eedb8e0ad8d20c614e61233ead7a68d"
Feb 19 15:29:23 crc kubenswrapper[4810]: E0219 15:29:23.707095 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aacb41ee75296b30fe484f99ea7be7d89eedb8e0ad8d20c614e61233ead7a68d\": container with ID starting with aacb41ee75296b30fe484f99ea7be7d89eedb8e0ad8d20c614e61233ead7a68d not found: ID does not exist" containerID="aacb41ee75296b30fe484f99ea7be7d89eedb8e0ad8d20c614e61233ead7a68d"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.707296 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aacb41ee75296b30fe484f99ea7be7d89eedb8e0ad8d20c614e61233ead7a68d"} err="failed to get container status \"aacb41ee75296b30fe484f99ea7be7d89eedb8e0ad8d20c614e61233ead7a68d\": rpc error: code = NotFound desc = could not find container \"aacb41ee75296b30fe484f99ea7be7d89eedb8e0ad8d20c614e61233ead7a68d\": container with ID starting with aacb41ee75296b30fe484f99ea7be7d89eedb8e0ad8d20c614e61233ead7a68d not found: ID does not exist"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.707395 4810 scope.go:117] "RemoveContainer" containerID="ba83a3830aa23a7f50a84bf64a23008ce3bd4119d839094ddf609dc429db2d93"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.710433 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 15:29:23 crc kubenswrapper[4810]: E0219 15:29:23.711280 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba83a3830aa23a7f50a84bf64a23008ce3bd4119d839094ddf609dc429db2d93\": container with ID starting with ba83a3830aa23a7f50a84bf64a23008ce3bd4119d839094ddf609dc429db2d93 not found: ID does not exist" containerID="ba83a3830aa23a7f50a84bf64a23008ce3bd4119d839094ddf609dc429db2d93"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.711420 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba83a3830aa23a7f50a84bf64a23008ce3bd4119d839094ddf609dc429db2d93"} err="failed to get container status \"ba83a3830aa23a7f50a84bf64a23008ce3bd4119d839094ddf609dc429db2d93\": rpc error: code = NotFound desc = could not find container \"ba83a3830aa23a7f50a84bf64a23008ce3bd4119d839094ddf609dc429db2d93\": container with ID starting with ba83a3830aa23a7f50a84bf64a23008ce3bd4119d839094ddf609dc429db2d93 not found: ID does not exist"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.711508 4810 scope.go:117] "RemoveContainer" containerID="aacb41ee75296b30fe484f99ea7be7d89eedb8e0ad8d20c614e61233ead7a68d"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.713973 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aacb41ee75296b30fe484f99ea7be7d89eedb8e0ad8d20c614e61233ead7a68d"} err="failed to get container status \"aacb41ee75296b30fe484f99ea7be7d89eedb8e0ad8d20c614e61233ead7a68d\": rpc error: code = NotFound desc = could not find container \"aacb41ee75296b30fe484f99ea7be7d89eedb8e0ad8d20c614e61233ead7a68d\": container with ID starting with aacb41ee75296b30fe484f99ea7be7d89eedb8e0ad8d20c614e61233ead7a68d not found: ID does not exist"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.714139 4810 scope.go:117] "RemoveContainer" containerID="ba83a3830aa23a7f50a84bf64a23008ce3bd4119d839094ddf609dc429db2d93"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.717085 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba83a3830aa23a7f50a84bf64a23008ce3bd4119d839094ddf609dc429db2d93"} err="failed to get container status \"ba83a3830aa23a7f50a84bf64a23008ce3bd4119d839094ddf609dc429db2d93\": rpc error: code = NotFound desc = could not find container \"ba83a3830aa23a7f50a84bf64a23008ce3bd4119d839094ddf609dc429db2d93\": container with ID starting with ba83a3830aa23a7f50a84bf64a23008ce3bd4119d839094ddf609dc429db2d93 not found: ID does not exist"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.717349 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.731586 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 15:29:23 crc kubenswrapper[4810]: E0219 15:29:23.734717 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c2f952c-2122-43d9-b006-6967fd2b9029" containerName="glance-httpd"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.734749 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c2f952c-2122-43d9-b006-6967fd2b9029" containerName="glance-httpd"
Feb 19 15:29:23 crc kubenswrapper[4810]: E0219 15:29:23.734771 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a0789f1-099a-4f95-9626-a5ad7da804bc" containerName="glance-httpd"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.734777 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a0789f1-099a-4f95-9626-a5ad7da804bc" containerName="glance-httpd"
Feb 19 15:29:23 crc kubenswrapper[4810]: E0219 15:29:23.734789 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c2f952c-2122-43d9-b006-6967fd2b9029" containerName="glance-log"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.734795 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c2f952c-2122-43d9-b006-6967fd2b9029" containerName="glance-log"
Feb 19 15:29:23 crc kubenswrapper[4810]: E0219 15:29:23.734818 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a0789f1-099a-4f95-9626-a5ad7da804bc" containerName="glance-log"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.734836 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a0789f1-099a-4f95-9626-a5ad7da804bc" containerName="glance-log"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.735048 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a0789f1-099a-4f95-9626-a5ad7da804bc" containerName="glance-httpd"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.735069 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c2f952c-2122-43d9-b006-6967fd2b9029" containerName="glance-httpd"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.735077 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a0789f1-099a-4f95-9626-a5ad7da804bc" containerName="glance-log"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.735094 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c2f952c-2122-43d9-b006-6967fd2b9029" containerName="glance-log"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.735730 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a0789f1-099a-4f95-9626-a5ad7da804bc-httpd-run\") pod \"7a0789f1-099a-4f95-9626-a5ad7da804bc\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") "
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.735825 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a0789f1-099a-4f95-9626-a5ad7da804bc-public-tls-certs\") pod \"7a0789f1-099a-4f95-9626-a5ad7da804bc\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") "
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.735905 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a0789f1-099a-4f95-9626-a5ad7da804bc-scripts\") pod \"7a0789f1-099a-4f95-9626-a5ad7da804bc\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") "
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.736022 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a0789f1-099a-4f95-9626-a5ad7da804bc-config-data\") pod \"7a0789f1-099a-4f95-9626-a5ad7da804bc\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") "
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.736139 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a0789f1-099a-4f95-9626-a5ad7da804bc-logs\") pod \"7a0789f1-099a-4f95-9626-a5ad7da804bc\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") "
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.736294 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0789f1-099a-4f95-9626-a5ad7da804bc-combined-ca-bundle\") pod \"7a0789f1-099a-4f95-9626-a5ad7da804bc\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") "
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.736390 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"7a0789f1-099a-4f95-9626-a5ad7da804bc\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") "
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.736456 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f8d7\" (UniqueName: \"kubernetes.io/projected/7a0789f1-099a-4f95-9626-a5ad7da804bc-kube-api-access-8f8d7\") pod \"7a0789f1-099a-4f95-9626-a5ad7da804bc\" (UID: \"7a0789f1-099a-4f95-9626-a5ad7da804bc\") "
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.736679 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.738513 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.740850 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.741051 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.741563 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a0789f1-099a-4f95-9626-a5ad7da804bc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7a0789f1-099a-4f95-9626-a5ad7da804bc" (UID: "7a0789f1-099a-4f95-9626-a5ad7da804bc"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.741819 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a0789f1-099a-4f95-9626-a5ad7da804bc-logs" (OuterVolumeSpecName: "logs") pod "7a0789f1-099a-4f95-9626-a5ad7da804bc" (UID: "7a0789f1-099a-4f95-9626-a5ad7da804bc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.756037 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "7a0789f1-099a-4f95-9626-a5ad7da804bc" (UID: "7a0789f1-099a-4f95-9626-a5ad7da804bc"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.770646 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a0789f1-099a-4f95-9626-a5ad7da804bc-scripts" (OuterVolumeSpecName: "scripts") pod "7a0789f1-099a-4f95-9626-a5ad7da804bc" (UID: "7a0789f1-099a-4f95-9626-a5ad7da804bc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.773518 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a0789f1-099a-4f95-9626-a5ad7da804bc-kube-api-access-8f8d7" (OuterVolumeSpecName: "kube-api-access-8f8d7") pod "7a0789f1-099a-4f95-9626-a5ad7da804bc" (UID: "7a0789f1-099a-4f95-9626-a5ad7da804bc"). InnerVolumeSpecName "kube-api-access-8f8d7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.806669 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a0789f1-099a-4f95-9626-a5ad7da804bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a0789f1-099a-4f95-9626-a5ad7da804bc" (UID: "7a0789f1-099a-4f95-9626-a5ad7da804bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.816576 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a0789f1-099a-4f95-9626-a5ad7da804bc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7a0789f1-099a-4f95-9626-a5ad7da804bc" (UID: "7a0789f1-099a-4f95-9626-a5ad7da804bc"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.838187 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-config-data\") pod \"glance-default-internal-api-0\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.838267 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-logs\") pod \"glance-default-internal-api-0\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.838285 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.838306 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-scripts\") pod \"glance-default-internal-api-0\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.838376 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.838448 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgrng\" (UniqueName: \"kubernetes.io/projected/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-kube-api-access-tgrng\") pod \"glance-default-internal-api-0\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.838504 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.838528 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.838605 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a0789f1-099a-4f95-9626-a5ad7da804bc-logs\") on node \"crc\" DevicePath \"\""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.838618 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0789f1-099a-4f95-9626-a5ad7da804bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.838656 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.838666 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f8d7\" (UniqueName: \"kubernetes.io/projected/7a0789f1-099a-4f95-9626-a5ad7da804bc-kube-api-access-8f8d7\") on node \"crc\" DevicePath \"\""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.838676 4810 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a0789f1-099a-4f95-9626-a5ad7da804bc-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.838687 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a0789f1-099a-4f95-9626-a5ad7da804bc-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.838695 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a0789f1-099a-4f95-9626-a5ad7da804bc-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.869837 4810 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.911322 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a0789f1-099a-4f95-9626-a5ad7da804bc-config-data" (OuterVolumeSpecName: "config-data") pod "7a0789f1-099a-4f95-9626-a5ad7da804bc" (UID: "7a0789f1-099a-4f95-9626-a5ad7da804bc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.939803 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-config-data\") pod \"glance-default-internal-api-0\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.939888 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-logs\") pod \"glance-default-internal-api-0\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.939918 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.939943 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-scripts\") pod \"glance-default-internal-api-0\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.940027 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.940125 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgrng\" (UniqueName: \"kubernetes.io/projected/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-kube-api-access-tgrng\") pod \"glance-default-internal-api-0\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.940194 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.940231 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.940298 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a0789f1-099a-4f95-9626-a5ad7da804bc-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.940321 4810 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.940932 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.941412 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.941466 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-logs\") pod \"glance-default-internal-api-0\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.959520 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.960461 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-config-data\") pod \"glance-default-internal-api-0\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.961014 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.963143 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-scripts\") pod \"glance-default-internal-api-0\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.971608 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgrng\" (UniqueName: \"kubernetes.io/projected/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-kube-api-access-tgrng\") pod \"glance-default-internal-api-0\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:23 crc kubenswrapper[4810]: I0219 15:29:23.981531 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.078202 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.089961 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.404443 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7a0789f1-099a-4f95-9626-a5ad7da804bc","Type":"ContainerDied","Data":"123c419edcd4407399df0f1f1452e33c4f00d373cd040db3a818757999826cf9"}
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.404728 4810 scope.go:117] "RemoveContainer" containerID="1bdbd9fc9e86820ed204a84ba415c42f653c518a62ccc95617b0bc659f34c23c"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.404833 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.472379 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.490614 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.498895 4810 scope.go:117] "RemoveContainer" containerID="039aba0ed63f1dd6f66e34144e1de83684475b75135af6fd57bb70ce97ffc6b0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.509446 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.510898 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.517671 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.517857 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.534144 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.660646 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-scripts\") pod \"glance-default-external-api-0\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.661006 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h47wl\" (UniqueName: \"kubernetes.io/projected/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-kube-api-access-h47wl\") pod \"glance-default-external-api-0\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.661075 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.661144 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-logs\") pod \"glance-default-external-api-0\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.661171 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.661191 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-config-data\") pod \"glance-default-external-api-0\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.661227 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.661245 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.708858 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.762886 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.762936 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-config-data\") pod \"glance-default-external-api-0\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.762978 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.762997 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.763085 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-scripts\") pod \"glance-default-external-api-0\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.763113 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h47wl\" (UniqueName: \"kubernetes.io/projected/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-kube-api-access-h47wl\") pod \"glance-default-external-api-0\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.763182 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.763207 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-logs\") pod \"glance-default-external-api-0\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.763647 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-logs\") pod \"glance-default-external-api-0\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.764082 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.775494 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-config-data\") pod \"glance-default-external-api-0\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.781614 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.785948 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.786223 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-scripts\") pod \"glance-default-external-api-0\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.797987 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.816856 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h47wl\" (UniqueName: \"kubernetes.io/projected/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-kube-api-access-h47wl\") pod \"glance-default-external-api-0\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.828671 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:29:24 crc kubenswrapper[4810]: I0219 15:29:24.848842 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 19 15:29:25 crc kubenswrapper[4810]: I0219 15:29:25.420571 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"addf00fe-9b9b-41d4-bd81-4e5f2c339fff","Type":"ContainerStarted","Data":"58cd5599b992d54d64de99ed5546382a6f34cf94866c9af8e9254502abddbf03"}
Feb 19 15:29:25 crc kubenswrapper[4810]: I0219 15:29:25.423806 4810 generic.go:334] "Generic (PLEG): container finished" podID="4dd5dede-cf58-43c7-954e-b9b1d33ad8d1" containerID="da258067ce2c7912909dc7c937b6ad45df02bf5c8504937ad6d6f0ea0359724a" exitCode=0
Feb 19 15:29:25 crc kubenswrapper[4810]: I0219 15:29:25.423879 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-j989d" event={"ID":"4dd5dede-cf58-43c7-954e-b9b1d33ad8d1","Type":"ContainerDied","Data":"da258067ce2c7912909dc7c937b6ad45df02bf5c8504937ad6d6f0ea0359724a"}
Feb 19 15:29:25 crc kubenswrapper[4810]: I0219 15:29:25.468905 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a0789f1-099a-4f95-9626-a5ad7da804bc" path="/var/lib/kubelet/pods/7a0789f1-099a-4f95-9626-a5ad7da804bc/volumes"
Feb 19 15:29:25 crc kubenswrapper[4810]: I0219 15:29:25.470042 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c2f952c-2122-43d9-b006-6967fd2b9029" path="/var/lib/kubelet/pods/9c2f952c-2122-43d9-b006-6967fd2b9029/volumes"
Feb 19 15:29:25 crc kubenswrapper[4810]: I0219 15:29:25.470836 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Feb 19 15:29:25 crc kubenswrapper[4810]: I0219 15:29:25.470933 4810 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 19 15:29:25 crc kubenswrapper[4810]: I0219 15:29:25.553533 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 15:29:26 crc kubenswrapper[4810]: I0219 15:29:26.453094 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"addf00fe-9b9b-41d4-bd81-4e5f2c339fff","Type":"ContainerStarted","Data":"1f3fdcf1ede0870a63fe45639c44983223bdd32c1b88c14ab338ab8b9a3f039c"}
Feb 19 15:29:26 crc kubenswrapper[4810]: I0219 15:29:26.468634 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c","Type":"ContainerStarted","Data":"a5b582a70e74643764c5c634d13034b32fb543e1cf8d7471fc29e46f970a5375"}
Feb 19 15:29:26 crc kubenswrapper[4810]: I0219 15:29:26.468681 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c","Type":"ContainerStarted","Data":"8e2ec136ee1702cfd45683995f2deb05321488c4c561ae75b5ecc3c327d09b7a"}
Feb 19 15:29:27 crc kubenswrapper[4810]: I0219 15:29:27.214384 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Feb 19 15:29:27 crc kubenswrapper[4810]: I0219 15:29:27.483912 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"addf00fe-9b9b-41d4-bd81-4e5f2c339fff","Type":"ContainerStarted","Data":"598ccff405c1e1906ae06df4732ddadd78333232ea68a893ba2db6513c96c3dd"}
Feb 19 15:29:27 crc kubenswrapper[4810]: I0219 15:29:27.497789 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c","Type":"ContainerStarted","Data":"d4f8797a868301d8a063b9ee6d65b2b3734521a53ee7809ebafe72723ce9d630"}
Feb 19 15:29:27 crc kubenswrapper[4810]: I0219 15:29:27.535618 4810 generic.go:334] "Generic (PLEG): container finished" podID="36fe6fdb-2970-4773-8184-a2d16b8ca89a" containerID="829a51aca23df8d8763078bcae4b4cba43b6c265996ab11fc55f6d42ce950516" exitCode=0
Feb 19 15:29:27 crc kubenswrapper[4810]: I0219 15:29:27.535672 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7jdcp" event={"ID":"36fe6fdb-2970-4773-8184-a2d16b8ca89a","Type":"ContainerDied","Data":"829a51aca23df8d8763078bcae4b4cba43b6c265996ab11fc55f6d42ce950516"}
Feb 19 15:29:27 crc kubenswrapper[4810]: I0219 15:29:27.536910 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.536888191 podStartE2EDuration="4.536888191s" podCreationTimestamp="2026-02-19 15:29:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:29:27.505218554 +0000 UTC m=+1196.987248678" watchObservedRunningTime="2026-02-19 15:29:27.536888191 +0000 UTC m=+1197.018918315"
Feb 19 15:29:27 crc kubenswrapper[4810]: I0219 15:29:27.583243 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.583221818 podStartE2EDuration="3.583221818s" podCreationTimestamp="2026-02-19 15:29:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:29:27.550675159 +0000 UTC m=+1197.032705283" watchObservedRunningTime="2026-02-19 15:29:27.583221818 +0000 UTC m=+1197.065251942"
Feb 19 15:29:28 crc kubenswrapper[4810]: I0219 15:29:28.162404 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-869f57798-ngdtl"
Feb 19 15:29:28 crc kubenswrapper[4810]: I0219 15:29:28.162465 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-869f57798-ngdtl"
Feb 19 15:29:28 crc kubenswrapper[4810]: I0219 15:29:28.273172 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5f56498b8d-9gwmf"
Feb 19 15:29:28 crc kubenswrapper[4810]: I0219 15:29:28.273232 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5f56498b8d-9gwmf"
Feb 19 15:29:29 crc kubenswrapper[4810]: I0219 15:29:29.078415 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0"
Feb 19 15:29:29 crc kubenswrapper[4810]: I0219 15:29:29.112273 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0"
Feb 19 15:29:29 crc kubenswrapper[4810]: I0219 15:29:29.125709 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Feb 19 15:29:29 crc kubenswrapper[4810]: E0219 15:29:29.126403 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f9469eea900aae9e94121b7851a3eacdafc431e0f15d0e8319f5a93a2616b851 is running failed: container process not found" containerID="f9469eea900aae9e94121b7851a3eacdafc431e0f15d0e8319f5a93a2616b851" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"]
Feb 19 15:29:29 crc kubenswrapper[4810]: E0219 15:29:29.127699 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f9469eea900aae9e94121b7851a3eacdafc431e0f15d0e8319f5a93a2616b851 is running failed: container process not found" containerID="f9469eea900aae9e94121b7851a3eacdafc431e0f15d0e8319f5a93a2616b851" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"]
Feb 19 15:29:29 crc kubenswrapper[4810]: E0219 15:29:29.128021 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f9469eea900aae9e94121b7851a3eacdafc431e0f15d0e8319f5a93a2616b851 is running failed: container process not found" containerID="f9469eea900aae9e94121b7851a3eacdafc431e0f15d0e8319f5a93a2616b851" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"]
Feb 19 15:29:29 crc kubenswrapper[4810]: E0219 15:29:29.128056 4810 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f9469eea900aae9e94121b7851a3eacdafc431e0f15d0e8319f5a93a2616b851 is running failed: container process not found" probeType="Startup" pod="openstack/watcher-decision-engine-0" podUID="3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" containerName="watcher-decision-engine"
Feb 19 15:29:29 crc kubenswrapper[4810]: I0219 15:29:29.237251 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-849c785789-5xrh2"
Feb 19 15:29:29 crc kubenswrapper[4810]: I0219 15:29:29.557379 4810 generic.go:334] "Generic (PLEG): container finished" podID="92797675-ddf7-43cf-90af-0248cf097509" containerID="aa561f23770b052d6b320e47499c0a8789e25a7a2367b69634f88f903c8d780a" exitCode=0
Feb 19 15:29:29 crc kubenswrapper[4810]: I0219 15:29:29.557476 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-thnc7" event={"ID":"92797675-ddf7-43cf-90af-0248cf097509","Type":"ContainerDied","Data":"aa561f23770b052d6b320e47499c0a8789e25a7a2367b69634f88f903c8d780a"}
Feb 19 15:29:29 crc kubenswrapper[4810]: I0219 15:29:29.559107 4810 generic.go:334] "Generic (PLEG): container finished" podID="3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" containerID="f9469eea900aae9e94121b7851a3eacdafc431e0f15d0e8319f5a93a2616b851" exitCode=1
Feb 19 15:29:29 crc kubenswrapper[4810]: I0219 15:29:29.559185 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5","Type":"ContainerDied","Data":"f9469eea900aae9e94121b7851a3eacdafc431e0f15d0e8319f5a93a2616b851"}
Feb 19 15:29:29 crc kubenswrapper[4810]: I0219 15:29:29.559939 4810 scope.go:117] "RemoveContainer" containerID="f9469eea900aae9e94121b7851a3eacdafc431e0f15d0e8319f5a93a2616b851"
Feb 19 15:29:29 crc kubenswrapper[4810]: I0219 15:29:29.587848 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0"
Feb 19 15:29:29 crc kubenswrapper[4810]: I0219 15:29:29.612513 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64"
Feb 19 15:29:29 crc kubenswrapper[4810]: I0219 15:29:29.615380 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"]
Feb 19 15:29:29 crc kubenswrapper[4810]: I0219 15:29:29.664517 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6999bddfcf-fzf7g"]
Feb 19 15:29:29 crc kubenswrapper[4810]: I0219 15:29:29.665114 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" podUID="2584fed3-16a4-489e-bf03-1c7461e9d3d8" containerName="dnsmasq-dns" containerID="cri-o://66285c6ba52e055f610f9273f522d8c7987aa81e9fe271b980a52b4e8dbd8794" gracePeriod=10
Feb 19 15:29:30 crc kubenswrapper[4810]: I0219 15:29:30.450196 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0"
Feb 19 15:29:30 crc kubenswrapper[4810]: I0219 15:29:30.455379 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0"
Feb 19 15:29:30 crc kubenswrapper[4810]: I0219 15:29:30.570317 4810 generic.go:334] "Generic (PLEG): container finished" podID="2584fed3-16a4-489e-bf03-1c7461e9d3d8" containerID="66285c6ba52e055f610f9273f522d8c7987aa81e9fe271b980a52b4e8dbd8794" exitCode=0
Feb 19 15:29:30 crc kubenswrapper[4810]: I0219 15:29:30.570465 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" event={"ID":"2584fed3-16a4-489e-bf03-1c7461e9d3d8","Type":"ContainerDied","Data":"66285c6ba52e055f610f9273f522d8c7987aa81e9fe271b980a52b4e8dbd8794"}
Feb 19 15:29:30 crc kubenswrapper[4810]: I0219 15:29:30.575309 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Feb 19 15:29:31 crc kubenswrapper[4810]: I0219 15:29:31.578132 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" podUID="4be760b2-263c-4b89-8bdf-ecf98114a24f" containerName="watcher-applier" containerID="cri-o://8c99f71d93423dbb260d03502181a976def39a613d97465a06298513d67bb0bf" gracePeriod=30
Feb 19 15:29:31 crc kubenswrapper[4810]: I0219 15:29:31.722567 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-668f7d7fb5-l5kpq"
Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.142685 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-j989d"
Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.162801 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-thnc7"
Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.177728 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7jdcp"
Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.240230 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sllpm\" (UniqueName: \"kubernetes.io/projected/4dd5dede-cf58-43c7-954e-b9b1d33ad8d1-kube-api-access-sllpm\") pod \"4dd5dede-cf58-43c7-954e-b9b1d33ad8d1\" (UID: \"4dd5dede-cf58-43c7-954e-b9b1d33ad8d1\") "
Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.240537 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd5dede-cf58-43c7-954e-b9b1d33ad8d1-combined-ca-bundle\") pod \"4dd5dede-cf58-43c7-954e-b9b1d33ad8d1\" (UID: \"4dd5dede-cf58-43c7-954e-b9b1d33ad8d1\") "
Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.240593 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-credential-keys\") pod \"92797675-ddf7-43cf-90af-0248cf097509\" (UID: \"92797675-ddf7-43cf-90af-0248cf097509\") "
Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.240697 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-combined-ca-bundle\") pod \"92797675-ddf7-43cf-90af-0248cf097509\" (UID: \"92797675-ddf7-43cf-90af-0248cf097509\") "
Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.240717 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4dd5dede-cf58-43c7-954e-b9b1d33ad8d1-config\") pod \"4dd5dede-cf58-43c7-954e-b9b1d33ad8d1\" (UID: \"4dd5dede-cf58-43c7-954e-b9b1d33ad8d1\") "
Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.240787 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-scripts\") pod \"92797675-ddf7-43cf-90af-0248cf097509\" (UID: \"92797675-ddf7-43cf-90af-0248cf097509\") "
Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.240807 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-config-data\") pod \"92797675-ddf7-43cf-90af-0248cf097509\" (UID: \"92797675-ddf7-43cf-90af-0248cf097509\") "
Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.240843 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxtzf\" (UniqueName: \"kubernetes.io/projected/92797675-ddf7-43cf-90af-0248cf097509-kube-api-access-gxtzf\") pod \"92797675-ddf7-43cf-90af-0248cf097509\" (UID: \"92797675-ddf7-43cf-90af-0248cf097509\") "
Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.240873 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-fernet-keys\") pod \"92797675-ddf7-43cf-90af-0248cf097509\" (UID: \"92797675-ddf7-43cf-90af-0248cf097509\") "
Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.244497 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "92797675-ddf7-43cf-90af-0248cf097509" (UID: "92797675-ddf7-43cf-90af-0248cf097509"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.252089 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dd5dede-cf58-43c7-954e-b9b1d33ad8d1-kube-api-access-sllpm" (OuterVolumeSpecName: "kube-api-access-sllpm") pod "4dd5dede-cf58-43c7-954e-b9b1d33ad8d1" (UID: "4dd5dede-cf58-43c7-954e-b9b1d33ad8d1"). InnerVolumeSpecName "kube-api-access-sllpm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.252547 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92797675-ddf7-43cf-90af-0248cf097509-kube-api-access-gxtzf" (OuterVolumeSpecName: "kube-api-access-gxtzf") pod "92797675-ddf7-43cf-90af-0248cf097509" (UID: "92797675-ddf7-43cf-90af-0248cf097509"). InnerVolumeSpecName "kube-api-access-gxtzf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.259377 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-scripts" (OuterVolumeSpecName: "scripts") pod "92797675-ddf7-43cf-90af-0248cf097509" (UID: "92797675-ddf7-43cf-90af-0248cf097509"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.265020 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "92797675-ddf7-43cf-90af-0248cf097509" (UID: "92797675-ddf7-43cf-90af-0248cf097509"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.278429 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92797675-ddf7-43cf-90af-0248cf097509" (UID: "92797675-ddf7-43cf-90af-0248cf097509"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.297064 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dd5dede-cf58-43c7-954e-b9b1d33ad8d1-config" (OuterVolumeSpecName: "config") pod "4dd5dede-cf58-43c7-954e-b9b1d33ad8d1" (UID: "4dd5dede-cf58-43c7-954e-b9b1d33ad8d1"). InnerVolumeSpecName "config".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.299135 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dd5dede-cf58-43c7-954e-b9b1d33ad8d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4dd5dede-cf58-43c7-954e-b9b1d33ad8d1" (UID: "4dd5dede-cf58-43c7-954e-b9b1d33ad8d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.301566 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-config-data" (OuterVolumeSpecName: "config-data") pod "92797675-ddf7-43cf-90af-0248cf097509" (UID: "92797675-ddf7-43cf-90af-0248cf097509"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.342812 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fe6fdb-2970-4773-8184-a2d16b8ca89a-config-data\") pod \"36fe6fdb-2970-4773-8184-a2d16b8ca89a\" (UID: \"36fe6fdb-2970-4773-8184-a2d16b8ca89a\") " Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.343097 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36fe6fdb-2970-4773-8184-a2d16b8ca89a-logs\") pod \"36fe6fdb-2970-4773-8184-a2d16b8ca89a\" (UID: \"36fe6fdb-2970-4773-8184-a2d16b8ca89a\") " Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.343468 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36fe6fdb-2970-4773-8184-a2d16b8ca89a-logs" (OuterVolumeSpecName: "logs") pod "36fe6fdb-2970-4773-8184-a2d16b8ca89a" (UID: "36fe6fdb-2970-4773-8184-a2d16b8ca89a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.343698 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fe6fdb-2970-4773-8184-a2d16b8ca89a-combined-ca-bundle\") pod \"36fe6fdb-2970-4773-8184-a2d16b8ca89a\" (UID: \"36fe6fdb-2970-4773-8184-a2d16b8ca89a\") " Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.343878 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36fe6fdb-2970-4773-8184-a2d16b8ca89a-scripts\") pod \"36fe6fdb-2970-4773-8184-a2d16b8ca89a\" (UID: \"36fe6fdb-2970-4773-8184-a2d16b8ca89a\") " Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.344248 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qksxk\" (UniqueName: \"kubernetes.io/projected/36fe6fdb-2970-4773-8184-a2d16b8ca89a-kube-api-access-qksxk\") pod \"36fe6fdb-2970-4773-8184-a2d16b8ca89a\" (UID: \"36fe6fdb-2970-4773-8184-a2d16b8ca89a\") " Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.344944 4810 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.345040 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.345224 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4dd5dede-cf58-43c7-954e-b9b1d33ad8d1-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.345295 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.345522 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.345621 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36fe6fdb-2970-4773-8184-a2d16b8ca89a-logs\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.345786 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxtzf\" (UniqueName: \"kubernetes.io/projected/92797675-ddf7-43cf-90af-0248cf097509-kube-api-access-gxtzf\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.345860 4810 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/92797675-ddf7-43cf-90af-0248cf097509-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.345913 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sllpm\" (UniqueName: \"kubernetes.io/projected/4dd5dede-cf58-43c7-954e-b9b1d33ad8d1-kube-api-access-sllpm\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.345987 
4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd5dede-cf58-43c7-954e-b9b1d33ad8d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.350150 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36fe6fdb-2970-4773-8184-a2d16b8ca89a-scripts" (OuterVolumeSpecName: "scripts") pod "36fe6fdb-2970-4773-8184-a2d16b8ca89a" (UID: "36fe6fdb-2970-4773-8184-a2d16b8ca89a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.350761 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36fe6fdb-2970-4773-8184-a2d16b8ca89a-kube-api-access-qksxk" (OuterVolumeSpecName: "kube-api-access-qksxk") pod "36fe6fdb-2970-4773-8184-a2d16b8ca89a" (UID: "36fe6fdb-2970-4773-8184-a2d16b8ca89a"). InnerVolumeSpecName "kube-api-access-qksxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.396697 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.452383 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qksxk\" (UniqueName: \"kubernetes.io/projected/36fe6fdb-2970-4773-8184-a2d16b8ca89a-kube-api-access-qksxk\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.452412 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36fe6fdb-2970-4773-8184-a2d16b8ca89a-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.511198 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36fe6fdb-2970-4773-8184-a2d16b8ca89a-config-data" (OuterVolumeSpecName: "config-data") pod "36fe6fdb-2970-4773-8184-a2d16b8ca89a" (UID: "36fe6fdb-2970-4773-8184-a2d16b8ca89a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.527817 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36fe6fdb-2970-4773-8184-a2d16b8ca89a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36fe6fdb-2970-4773-8184-a2d16b8ca89a" (UID: "36fe6fdb-2970-4773-8184-a2d16b8ca89a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.553763 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-ovsdbserver-nb\") pod \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\" (UID: \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\") " Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.553811 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-ovsdbserver-sb\") pod \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\" (UID: \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\") " Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.553849 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckj5k\" (UniqueName: \"kubernetes.io/projected/2584fed3-16a4-489e-bf03-1c7461e9d3d8-kube-api-access-ckj5k\") pod \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\" (UID: \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\") " Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.553962 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-dns-svc\") pod \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\" (UID: \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\") " Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.554098 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-config\") pod \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\" (UID: \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\") " Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.554155 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-dns-swift-storage-0\") pod \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\" (UID: \"2584fed3-16a4-489e-bf03-1c7461e9d3d8\") " Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.554683 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fe6fdb-2970-4773-8184-a2d16b8ca89a-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.554711 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fe6fdb-2970-4773-8184-a2d16b8ca89a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.560672 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2584fed3-16a4-489e-bf03-1c7461e9d3d8-kube-api-access-ckj5k" (OuterVolumeSpecName: "kube-api-access-ckj5k") pod "2584fed3-16a4-489e-bf03-1c7461e9d3d8" (UID: "2584fed3-16a4-489e-bf03-1c7461e9d3d8"). InnerVolumeSpecName "kube-api-access-ckj5k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.594244 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8cd44d0-7395-44e1-9112-9e8bb4198b93","Type":"ContainerStarted","Data":"1b6e83705ca6e6c238d2cc26ae7440d08d4c7c41779dea1e88d4da0c7c6c4ca7"} Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.597134 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-j989d" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.600506 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-j989d" event={"ID":"4dd5dede-cf58-43c7-954e-b9b1d33ad8d1","Type":"ContainerDied","Data":"af9963a31801a20460aedc2d93f7da81f8b9a7c2e7ec298ae21e257191169331"} Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.600545 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af9963a31801a20460aedc2d93f7da81f8b9a7c2e7ec298ae21e257191169331" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.603113 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7jdcp" event={"ID":"36fe6fdb-2970-4773-8184-a2d16b8ca89a","Type":"ContainerDied","Data":"63add7d4ad95ec66116fce5bf4ebf1368a0ffff3203d7f02a7456da5980c8ad7"} Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.603168 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63add7d4ad95ec66116fce5bf4ebf1368a0ffff3203d7f02a7456da5980c8ad7" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.603254 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7jdcp" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.606914 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2584fed3-16a4-489e-bf03-1c7461e9d3d8" (UID: "2584fed3-16a4-489e-bf03-1c7461e9d3d8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.607116 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2584fed3-16a4-489e-bf03-1c7461e9d3d8" (UID: "2584fed3-16a4-489e-bf03-1c7461e9d3d8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.609552 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-thnc7" event={"ID":"92797675-ddf7-43cf-90af-0248cf097509","Type":"ContainerDied","Data":"29912d3ef9b6f31b1e73feb89f2d45be94dbb1cafdcd248d477030898edd801f"} Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.609585 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29912d3ef9b6f31b1e73feb89f2d45be94dbb1cafdcd248d477030898edd801f" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.609659 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-thnc7" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.612802 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2584fed3-16a4-489e-bf03-1c7461e9d3d8" (UID: "2584fed3-16a4-489e-bf03-1c7461e9d3d8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.617808 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2584fed3-16a4-489e-bf03-1c7461e9d3d8" (UID: "2584fed3-16a4-489e-bf03-1c7461e9d3d8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.618573 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" event={"ID":"2584fed3-16a4-489e-bf03-1c7461e9d3d8","Type":"ContainerDied","Data":"31b96436742b6e2f2b15881309695bfa289fff7a6145248f7da06333643cac1d"} Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.618614 4810 scope.go:117] "RemoveContainer" containerID="66285c6ba52e055f610f9273f522d8c7987aa81e9fe271b980a52b4e8dbd8794" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.618745 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6999bddfcf-fzf7g" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.625989 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5","Type":"ContainerStarted","Data":"ea64343d53131969232078700626ee5996ba47b778be09a59da7ab42c9aa2631"} Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.640085 4810 scope.go:117] "RemoveContainer" containerID="821ae759a5ba32197a93051b435e6f01263ceab4bc6d3d77eccce527b39b8143" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.644847 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-config" (OuterVolumeSpecName: "config") pod "2584fed3-16a4-489e-bf03-1c7461e9d3d8" (UID: "2584fed3-16a4-489e-bf03-1c7461e9d3d8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.656154 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.656189 4810 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.656202 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.656214 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.656226 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckj5k\" (UniqueName: \"kubernetes.io/projected/2584fed3-16a4-489e-bf03-1c7461e9d3d8-kube-api-access-ckj5k\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.656239 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2584fed3-16a4-489e-bf03-1c7461e9d3d8-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.949363 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6999bddfcf-fzf7g"] Feb 19 15:29:32 crc kubenswrapper[4810]: I0219 15:29:32.955949 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6999bddfcf-fzf7g"] Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.355847 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6cd8bf58f4-ktsjk"] Feb 19 15:29:33 crc kubenswrapper[4810]: E0219 15:29:33.356563 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2584fed3-16a4-489e-bf03-1c7461e9d3d8" containerName="dnsmasq-dns" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.356583 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2584fed3-16a4-489e-bf03-1c7461e9d3d8" containerName="dnsmasq-dns" Feb 19 15:29:33 crc kubenswrapper[4810]: E0219 15:29:33.356597 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92797675-ddf7-43cf-90af-0248cf097509" containerName="keystone-bootstrap" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.356605 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="92797675-ddf7-43cf-90af-0248cf097509" containerName="keystone-bootstrap" Feb 19 15:29:33 crc kubenswrapper[4810]: E0219 15:29:33.356631 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2584fed3-16a4-489e-bf03-1c7461e9d3d8" containerName="init" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.356639 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2584fed3-16a4-489e-bf03-1c7461e9d3d8" containerName="init" Feb 19 15:29:33 crc kubenswrapper[4810]: E0219 15:29:33.356654 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd5dede-cf58-43c7-954e-b9b1d33ad8d1" containerName="neutron-db-sync" Feb 19 15:29:33 crc 
kubenswrapper[4810]: I0219 15:29:33.356661 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd5dede-cf58-43c7-954e-b9b1d33ad8d1" containerName="neutron-db-sync" Feb 19 15:29:33 crc kubenswrapper[4810]: E0219 15:29:33.356677 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36fe6fdb-2970-4773-8184-a2d16b8ca89a" containerName="placement-db-sync" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.356685 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="36fe6fdb-2970-4773-8184-a2d16b8ca89a" containerName="placement-db-sync" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.356891 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="2584fed3-16a4-489e-bf03-1c7461e9d3d8" containerName="dnsmasq-dns" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.356918 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="92797675-ddf7-43cf-90af-0248cf097509" containerName="keystone-bootstrap" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.356932 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="36fe6fdb-2970-4773-8184-a2d16b8ca89a" containerName="placement-db-sync" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.356944 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dd5dede-cf58-43c7-954e-b9b1d33ad8d1" containerName="neutron-db-sync" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.357609 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6cd8bf58f4-ktsjk" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.359679 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.359825 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-j78zz" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.359962 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.361587 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.361779 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.369894 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-68ff886dc8-nntj6"] Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.371600 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-68ff886dc8-nntj6" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.372529 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.377089 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.377273 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.377774 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qpnp7" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.378048 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.378277 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.401645 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6cd8bf58f4-ktsjk"] Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.436376 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-68ff886dc8-nntj6"] Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.463444 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2584fed3-16a4-489e-bf03-1c7461e9d3d8" path="/var/lib/kubelet/pods/2584fed3-16a4-489e-bf03-1c7461e9d3d8/volumes" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.479285 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95165d88-ea72-4785-8c1a-eea4d54466fb-public-tls-certs\") pod \"keystone-6cd8bf58f4-ktsjk\" (UID: \"95165d88-ea72-4785-8c1a-eea4d54466fb\") " pod="openstack/keystone-6cd8bf58f4-ktsjk" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.479352 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjl2x\" (UniqueName: \"kubernetes.io/projected/95165d88-ea72-4785-8c1a-eea4d54466fb-kube-api-access-qjl2x\") pod \"keystone-6cd8bf58f4-ktsjk\" (UID: \"95165d88-ea72-4785-8c1a-eea4d54466fb\") " pod="openstack/keystone-6cd8bf58f4-ktsjk" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.479402 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95165d88-ea72-4785-8c1a-eea4d54466fb-combined-ca-bundle\") pod \"keystone-6cd8bf58f4-ktsjk\" (UID: \"95165d88-ea72-4785-8c1a-eea4d54466fb\") " pod="openstack/keystone-6cd8bf58f4-ktsjk" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.479458 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-internal-tls-certs\") pod \"placement-68ff886dc8-nntj6\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " pod="openstack/placement-68ff886dc8-nntj6" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.479519 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/95165d88-ea72-4785-8c1a-eea4d54466fb-internal-tls-certs\") pod \"keystone-6cd8bf58f4-ktsjk\" (UID: \"95165d88-ea72-4785-8c1a-eea4d54466fb\") " pod="openstack/keystone-6cd8bf58f4-ktsjk" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.479545 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95165d88-ea72-4785-8c1a-eea4d54466fb-config-data\") pod \"keystone-6cd8bf58f4-ktsjk\" (UID: \"95165d88-ea72-4785-8c1a-eea4d54466fb\") " pod="openstack/keystone-6cd8bf58f4-ktsjk" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.479574 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-public-tls-certs\") pod \"placement-68ff886dc8-nntj6\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " pod="openstack/placement-68ff886dc8-nntj6" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.479599 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln9c7\" (UniqueName: \"kubernetes.io/projected/e0116ca5-826e-4a77-bc6f-11e89c047af8-kube-api-access-ln9c7\") pod \"placement-68ff886dc8-nntj6\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " pod="openstack/placement-68ff886dc8-nntj6" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.479615 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-config-data\") pod \"placement-68ff886dc8-nntj6\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " pod="openstack/placement-68ff886dc8-nntj6" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.479645 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/95165d88-ea72-4785-8c1a-eea4d54466fb-fernet-keys\") pod \"keystone-6cd8bf58f4-ktsjk\" (UID: \"95165d88-ea72-4785-8c1a-eea4d54466fb\") " pod="openstack/keystone-6cd8bf58f4-ktsjk" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.479661 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/95165d88-ea72-4785-8c1a-eea4d54466fb-credential-keys\") pod \"keystone-6cd8bf58f4-ktsjk\" (UID: \"95165d88-ea72-4785-8c1a-eea4d54466fb\") " pod="openstack/keystone-6cd8bf58f4-ktsjk" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.479702 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0116ca5-826e-4a77-bc6f-11e89c047af8-logs\") pod \"placement-68ff886dc8-nntj6\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " pod="openstack/placement-68ff886dc8-nntj6" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.479726 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-combined-ca-bundle\") pod \"placement-68ff886dc8-nntj6\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " pod="openstack/placement-68ff886dc8-nntj6" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.479743 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95165d88-ea72-4785-8c1a-eea4d54466fb-scripts\") pod \"keystone-6cd8bf58f4-ktsjk\" (UID: \"95165d88-ea72-4785-8c1a-eea4d54466fb\") " pod="openstack/keystone-6cd8bf58f4-ktsjk" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.479804 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-scripts\") pod \"placement-68ff886dc8-nntj6\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " pod="openstack/placement-68ff886dc8-nntj6" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.510397 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cf5f86dff-7482l"] Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.511916 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cf5f86dff-7482l" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.518540 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-695bb7cdc6-72zs2"] Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.519981 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-695bb7cdc6-72zs2" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.522639 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.522875 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.522989 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.523126 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xcpvh" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.535268 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cf5f86dff-7482l"] Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.557804 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-695bb7cdc6-72zs2"] Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.586989 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-combined-ca-bundle\") pod \"placement-68ff886dc8-nntj6\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " pod="openstack/placement-68ff886dc8-nntj6" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.587207 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95165d88-ea72-4785-8c1a-eea4d54466fb-scripts\") pod \"keystone-6cd8bf58f4-ktsjk\" (UID: \"95165d88-ea72-4785-8c1a-eea4d54466fb\") " pod="openstack/keystone-6cd8bf58f4-ktsjk" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.587285 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f0b73197-3c7e-44c3-8a49-35d9e0a40629-config\") pod \"neutron-695bb7cdc6-72zs2\" (UID: \"f0b73197-3c7e-44c3-8a49-35d9e0a40629\") " pod="openstack/neutron-695bb7cdc6-72zs2" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.587400 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drqw4\" (UniqueName: \"kubernetes.io/projected/f0b73197-3c7e-44c3-8a49-35d9e0a40629-kube-api-access-drqw4\") pod \"neutron-695bb7cdc6-72zs2\" (UID: \"f0b73197-3c7e-44c3-8a49-35d9e0a40629\") " pod="openstack/neutron-695bb7cdc6-72zs2" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.587494 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0b73197-3c7e-44c3-8a49-35d9e0a40629-combined-ca-bundle\") pod \"neutron-695bb7cdc6-72zs2\" (UID: \"f0b73197-3c7e-44c3-8a49-35d9e0a40629\") " pod="openstack/neutron-695bb7cdc6-72zs2" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.587597 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-scripts\") pod \"placement-68ff886dc8-nntj6\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " pod="openstack/placement-68ff886dc8-nntj6" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.587673 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95165d88-ea72-4785-8c1a-eea4d54466fb-public-tls-certs\") pod \"keystone-6cd8bf58f4-ktsjk\" (UID: \"95165d88-ea72-4785-8c1a-eea4d54466fb\") " pod="openstack/keystone-6cd8bf58f4-ktsjk" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.587736 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjl2x\" (UniqueName: \"kubernetes.io/projected/95165d88-ea72-4785-8c1a-eea4d54466fb-kube-api-access-qjl2x\") pod \"keystone-6cd8bf58f4-ktsjk\" (UID: \"95165d88-ea72-4785-8c1a-eea4d54466fb\") " pod="openstack/keystone-6cd8bf58f4-ktsjk" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.587803 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhf9w\" (UniqueName: \"kubernetes.io/projected/4f9534ee-827a-49fb-8588-a5a8a494be3c-kube-api-access-vhf9w\") pod \"dnsmasq-dns-6cf5f86dff-7482l\" (UID: \"4f9534ee-827a-49fb-8588-a5a8a494be3c\") " pod="openstack/dnsmasq-dns-6cf5f86dff-7482l" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.587873 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-ovsdbserver-nb\") pod \"dnsmasq-dns-6cf5f86dff-7482l\" (UID: \"4f9534ee-827a-49fb-8588-a5a8a494be3c\") " pod="openstack/dnsmasq-dns-6cf5f86dff-7482l" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.587940 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f0b73197-3c7e-44c3-8a49-35d9e0a40629-httpd-config\") pod \"neutron-695bb7cdc6-72zs2\" (UID: \"f0b73197-3c7e-44c3-8a49-35d9e0a40629\") " pod="openstack/neutron-695bb7cdc6-72zs2" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.588015 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95165d88-ea72-4785-8c1a-eea4d54466fb-combined-ca-bundle\") pod \"keystone-6cd8bf58f4-ktsjk\" (UID: \"95165d88-ea72-4785-8c1a-eea4d54466fb\") " pod="openstack/keystone-6cd8bf58f4-ktsjk" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.588102 4810 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-internal-tls-certs\") pod \"placement-68ff886dc8-nntj6\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " pod="openstack/placement-68ff886dc8-nntj6" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.588224 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-dns-swift-storage-0\") pod \"dnsmasq-dns-6cf5f86dff-7482l\" (UID: \"4f9534ee-827a-49fb-8588-a5a8a494be3c\") " pod="openstack/dnsmasq-dns-6cf5f86dff-7482l" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.588341 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95165d88-ea72-4785-8c1a-eea4d54466fb-internal-tls-certs\") pod \"keystone-6cd8bf58f4-ktsjk\" (UID: \"95165d88-ea72-4785-8c1a-eea4d54466fb\") " pod="openstack/keystone-6cd8bf58f4-ktsjk" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.588444 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95165d88-ea72-4785-8c1a-eea4d54466fb-config-data\") pod \"keystone-6cd8bf58f4-ktsjk\" (UID: \"95165d88-ea72-4785-8c1a-eea4d54466fb\") " pod="openstack/keystone-6cd8bf58f4-ktsjk" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.588542 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-dns-svc\") pod \"dnsmasq-dns-6cf5f86dff-7482l\" (UID: \"4f9534ee-827a-49fb-8588-a5a8a494be3c\") " pod="openstack/dnsmasq-dns-6cf5f86dff-7482l" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.588641 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-public-tls-certs\") pod \"placement-68ff886dc8-nntj6\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " pod="openstack/placement-68ff886dc8-nntj6" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.588744 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln9c7\" (UniqueName: \"kubernetes.io/projected/e0116ca5-826e-4a77-bc6f-11e89c047af8-kube-api-access-ln9c7\") pod \"placement-68ff886dc8-nntj6\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " pod="openstack/placement-68ff886dc8-nntj6" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.588839 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-config-data\") pod \"placement-68ff886dc8-nntj6\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " pod="openstack/placement-68ff886dc8-nntj6" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.588960 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-config\") pod \"dnsmasq-dns-6cf5f86dff-7482l\" (UID: \"4f9534ee-827a-49fb-8588-a5a8a494be3c\") " pod="openstack/dnsmasq-dns-6cf5f86dff-7482l" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.589054 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0b73197-3c7e-44c3-8a49-35d9e0a40629-ovndb-tls-certs\") pod \"neutron-695bb7cdc6-72zs2\" (UID: \"f0b73197-3c7e-44c3-8a49-35d9e0a40629\") " pod="openstack/neutron-695bb7cdc6-72zs2" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.589158 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/95165d88-ea72-4785-8c1a-eea4d54466fb-fernet-keys\") pod \"keystone-6cd8bf58f4-ktsjk\" (UID: \"95165d88-ea72-4785-8c1a-eea4d54466fb\") " pod="openstack/keystone-6cd8bf58f4-ktsjk" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.589242 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/95165d88-ea72-4785-8c1a-eea4d54466fb-credential-keys\") pod \"keystone-6cd8bf58f4-ktsjk\" (UID: \"95165d88-ea72-4785-8c1a-eea4d54466fb\") " pod="openstack/keystone-6cd8bf58f4-ktsjk" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.589382 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-ovsdbserver-sb\") pod \"dnsmasq-dns-6cf5f86dff-7482l\" (UID: \"4f9534ee-827a-49fb-8588-a5a8a494be3c\") " pod="openstack/dnsmasq-dns-6cf5f86dff-7482l" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.596769 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0116ca5-826e-4a77-bc6f-11e89c047af8-logs\") pod \"placement-68ff886dc8-nntj6\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " pod="openstack/placement-68ff886dc8-nntj6" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.597434 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0116ca5-826e-4a77-bc6f-11e89c047af8-logs\") pod \"placement-68ff886dc8-nntj6\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " pod="openstack/placement-68ff886dc8-nntj6" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.602972 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95165d88-ea72-4785-8c1a-eea4d54466fb-combined-ca-bundle\") pod \"keystone-6cd8bf58f4-ktsjk\" (UID: \"95165d88-ea72-4785-8c1a-eea4d54466fb\") " pod="openstack/keystone-6cd8bf58f4-ktsjk" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.604132 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-scripts\") pod \"placement-68ff886dc8-nntj6\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " pod="openstack/placement-68ff886dc8-nntj6" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.605757 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95165d88-ea72-4785-8c1a-eea4d54466fb-config-data\") pod \"keystone-6cd8bf58f4-ktsjk\" (UID: \"95165d88-ea72-4785-8c1a-eea4d54466fb\") " pod="openstack/keystone-6cd8bf58f4-ktsjk" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.606318 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95165d88-ea72-4785-8c1a-eea4d54466fb-internal-tls-certs\") pod 
\"keystone-6cd8bf58f4-ktsjk\" (UID: \"95165d88-ea72-4785-8c1a-eea4d54466fb\") " pod="openstack/keystone-6cd8bf58f4-ktsjk" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.606818 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-combined-ca-bundle\") pod \"placement-68ff886dc8-nntj6\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " pod="openstack/placement-68ff886dc8-nntj6" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.611602 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95165d88-ea72-4785-8c1a-eea4d54466fb-public-tls-certs\") pod \"keystone-6cd8bf58f4-ktsjk\" (UID: \"95165d88-ea72-4785-8c1a-eea4d54466fb\") " pod="openstack/keystone-6cd8bf58f4-ktsjk" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.614712 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95165d88-ea72-4785-8c1a-eea4d54466fb-scripts\") pod \"keystone-6cd8bf58f4-ktsjk\" (UID: \"95165d88-ea72-4785-8c1a-eea4d54466fb\") " pod="openstack/keystone-6cd8bf58f4-ktsjk" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.615029 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-internal-tls-certs\") pod \"placement-68ff886dc8-nntj6\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " pod="openstack/placement-68ff886dc8-nntj6" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.615149 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/95165d88-ea72-4785-8c1a-eea4d54466fb-fernet-keys\") pod \"keystone-6cd8bf58f4-ktsjk\" (UID: \"95165d88-ea72-4785-8c1a-eea4d54466fb\") " pod="openstack/keystone-6cd8bf58f4-ktsjk" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.615593 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-public-tls-certs\") pod \"placement-68ff886dc8-nntj6\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " pod="openstack/placement-68ff886dc8-nntj6" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.615898 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/95165d88-ea72-4785-8c1a-eea4d54466fb-credential-keys\") pod \"keystone-6cd8bf58f4-ktsjk\" (UID: \"95165d88-ea72-4785-8c1a-eea4d54466fb\") " pod="openstack/keystone-6cd8bf58f4-ktsjk" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.617924 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-config-data\") pod \"placement-68ff886dc8-nntj6\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " pod="openstack/placement-68ff886dc8-nntj6" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.628894 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln9c7\" (UniqueName: \"kubernetes.io/projected/e0116ca5-826e-4a77-bc6f-11e89c047af8-kube-api-access-ln9c7\") pod \"placement-68ff886dc8-nntj6\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " pod="openstack/placement-68ff886dc8-nntj6" Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 
15:29:33.645454 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjl2x\" (UniqueName: \"kubernetes.io/projected/95165d88-ea72-4785-8c1a-eea4d54466fb-kube-api-access-qjl2x\") pod \"keystone-6cd8bf58f4-ktsjk\" (UID: \"95165d88-ea72-4785-8c1a-eea4d54466fb\") " pod="openstack/keystone-6cd8bf58f4-ktsjk"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.685976 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6cd8bf58f4-ktsjk"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.703304 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-config\") pod \"dnsmasq-dns-6cf5f86dff-7482l\" (UID: \"4f9534ee-827a-49fb-8588-a5a8a494be3c\") " pod="openstack/dnsmasq-dns-6cf5f86dff-7482l"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.703452 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0b73197-3c7e-44c3-8a49-35d9e0a40629-ovndb-tls-certs\") pod \"neutron-695bb7cdc6-72zs2\" (UID: \"f0b73197-3c7e-44c3-8a49-35d9e0a40629\") " pod="openstack/neutron-695bb7cdc6-72zs2"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.703506 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-ovsdbserver-sb\") pod \"dnsmasq-dns-6cf5f86dff-7482l\" (UID: \"4f9534ee-827a-49fb-8588-a5a8a494be3c\") " pod="openstack/dnsmasq-dns-6cf5f86dff-7482l"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.703542 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f0b73197-3c7e-44c3-8a49-35d9e0a40629-config\") pod \"neutron-695bb7cdc6-72zs2\" (UID: \"f0b73197-3c7e-44c3-8a49-35d9e0a40629\") " pod="openstack/neutron-695bb7cdc6-72zs2"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.703585 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drqw4\" (UniqueName: \"kubernetes.io/projected/f0b73197-3c7e-44c3-8a49-35d9e0a40629-kube-api-access-drqw4\") pod \"neutron-695bb7cdc6-72zs2\" (UID: \"f0b73197-3c7e-44c3-8a49-35d9e0a40629\") " pod="openstack/neutron-695bb7cdc6-72zs2"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.703623 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0b73197-3c7e-44c3-8a49-35d9e0a40629-combined-ca-bundle\") pod \"neutron-695bb7cdc6-72zs2\" (UID: \"f0b73197-3c7e-44c3-8a49-35d9e0a40629\") " pod="openstack/neutron-695bb7cdc6-72zs2"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.703666 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhf9w\" (UniqueName: \"kubernetes.io/projected/4f9534ee-827a-49fb-8588-a5a8a494be3c-kube-api-access-vhf9w\") pod \"dnsmasq-dns-6cf5f86dff-7482l\" (UID: \"4f9534ee-827a-49fb-8588-a5a8a494be3c\") " pod="openstack/dnsmasq-dns-6cf5f86dff-7482l"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.703694 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-ovsdbserver-nb\") pod \"dnsmasq-dns-6cf5f86dff-7482l\" (UID: \"4f9534ee-827a-49fb-8588-a5a8a494be3c\") " pod="openstack/dnsmasq-dns-6cf5f86dff-7482l"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.703713 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f0b73197-3c7e-44c3-8a49-35d9e0a40629-httpd-config\") pod \"neutron-695bb7cdc6-72zs2\" (UID: \"f0b73197-3c7e-44c3-8a49-35d9e0a40629\") " pod="openstack/neutron-695bb7cdc6-72zs2"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.703766 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-dns-swift-storage-0\") pod \"dnsmasq-dns-6cf5f86dff-7482l\" (UID: \"4f9534ee-827a-49fb-8588-a5a8a494be3c\") " pod="openstack/dnsmasq-dns-6cf5f86dff-7482l"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.703791 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-dns-svc\") pod \"dnsmasq-dns-6cf5f86dff-7482l\" (UID: \"4f9534ee-827a-49fb-8588-a5a8a494be3c\") " pod="openstack/dnsmasq-dns-6cf5f86dff-7482l"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.704104 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-68ff886dc8-nntj6"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.705301 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-config\") pod \"dnsmasq-dns-6cf5f86dff-7482l\" (UID: \"4f9534ee-827a-49fb-8588-a5a8a494be3c\") " pod="openstack/dnsmasq-dns-6cf5f86dff-7482l"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.706416 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-ovsdbserver-sb\") pod \"dnsmasq-dns-6cf5f86dff-7482l\" (UID: \"4f9534ee-827a-49fb-8588-a5a8a494be3c\") " pod="openstack/dnsmasq-dns-6cf5f86dff-7482l"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.706914 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-ovsdbserver-nb\") pod \"dnsmasq-dns-6cf5f86dff-7482l\" (UID: \"4f9534ee-827a-49fb-8588-a5a8a494be3c\") " pod="openstack/dnsmasq-dns-6cf5f86dff-7482l"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.711404 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-dns-swift-storage-0\") pod \"dnsmasq-dns-6cf5f86dff-7482l\" (UID: \"4f9534ee-827a-49fb-8588-a5a8a494be3c\") " pod="openstack/dnsmasq-dns-6cf5f86dff-7482l"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.712505 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-dns-svc\") pod \"dnsmasq-dns-6cf5f86dff-7482l\" (UID: \"4f9534ee-827a-49fb-8588-a5a8a494be3c\") " pod="openstack/dnsmasq-dns-6cf5f86dff-7482l"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.731136 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0b73197-3c7e-44c3-8a49-35d9e0a40629-combined-ca-bundle\") pod \"neutron-695bb7cdc6-72zs2\" (UID: \"f0b73197-3c7e-44c3-8a49-35d9e0a40629\") " pod="openstack/neutron-695bb7cdc6-72zs2"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.731677 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0b73197-3c7e-44c3-8a49-35d9e0a40629-ovndb-tls-certs\") pod \"neutron-695bb7cdc6-72zs2\" (UID: \"f0b73197-3c7e-44c3-8a49-35d9e0a40629\") " pod="openstack/neutron-695bb7cdc6-72zs2"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.731741 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f0b73197-3c7e-44c3-8a49-35d9e0a40629-httpd-config\") pod \"neutron-695bb7cdc6-72zs2\" (UID: \"f0b73197-3c7e-44c3-8a49-35d9e0a40629\") " pod="openstack/neutron-695bb7cdc6-72zs2"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.752142 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f0b73197-3c7e-44c3-8a49-35d9e0a40629-config\") pod \"neutron-695bb7cdc6-72zs2\" (UID: \"f0b73197-3c7e-44c3-8a49-35d9e0a40629\") " pod="openstack/neutron-695bb7cdc6-72zs2"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.761674 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhf9w\" (UniqueName: \"kubernetes.io/projected/4f9534ee-827a-49fb-8588-a5a8a494be3c-kube-api-access-vhf9w\") pod \"dnsmasq-dns-6cf5f86dff-7482l\" (UID: \"4f9534ee-827a-49fb-8588-a5a8a494be3c\") " pod="openstack/dnsmasq-dns-6cf5f86dff-7482l"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.761874 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drqw4\" (UniqueName: \"kubernetes.io/projected/f0b73197-3c7e-44c3-8a49-35d9e0a40629-kube-api-access-drqw4\") pod \"neutron-695bb7cdc6-72zs2\" (UID: \"f0b73197-3c7e-44c3-8a49-35d9e0a40629\") " pod="openstack/neutron-695bb7cdc6-72zs2"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.774179 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6cc67d5fc8-hs8lf"]
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.775835 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6cc67d5fc8-hs8lf"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.816634 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6cc67d5fc8-hs8lf"]
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.837239 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-695bb7cdc6-72zs2"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.852692 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cf5f86dff-7482l"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.916460 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-ovndb-tls-certs\") pod \"neutron-6cc67d5fc8-hs8lf\" (UID: \"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0\") " pod="openstack/neutron-6cc67d5fc8-hs8lf"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.916517 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-httpd-config\") pod \"neutron-6cc67d5fc8-hs8lf\" (UID: \"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0\") " pod="openstack/neutron-6cc67d5fc8-hs8lf"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.916618 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-config\") pod \"neutron-6cc67d5fc8-hs8lf\" (UID: \"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0\") " pod="openstack/neutron-6cc67d5fc8-hs8lf"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.918183 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7555d68ddd-xqj8c"]
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.919798 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7555d68ddd-xqj8c"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.928464 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk7nj\" (UniqueName: \"kubernetes.io/projected/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-kube-api-access-lk7nj\") pod \"neutron-6cc67d5fc8-hs8lf\" (UID: \"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0\") " pod="openstack/neutron-6cc67d5fc8-hs8lf"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.928824 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-combined-ca-bundle\") pod \"neutron-6cc67d5fc8-hs8lf\" (UID: \"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0\") " pod="openstack/neutron-6cc67d5fc8-hs8lf"
Feb 19 15:29:33 crc kubenswrapper[4810]: I0219 15:29:33.930367 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7555d68ddd-xqj8c"]
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.031298 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/565eac29-daec-4b40-bcb7-751696560c3a-logs\") pod \"placement-7555d68ddd-xqj8c\" (UID: \"565eac29-daec-4b40-bcb7-751696560c3a\") " pod="openstack/placement-7555d68ddd-xqj8c"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.031625 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-combined-ca-bundle\") pod \"neutron-6cc67d5fc8-hs8lf\" (UID: \"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0\") " pod="openstack/neutron-6cc67d5fc8-hs8lf"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.031664 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/565eac29-daec-4b40-bcb7-751696560c3a-scripts\") pod \"placement-7555d68ddd-xqj8c\" (UID: \"565eac29-daec-4b40-bcb7-751696560c3a\") " pod="openstack/placement-7555d68ddd-xqj8c"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.031685 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/565eac29-daec-4b40-bcb7-751696560c3a-public-tls-certs\") pod \"placement-7555d68ddd-xqj8c\" (UID: \"565eac29-daec-4b40-bcb7-751696560c3a\") " pod="openstack/placement-7555d68ddd-xqj8c"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.031725 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-ovndb-tls-certs\") pod \"neutron-6cc67d5fc8-hs8lf\" (UID: \"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0\") " pod="openstack/neutron-6cc67d5fc8-hs8lf"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.031742 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-httpd-config\") pod \"neutron-6cc67d5fc8-hs8lf\" (UID: \"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0\") " pod="openstack/neutron-6cc67d5fc8-hs8lf"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.031776 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/565eac29-daec-4b40-bcb7-751696560c3a-combined-ca-bundle\") pod \"placement-7555d68ddd-xqj8c\" (UID: \"565eac29-daec-4b40-bcb7-751696560c3a\") " pod="openstack/placement-7555d68ddd-xqj8c"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.031810 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25pj2\" (UniqueName: \"kubernetes.io/projected/565eac29-daec-4b40-bcb7-751696560c3a-kube-api-access-25pj2\") pod \"placement-7555d68ddd-xqj8c\" (UID: \"565eac29-daec-4b40-bcb7-751696560c3a\") " pod="openstack/placement-7555d68ddd-xqj8c"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.031862 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-config\") pod \"neutron-6cc67d5fc8-hs8lf\" (UID: \"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0\") " pod="openstack/neutron-6cc67d5fc8-hs8lf"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.031893 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/565eac29-daec-4b40-bcb7-751696560c3a-config-data\") pod \"placement-7555d68ddd-xqj8c\" (UID: \"565eac29-daec-4b40-bcb7-751696560c3a\") " pod="openstack/placement-7555d68ddd-xqj8c"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.031914 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk7nj\" (UniqueName: \"kubernetes.io/projected/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-kube-api-access-lk7nj\") pod \"neutron-6cc67d5fc8-hs8lf\" (UID: \"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0\") " pod="openstack/neutron-6cc67d5fc8-hs8lf"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.031944 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/565eac29-daec-4b40-bcb7-751696560c3a-internal-tls-certs\") pod \"placement-7555d68ddd-xqj8c\" (UID: \"565eac29-daec-4b40-bcb7-751696560c3a\") " pod="openstack/placement-7555d68ddd-xqj8c"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.036706 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-combined-ca-bundle\") pod \"neutron-6cc67d5fc8-hs8lf\" (UID: \"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0\") " pod="openstack/neutron-6cc67d5fc8-hs8lf"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.049781 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-httpd-config\") pod \"neutron-6cc67d5fc8-hs8lf\" (UID: \"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0\") " pod="openstack/neutron-6cc67d5fc8-hs8lf"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.050045 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-ovndb-tls-certs\") pod \"neutron-6cc67d5fc8-hs8lf\" (UID: \"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0\") " pod="openstack/neutron-6cc67d5fc8-hs8lf"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.056085 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-config\") pod \"neutron-6cc67d5fc8-hs8lf\" (UID: \"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0\") " pod="openstack/neutron-6cc67d5fc8-hs8lf"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.069023 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk7nj\" (UniqueName: \"kubernetes.io/projected/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-kube-api-access-lk7nj\") pod \"neutron-6cc67d5fc8-hs8lf\" (UID: \"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0\") " pod="openstack/neutron-6cc67d5fc8-hs8lf"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.093704 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.093750 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:34 crc kubenswrapper[4810]: E0219 15:29:34.116174 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8c99f71d93423dbb260d03502181a976def39a613d97465a06298513d67bb0bf" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.135532 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/565eac29-daec-4b40-bcb7-751696560c3a-config-data\") pod \"placement-7555d68ddd-xqj8c\" (UID: \"565eac29-daec-4b40-bcb7-751696560c3a\") " pod="openstack/placement-7555d68ddd-xqj8c"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.135602 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/565eac29-daec-4b40-bcb7-751696560c3a-internal-tls-certs\") pod \"placement-7555d68ddd-xqj8c\" (UID: \"565eac29-daec-4b40-bcb7-751696560c3a\") " pod="openstack/placement-7555d68ddd-xqj8c"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.135634 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/565eac29-daec-4b40-bcb7-751696560c3a-logs\") pod \"placement-7555d68ddd-xqj8c\" (UID: \"565eac29-daec-4b40-bcb7-751696560c3a\") " pod="openstack/placement-7555d68ddd-xqj8c"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.135673 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/565eac29-daec-4b40-bcb7-751696560c3a-scripts\") pod \"placement-7555d68ddd-xqj8c\" (UID: \"565eac29-daec-4b40-bcb7-751696560c3a\") " pod="openstack/placement-7555d68ddd-xqj8c"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.135695 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/565eac29-daec-4b40-bcb7-751696560c3a-public-tls-certs\") pod \"placement-7555d68ddd-xqj8c\" (UID: \"565eac29-daec-4b40-bcb7-751696560c3a\") " pod="openstack/placement-7555d68ddd-xqj8c"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.135745 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/565eac29-daec-4b40-bcb7-751696560c3a-combined-ca-bundle\") pod \"placement-7555d68ddd-xqj8c\" (UID: \"565eac29-daec-4b40-bcb7-751696560c3a\") " pod="openstack/placement-7555d68ddd-xqj8c"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.135776 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25pj2\" (UniqueName: \"kubernetes.io/projected/565eac29-daec-4b40-bcb7-751696560c3a-kube-api-access-25pj2\") pod \"placement-7555d68ddd-xqj8c\" (UID: \"565eac29-daec-4b40-bcb7-751696560c3a\") " pod="openstack/placement-7555d68ddd-xqj8c"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.143801 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/565eac29-daec-4b40-bcb7-751696560c3a-combined-ca-bundle\") pod \"placement-7555d68ddd-xqj8c\" (UID: \"565eac29-daec-4b40-bcb7-751696560c3a\") " pod="openstack/placement-7555d68ddd-xqj8c"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.144068 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/565eac29-daec-4b40-bcb7-751696560c3a-logs\") pod \"placement-7555d68ddd-xqj8c\" (UID: \"565eac29-daec-4b40-bcb7-751696560c3a\") " pod="openstack/placement-7555d68ddd-xqj8c"
Feb 19 15:29:34 crc kubenswrapper[4810]: E0219 15:29:34.160489 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8c99f71d93423dbb260d03502181a976def39a613d97465a06298513d67bb0bf" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.160885 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/565eac29-daec-4b40-bcb7-751696560c3a-scripts\") pod \"placement-7555d68ddd-xqj8c\" (UID: \"565eac29-daec-4b40-bcb7-751696560c3a\") " pod="openstack/placement-7555d68ddd-xqj8c"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.161786 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/565eac29-daec-4b40-bcb7-751696560c3a-config-data\") pod \"placement-7555d68ddd-xqj8c\" (UID: \"565eac29-daec-4b40-bcb7-751696560c3a\") " pod="openstack/placement-7555d68ddd-xqj8c"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.165799 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/565eac29-daec-4b40-bcb7-751696560c3a-internal-tls-certs\") pod \"placement-7555d68ddd-xqj8c\" (UID: \"565eac29-daec-4b40-bcb7-751696560c3a\") " pod="openstack/placement-7555d68ddd-xqj8c"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.193787 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/565eac29-daec-4b40-bcb7-751696560c3a-public-tls-certs\") pod \"placement-7555d68ddd-xqj8c\" (UID: \"565eac29-daec-4b40-bcb7-751696560c3a\") " pod="openstack/placement-7555d68ddd-xqj8c"
Feb 19 15:29:34 crc kubenswrapper[4810]: E0219 15:29:34.200095 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8c99f71d93423dbb260d03502181a976def39a613d97465a06298513d67bb0bf" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Feb 19 15:29:34 crc kubenswrapper[4810]: E0219 15:29:34.200161 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="4be760b2-263c-4b89-8bdf-ecf98114a24f" containerName="watcher-applier"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.200908 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25pj2\" (UniqueName: \"kubernetes.io/projected/565eac29-daec-4b40-bcb7-751696560c3a-kube-api-access-25pj2\") pod \"placement-7555d68ddd-xqj8c\" (UID: \"565eac29-daec-4b40-bcb7-751696560c3a\") " pod="openstack/placement-7555d68ddd-xqj8c"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.219477 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6cc67d5fc8-hs8lf"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.227071 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.242721 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7555d68ddd-xqj8c"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.345662 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.352735 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6cd8bf58f4-ktsjk"]
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.497833 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"]
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.498224 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="cd961c7d-d551-4f5b-a08a-07d088947698" containerName="watcher-api-log" containerID="cri-o://4368969b33587354836e3cdb6c31f7bd645f510dc4e3c262614ef8da31c0eb92" gracePeriod=30
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.498676 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="cd961c7d-d551-4f5b-a08a-07d088947698" containerName="watcher-api" containerID="cri-o://aa9a5e6b8de561c312023bc0224fd25e600c4ac446d6ec5ee19e031c464523e1" gracePeriod=30
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.748659 4810 generic.go:334] "Generic (PLEG): container finished" podID="cd961c7d-d551-4f5b-a08a-07d088947698" containerID="4368969b33587354836e3cdb6c31f7bd645f510dc4e3c262614ef8da31c0eb92" exitCode=143
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.748743 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"cd961c7d-d551-4f5b-a08a-07d088947698","Type":"ContainerDied","Data":"4368969b33587354836e3cdb6c31f7bd645f510dc4e3c262614ef8da31c0eb92"}
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.758177 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6cd8bf58f4-ktsjk" event={"ID":"95165d88-ea72-4785-8c1a-eea4d54466fb","Type":"ContainerStarted","Data":"aa41c53b974c1a8afd57bdec5addd138e6051cecaa12caee9704509f5c61518d"}
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.803377 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hmc6k" event={"ID":"2024a783-c3f9-4e57-b00f-52bec164e64e","Type":"ContainerStarted","Data":"ad28bfaa41efd8e4e6c465f81c081a0451a00386412156a5267acdd97840a40b"}
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.803448 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.803462 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.804128 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-68ff886dc8-nntj6"]
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.839900 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-hmc6k" podStartSLOduration=4.306284117 podStartE2EDuration="46.839878099s" podCreationTimestamp="2026-02-19 15:28:48 +0000 UTC" firstStartedPulling="2026-02-19 15:28:50.998900665 +0000 UTC m=+1160.480930789" lastFinishedPulling="2026-02-19 15:29:33.532494647 +0000 UTC m=+1203.014524771" observedRunningTime="2026-02-19 15:29:34.829910204 +0000 UTC m=+1204.311940328" watchObservedRunningTime="2026-02-19 15:29:34.839878099 +0000 UTC m=+1204.321908223"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.850385 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.850433 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.965943 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.971467 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 19 15:29:34 crc kubenswrapper[4810]: I0219 15:29:34.979687 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cf5f86dff-7482l"]
Feb 19 15:29:35 crc kubenswrapper[4810]: I0219 15:29:35.133857 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-695bb7cdc6-72zs2"]
Feb 19 15:29:35 crc kubenswrapper[4810]: I0219 15:29:35.281897 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7555d68ddd-xqj8c"]
Feb 19 15:29:35 crc kubenswrapper[4810]: I0219 15:29:35.377289 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6cc67d5fc8-hs8lf"]
Feb 19 15:29:35 crc kubenswrapper[4810]: I0219 15:29:35.846211 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-695bb7cdc6-72zs2" event={"ID":"f0b73197-3c7e-44c3-8a49-35d9e0a40629","Type":"ContainerStarted","Data":"fc28d9c929a268b7ca57d72d22325d489071902ea7ff07035df3c9d98d6a728a"}
Feb 19 15:29:35 crc kubenswrapper[4810]: I0219 15:29:35.846675 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-695bb7cdc6-72zs2" event={"ID":"f0b73197-3c7e-44c3-8a49-35d9e0a40629","Type":"ContainerStarted","Data":"b18b549a747d3077270169b17318853029444572e9290aac2a4fb288910d92c3"}
Feb 19 15:29:35 crc kubenswrapper[4810]: I0219 15:29:35.849360 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6cd8bf58f4-ktsjk" event={"ID":"95165d88-ea72-4785-8c1a-eea4d54466fb","Type":"ContainerStarted","Data":"969ebc400f1294f5e658b8b96332740cbd1b67f0be09a41e3a45f2e93ee26edb"}
Feb 19 15:29:35 crc kubenswrapper[4810]: I0219 15:29:35.850570 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6cd8bf58f4-ktsjk"
Feb 19 15:29:35 crc kubenswrapper[4810]: I0219 15:29:35.853899 4810 generic.go:334] "Generic (PLEG): container finished" podID="4f9534ee-827a-49fb-8588-a5a8a494be3c" containerID="77f0991b435da4c5e7f87592511070821c3d2ea7637aa55b69e68f2a981e4095" exitCode=0
Feb 19 15:29:35 crc kubenswrapper[4810]: I0219 15:29:35.853957 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cf5f86dff-7482l" event={"ID":"4f9534ee-827a-49fb-8588-a5a8a494be3c","Type":"ContainerDied","Data":"77f0991b435da4c5e7f87592511070821c3d2ea7637aa55b69e68f2a981e4095"}
Feb 19 15:29:35 crc kubenswrapper[4810]: I0219 15:29:35.853982 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cf5f86dff-7482l" event={"ID":"4f9534ee-827a-49fb-8588-a5a8a494be3c","Type":"ContainerStarted","Data":"32fbc60f0dab31f943ed4d231a260fb910bacf46dc6517badaa0cc870b972e03"}
Feb 19 15:29:35 crc kubenswrapper[4810]: I0219 15:29:35.873774 4810 generic.go:334] "Generic (PLEG): container finished" podID="4be760b2-263c-4b89-8bdf-ecf98114a24f" containerID="8c99f71d93423dbb260d03502181a976def39a613d97465a06298513d67bb0bf" exitCode=0
Feb 19 15:29:35 crc kubenswrapper[4810]: I0219 15:29:35.873833 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"4be760b2-263c-4b89-8bdf-ecf98114a24f","Type":"ContainerDied","Data":"8c99f71d93423dbb260d03502181a976def39a613d97465a06298513d67bb0bf"}
Feb 19 15:29:35 crc kubenswrapper[4810]: I0219 15:29:35.875342 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6cc67d5fc8-hs8lf" event={"ID":"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0","Type":"ContainerStarted","Data":"d1b99914aff75d6854dd18c44e4702b0f89c549bb21d23b94afcb4182e4386df"}
Feb 19 15:29:35 crc kubenswrapper[4810]: I0219 15:29:35.888040 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6cd8bf58f4-ktsjk" podStartSLOduration=2.888021922 podStartE2EDuration="2.888021922s" podCreationTimestamp="2026-02-19 15:29:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:29:35.883894651 +0000 UTC m=+1205.365924775" watchObservedRunningTime="2026-02-19 15:29:35.888021922 +0000 UTC m=+1205.370052066"
Feb 19 15:29:35 crc kubenswrapper[4810]: I0219 15:29:35.894742 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7555d68ddd-xqj8c" event={"ID":"565eac29-daec-4b40-bcb7-751696560c3a","Type":"ContainerStarted","Data":"c57b5ba5a394f063373e611f4d92175660e2a4ae7e14c603f3b3bd232203b455"}
Feb 19 15:29:35 crc kubenswrapper[4810]: I0219 15:29:35.937218 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68ff886dc8-nntj6" event={"ID":"e0116ca5-826e-4a77-bc6f-11e89c047af8","Type":"ContainerStarted","Data":"0f0fdb4ec7d48ce5efa4d4ce421e22663d31ffdf324056a1695cd08d0b02000f"}
Feb 19 15:29:35 crc kubenswrapper[4810]: I0219 15:29:35.937273 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68ff886dc8-nntj6" event={"ID":"e0116ca5-826e-4a77-bc6f-11e89c047af8","Type":"ContainerStarted","Data":"4f57c96680174420000630479bd05d2a43d222e27a564e7d19f21df9961dd34c"}
Feb 19 15:29:35 crc kubenswrapper[4810]: I0219 15:29:35.938166 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 19 15:29:35 crc kubenswrapper[4810]: I0219 15:29:35.938200 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 19 15:29:36 crc kubenswrapper[4810]: I0219 15:29:36.364814 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Feb 19 15:29:36 crc kubenswrapper[4810]: I0219 15:29:36.402072 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4be760b2-263c-4b89-8bdf-ecf98114a24f-combined-ca-bundle\") pod \"4be760b2-263c-4b89-8bdf-ecf98114a24f\" (UID: \"4be760b2-263c-4b89-8bdf-ecf98114a24f\") "
Feb 19 15:29:36 crc kubenswrapper[4810]: I0219 15:29:36.402201 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4be760b2-263c-4b89-8bdf-ecf98114a24f-config-data\") pod \"4be760b2-263c-4b89-8bdf-ecf98114a24f\" (UID: \"4be760b2-263c-4b89-8bdf-ecf98114a24f\") "
Feb 19 15:29:36 crc kubenswrapper[4810]: I0219 15:29:36.402226 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4be760b2-263c-4b89-8bdf-ecf98114a24f-logs\") pod \"4be760b2-263c-4b89-8bdf-ecf98114a24f\" (UID: \"4be760b2-263c-4b89-8bdf-ecf98114a24f\") "
Feb 19 15:29:36 crc kubenswrapper[4810]: I0219 15:29:36.402350 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjt9b\" (UniqueName: \"kubernetes.io/projected/4be760b2-263c-4b89-8bdf-ecf98114a24f-kube-api-access-kjt9b\") pod \"4be760b2-263c-4b89-8bdf-ecf98114a24f\" (UID: \"4be760b2-263c-4b89-8bdf-ecf98114a24f\") "
Feb 19 15:29:36 crc kubenswrapper[4810]: I0219 15:29:36.407644 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4be760b2-263c-4b89-8bdf-ecf98114a24f-logs" (OuterVolumeSpecName: "logs") pod "4be760b2-263c-4b89-8bdf-ecf98114a24f" (UID: "4be760b2-263c-4b89-8bdf-ecf98114a24f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 15:29:36 crc kubenswrapper[4810]: I0219 15:29:36.408940 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4be760b2-263c-4b89-8bdf-ecf98114a24f-kube-api-access-kjt9b" (OuterVolumeSpecName: "kube-api-access-kjt9b") pod "4be760b2-263c-4b89-8bdf-ecf98114a24f" (UID: "4be760b2-263c-4b89-8bdf-ecf98114a24f"). InnerVolumeSpecName "kube-api-access-kjt9b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:29:36 crc kubenswrapper[4810]: I0219 15:29:36.490451 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4be760b2-263c-4b89-8bdf-ecf98114a24f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4be760b2-263c-4b89-8bdf-ecf98114a24f" (UID: "4be760b2-263c-4b89-8bdf-ecf98114a24f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:29:36 crc kubenswrapper[4810]: I0219 15:29:36.507223 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4be760b2-263c-4b89-8bdf-ecf98114a24f-logs\") on node \"crc\" DevicePath \"\""
Feb 19 15:29:36 crc kubenswrapper[4810]: I0219 15:29:36.507266 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjt9b\" (UniqueName: \"kubernetes.io/projected/4be760b2-263c-4b89-8bdf-ecf98114a24f-kube-api-access-kjt9b\") on node \"crc\" DevicePath \"\""
Feb 19 15:29:36 crc kubenswrapper[4810]: I0219 15:29:36.507281 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4be760b2-263c-4b89-8bdf-ecf98114a24f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 15:29:36 crc kubenswrapper[4810]: I0219 15:29:36.537478 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4be760b2-263c-4b89-8bdf-ecf98114a24f-config-data" (OuterVolumeSpecName: "config-data") pod "4be760b2-263c-4b89-8bdf-ecf98114a24f" (UID: "4be760b2-263c-4b89-8bdf-ecf98114a24f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:29:36 crc kubenswrapper[4810]: I0219 15:29:36.611231 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4be760b2-263c-4b89-8bdf-ecf98114a24f-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 15:29:36 crc kubenswrapper[4810]: I0219 15:29:36.965531 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68ff886dc8-nntj6" event={"ID":"e0116ca5-826e-4a77-bc6f-11e89c047af8","Type":"ContainerStarted","Data":"4ac6a4c63292ab530356cb491f1c19b551ba50fa93c7f50d74148f2ff94c41a3"}
Feb 19 15:29:36 crc kubenswrapper[4810]: I0219 15:29:36.965829 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-68ff886dc8-nntj6"
Feb 19 15:29:36 crc kubenswrapper[4810]: I0219 15:29:36.965848 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-68ff886dc8-nntj6"
Feb 19 15:29:36 crc kubenswrapper[4810]: I0219 15:29:36.991711 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-695bb7cdc6-72zs2" event={"ID":"f0b73197-3c7e-44c3-8a49-35d9e0a40629","Type":"ContainerStarted","Data":"dc4301bc9761619133902901d1997ab38671d0d229f4feca664d5a1fa8e30e7a"}
Feb 19 15:29:36 crc kubenswrapper[4810]: I0219 15:29:36.991799 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-695bb7cdc6-72zs2"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.016285 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-68ff886dc8-nntj6" podStartSLOduration=4.016265811 podStartE2EDuration="4.016265811s" podCreationTimestamp="2026-02-19 15:29:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:29:36.998006753 +0000 UTC m=+1206.480036877" watchObservedRunningTime="2026-02-19 15:29:37.016265811 +0000 UTC m=+1206.498295935"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.052587 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-695bb7cdc6-72zs2" podStartSLOduration=4.052563921 podStartE2EDuration="4.052563921s" podCreationTimestamp="2026-02-19 15:29:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:29:37.035731558 +0000 UTC m=+1206.517761672" watchObservedRunningTime="2026-02-19 15:29:37.052563921 +0000 UTC m=+1206.534594045"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.053624 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cf5f86dff-7482l" event={"ID":"4f9534ee-827a-49fb-8588-a5a8a494be3c","Type":"ContainerStarted","Data":"f7ecab250fb259bafddfe5958b677b33308b00c3f83a99f4416f709cb8747963"}
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.053697 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cf5f86dff-7482l"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.083245 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"4be760b2-263c-4b89-8bdf-ecf98114a24f","Type":"ContainerDied","Data":"69118f89362c4dfde4f05ac6e0fde30afdc3fb74ce3c8ffeaee8a6df7e8789a5"}
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.083304 4810 scope.go:117] "RemoveContainer" containerID="8c99f71d93423dbb260d03502181a976def39a613d97465a06298513d67bb0bf"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.083517 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.091860 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cf5f86dff-7482l" podStartSLOduration=4.091837705 podStartE2EDuration="4.091837705s" podCreationTimestamp="2026-02-19 15:29:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:29:37.081720087 +0000 UTC m=+1206.563750211" watchObservedRunningTime="2026-02-19 15:29:37.091837705 +0000 UTC m=+1206.573867829"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.173656 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6cc67d5fc8-hs8lf" event={"ID":"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0","Type":"ContainerStarted","Data":"4e847a42f9e6fb1b9f7435b4df35d14a3fd82483b8e4fe2e6eb4d640b1450737"}
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.191768 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"]
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.206803 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"]
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.223470 4810 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.223509 4810 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.224413 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7555d68ddd-xqj8c" event={"ID":"565eac29-daec-4b40-bcb7-751696560c3a","Type":"ContainerStarted","Data":"7bd22beb5a6519315081ec047ff7382f0fc2a07b39d0290f03b43dbfb6d54778"}
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.224483 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7555d68ddd-xqj8c" event={"ID":"565eac29-daec-4b40-bcb7-751696560c3a","Type":"ContainerStarted","Data":"84a7ee0e1e0aec17ff38a3db779574c02a5552a158b322fbf4958626bf04c841"}
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.233195 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"]
Feb 19 15:29:37 crc kubenswrapper[4810]: E0219 15:29:37.233654 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4be760b2-263c-4b89-8bdf-ecf98114a24f" containerName="watcher-applier"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.233674 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="4be760b2-263c-4b89-8bdf-ecf98114a24f" containerName="watcher-applier"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.233880 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="4be760b2-263c-4b89-8bdf-ecf98114a24f" containerName="watcher-applier"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.234645 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.240556 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.290498 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"]
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.310257 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7555d68ddd-xqj8c" podStartSLOduration=4.310239463 podStartE2EDuration="4.310239463s" podCreationTimestamp="2026-02-19 15:29:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:29:37.24818163 +0000 UTC m=+1206.730211754" watchObservedRunningTime="2026-02-19 15:29:37.310239463 +0000 UTC m=+1206.792269587"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.314402 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-695bb7cdc6-72zs2"]
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.370893 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6dfcf65577-bd5w2"]
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.372634 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6dfcf65577-bd5w2"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.379638 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.379808 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.380704 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6dfcf65577-bd5w2"]
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.452708 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4be760b2-263c-4b89-8bdf-ecf98114a24f" path="/var/lib/kubelet/pods/4be760b2-263c-4b89-8bdf-ecf98114a24f/volumes"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.469851 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ea855ba-523c-4143-8fe8-b0b1150299d0-config-data\") pod \"watcher-applier-0\" (UID: \"2ea855ba-523c-4143-8fe8-b0b1150299d0\") " pod="openstack/watcher-applier-0"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.469896 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f72k9\" (UniqueName: \"kubernetes.io/projected/2ea855ba-523c-4143-8fe8-b0b1150299d0-kube-api-access-f72k9\") pod \"watcher-applier-0\" (UID: \"2ea855ba-523c-4143-8fe8-b0b1150299d0\") " pod="openstack/watcher-applier-0"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.469993 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ea855ba-523c-4143-8fe8-b0b1150299d0-logs\") pod \"watcher-applier-0\" (UID: \"2ea855ba-523c-4143-8fe8-b0b1150299d0\") " pod="openstack/watcher-applier-0"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.470013 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea855ba-523c-4143-8fe8-b0b1150299d0-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"2ea855ba-523c-4143-8fe8-b0b1150299d0\") " pod="openstack/watcher-applier-0"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.570994 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6528bdfd-3389-4776-826e-164fc5117682-config\") pod \"neutron-6dfcf65577-bd5w2\" (UID: \"6528bdfd-3389-4776-826e-164fc5117682\") " pod="openstack/neutron-6dfcf65577-bd5w2"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.571034 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t97g7\" (UniqueName: \"kubernetes.io/projected/6528bdfd-3389-4776-826e-164fc5117682-kube-api-access-t97g7\") pod \"neutron-6dfcf65577-bd5w2\" (UID: \"6528bdfd-3389-4776-826e-164fc5117682\") " pod="openstack/neutron-6dfcf65577-bd5w2"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.571056 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6528bdfd-3389-4776-826e-164fc5117682-httpd-config\") pod \"neutron-6dfcf65577-bd5w2\" (UID: \"6528bdfd-3389-4776-826e-164fc5117682\") " pod="openstack/neutron-6dfcf65577-bd5w2"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.571081 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6528bdfd-3389-4776-826e-164fc5117682-ovndb-tls-certs\") pod \"neutron-6dfcf65577-bd5w2\" (UID: \"6528bdfd-3389-4776-826e-164fc5117682\") " pod="openstack/neutron-6dfcf65577-bd5w2"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.571108 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ea855ba-523c-4143-8fe8-b0b1150299d0-config-data\") pod \"watcher-applier-0\" (UID: \"2ea855ba-523c-4143-8fe8-b0b1150299d0\") " pod="openstack/watcher-applier-0"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.571124 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f72k9\" (UniqueName: \"kubernetes.io/projected/2ea855ba-523c-4143-8fe8-b0b1150299d0-kube-api-access-f72k9\") pod \"watcher-applier-0\" (UID: \"2ea855ba-523c-4143-8fe8-b0b1150299d0\") " pod="openstack/watcher-applier-0"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.571147 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6528bdfd-3389-4776-826e-164fc5117682-combined-ca-bundle\") pod \"neutron-6dfcf65577-bd5w2\" (UID: \"6528bdfd-3389-4776-826e-164fc5117682\") " pod="openstack/neutron-6dfcf65577-bd5w2"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.571174 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6528bdfd-3389-4776-826e-164fc5117682-public-tls-certs\") pod \"neutron-6dfcf65577-bd5w2\" (UID: \"6528bdfd-3389-4776-826e-164fc5117682\") " pod="openstack/neutron-6dfcf65577-bd5w2"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.571234 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ea855ba-523c-4143-8fe8-b0b1150299d0-logs\") pod \"watcher-applier-0\" (UID: \"2ea855ba-523c-4143-8fe8-b0b1150299d0\") " pod="openstack/watcher-applier-0"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.571250 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea855ba-523c-4143-8fe8-b0b1150299d0-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"2ea855ba-523c-4143-8fe8-b0b1150299d0\") " pod="openstack/watcher-applier-0"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.571313 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6528bdfd-3389-4776-826e-164fc5117682-internal-tls-certs\") pod \"neutron-6dfcf65577-bd5w2\" (UID: \"6528bdfd-3389-4776-826e-164fc5117682\") " pod="openstack/neutron-6dfcf65577-bd5w2"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.575513 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ea855ba-523c-4143-8fe8-b0b1150299d0-logs\") pod \"watcher-applier-0\" (UID: \"2ea855ba-523c-4143-8fe8-b0b1150299d0\") " pod="openstack/watcher-applier-0"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.576233 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ea855ba-523c-4143-8fe8-b0b1150299d0-config-data\") pod \"watcher-applier-0\" (UID: \"2ea855ba-523c-4143-8fe8-b0b1150299d0\") " pod="openstack/watcher-applier-0"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.576765 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea855ba-523c-4143-8fe8-b0b1150299d0-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"2ea855ba-523c-4143-8fe8-b0b1150299d0\") " pod="openstack/watcher-applier-0"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.595022 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f72k9\" (UniqueName: \"kubernetes.io/projected/2ea855ba-523c-4143-8fe8-b0b1150299d0-kube-api-access-f72k9\") pod \"watcher-applier-0\" (UID: \"2ea855ba-523c-4143-8fe8-b0b1150299d0\") " pod="openstack/watcher-applier-0"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.653825 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="cd961c7d-d551-4f5b-a08a-07d088947698" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.168:9322/\": read tcp 10.217.0.2:33186->10.217.0.168:9322: read: connection reset by peer"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.653902 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="cd961c7d-d551-4f5b-a08a-07d088947698" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.168:9322/\": read tcp 10.217.0.2:33190->10.217.0.168:9322: read: connection reset by peer"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.672812 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6528bdfd-3389-4776-826e-164fc5117682-httpd-config\") pod \"neutron-6dfcf65577-bd5w2\" (UID: \"6528bdfd-3389-4776-826e-164fc5117682\") " pod="openstack/neutron-6dfcf65577-bd5w2"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.672859 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6528bdfd-3389-4776-826e-164fc5117682-ovndb-tls-certs\") pod \"neutron-6dfcf65577-bd5w2\" (UID: \"6528bdfd-3389-4776-826e-164fc5117682\") " pod="openstack/neutron-6dfcf65577-bd5w2"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.672895 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6528bdfd-3389-4776-826e-164fc5117682-combined-ca-bundle\") pod \"neutron-6dfcf65577-bd5w2\" (UID: \"6528bdfd-3389-4776-826e-164fc5117682\") " pod="openstack/neutron-6dfcf65577-bd5w2"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.672922 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6528bdfd-3389-4776-826e-164fc5117682-public-tls-certs\") pod \"neutron-6dfcf65577-bd5w2\" (UID: \"6528bdfd-3389-4776-826e-164fc5117682\") " pod="openstack/neutron-6dfcf65577-bd5w2"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.673023 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6528bdfd-3389-4776-826e-164fc5117682-internal-tls-certs\") pod \"neutron-6dfcf65577-bd5w2\" (UID: \"6528bdfd-3389-4776-826e-164fc5117682\") " pod="openstack/neutron-6dfcf65577-bd5w2"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.673044 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6528bdfd-3389-4776-826e-164fc5117682-config\") pod \"neutron-6dfcf65577-bd5w2\" (UID: \"6528bdfd-3389-4776-826e-164fc5117682\") " pod="openstack/neutron-6dfcf65577-bd5w2"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.673059 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t97g7\" (UniqueName: \"kubernetes.io/projected/6528bdfd-3389-4776-826e-164fc5117682-kube-api-access-t97g7\") pod \"neutron-6dfcf65577-bd5w2\" (UID: \"6528bdfd-3389-4776-826e-164fc5117682\") " pod="openstack/neutron-6dfcf65577-bd5w2"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.681794 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6528bdfd-3389-4776-826e-164fc5117682-internal-tls-certs\") pod \"neutron-6dfcf65577-bd5w2\" (UID: \"6528bdfd-3389-4776-826e-164fc5117682\") " pod="openstack/neutron-6dfcf65577-bd5w2"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.682992 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6528bdfd-3389-4776-826e-164fc5117682-combined-ca-bundle\") pod \"neutron-6dfcf65577-bd5w2\" (UID: \"6528bdfd-3389-4776-826e-164fc5117682\") " pod="openstack/neutron-6dfcf65577-bd5w2"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.686369 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6528bdfd-3389-4776-826e-164fc5117682-config\") pod \"neutron-6dfcf65577-bd5w2\" (UID: \"6528bdfd-3389-4776-826e-164fc5117682\") " pod="openstack/neutron-6dfcf65577-bd5w2"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.687053 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6528bdfd-3389-4776-826e-164fc5117682-ovndb-tls-certs\") pod \"neutron-6dfcf65577-bd5w2\" (UID: \"6528bdfd-3389-4776-826e-164fc5117682\") " pod="openstack/neutron-6dfcf65577-bd5w2"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.688041 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6528bdfd-3389-4776-826e-164fc5117682-public-tls-certs\") pod \"neutron-6dfcf65577-bd5w2\" (UID: \"6528bdfd-3389-4776-826e-164fc5117682\") " pod="openstack/neutron-6dfcf65577-bd5w2"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.688417 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6528bdfd-3389-4776-826e-164fc5117682-httpd-config\") pod \"neutron-6dfcf65577-bd5w2\" (UID: \"6528bdfd-3389-4776-826e-164fc5117682\") " pod="openstack/neutron-6dfcf65577-bd5w2"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.692903 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t97g7\" (UniqueName: \"kubernetes.io/projected/6528bdfd-3389-4776-826e-164fc5117682-kube-api-access-t97g7\") pod \"neutron-6dfcf65577-bd5w2\" (UID: \"6528bdfd-3389-4776-826e-164fc5117682\") " pod="openstack/neutron-6dfcf65577-bd5w2"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.702160 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6dfcf65577-bd5w2"
Feb 19 15:29:37 crc kubenswrapper[4810]: I0219 15:29:37.890649 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Feb 19 15:29:38 crc kubenswrapper[4810]: I0219 15:29:38.166149 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-869f57798-ngdtl" podUID="58c845f2-0069-4ee5-9d4b-b5871e078926" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.165:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.165:8443: connect: connection refused"
Feb 19 15:29:38 crc kubenswrapper[4810]: I0219 15:29:38.251873 4810 generic.go:334] "Generic (PLEG): container finished" podID="cd961c7d-d551-4f5b-a08a-07d088947698" containerID="aa9a5e6b8de561c312023bc0224fd25e600c4ac446d6ec5ee19e031c464523e1" exitCode=0
Feb 19 15:29:38 crc kubenswrapper[4810]: I0219 15:29:38.252706 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"cd961c7d-d551-4f5b-a08a-07d088947698","Type":"ContainerDied","Data":"aa9a5e6b8de561c312023bc0224fd25e600c4ac446d6ec5ee19e031c464523e1"}
Feb 19 15:29:38 crc kubenswrapper[4810]: I0219 15:29:38.252839 4810 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 19 15:29:38 crc kubenswrapper[4810]: I0219 15:29:38.252854 4810 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 19 15:29:38 crc kubenswrapper[4810]: I0219 15:29:38.253887 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7555d68ddd-xqj8c"
Feb 19 15:29:38 crc kubenswrapper[4810]: I0219 15:29:38.253910 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7555d68ddd-xqj8c"
Feb 19 15:29:38 crc kubenswrapper[4810]: I0219 15:29:38.274949 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5f56498b8d-9gwmf" podUID="737d6629-747f-4d16-a545-d0070c20fe5d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.166:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.166:8443: connect: connection refused"
Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.050096 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.051837 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6dfcf65577-bd5w2"]
Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.112004 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd961c7d-d551-4f5b-a08a-07d088947698-logs\") pod \"cd961c7d-d551-4f5b-a08a-07d088947698\" (UID: \"cd961c7d-d551-4f5b-a08a-07d088947698\") "
Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.112199 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cd961c7d-d551-4f5b-a08a-07d088947698-custom-prometheus-ca\") pod \"cd961c7d-d551-4f5b-a08a-07d088947698\" (UID: \"cd961c7d-d551-4f5b-a08a-07d088947698\") "
Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.112292 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd961c7d-d551-4f5b-a08a-07d088947698-combined-ca-bundle\") pod \"cd961c7d-d551-4f5b-a08a-07d088947698\" (UID: \"cd961c7d-d551-4f5b-a08a-07d088947698\") "
Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.112370 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd961c7d-d551-4f5b-a08a-07d088947698-config-data\") pod \"cd961c7d-d551-4f5b-a08a-07d088947698\" (UID: \"cd961c7d-d551-4f5b-a08a-07d088947698\") "
Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.112425 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdn2x\" (UniqueName: \"kubernetes.io/projected/cd961c7d-d551-4f5b-a08a-07d088947698-kube-api-access-zdn2x\") pod \"cd961c7d-d551-4f5b-a08a-07d088947698\" (UID: \"cd961c7d-d551-4f5b-a08a-07d088947698\") "
Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.113708 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd961c7d-d551-4f5b-a08a-07d088947698-logs" (OuterVolumeSpecName: "logs") pod "cd961c7d-d551-4f5b-a08a-07d088947698" (UID: "cd961c7d-d551-4f5b-a08a-07d088947698"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.125536 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Feb 19 15:29:39 crc kubenswrapper[4810]: E0219 15:29:39.129701 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea64343d53131969232078700626ee5996ba47b778be09a59da7ab42c9aa2631 is running failed: container process not found" containerID="ea64343d53131969232078700626ee5996ba47b778be09a59da7ab42c9aa2631" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"]
Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.129890 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd961c7d-d551-4f5b-a08a-07d088947698-kube-api-access-zdn2x" (OuterVolumeSpecName: "kube-api-access-zdn2x") pod "cd961c7d-d551-4f5b-a08a-07d088947698" (UID: "cd961c7d-d551-4f5b-a08a-07d088947698"). InnerVolumeSpecName "kube-api-access-zdn2x".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:29:39 crc kubenswrapper[4810]: E0219 15:29:39.133445 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea64343d53131969232078700626ee5996ba47b778be09a59da7ab42c9aa2631 is running failed: container process not found" containerID="ea64343d53131969232078700626ee5996ba47b778be09a59da7ab42c9aa2631" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Feb 19 15:29:39 crc kubenswrapper[4810]: E0219 15:29:39.137686 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea64343d53131969232078700626ee5996ba47b778be09a59da7ab42c9aa2631 is running failed: container process not found" containerID="ea64343d53131969232078700626ee5996ba47b778be09a59da7ab42c9aa2631" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Feb 19 15:29:39 crc kubenswrapper[4810]: E0219 15:29:39.137770 4810 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea64343d53131969232078700626ee5996ba47b778be09a59da7ab42c9aa2631 is running failed: container process not found" probeType="Startup" pod="openstack/watcher-decision-engine-0" podUID="3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" containerName="watcher-decision-engine" Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.157039 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.186852 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd961c7d-d551-4f5b-a08a-07d088947698-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd961c7d-d551-4f5b-a08a-07d088947698" (UID: "cd961c7d-d551-4f5b-a08a-07d088947698"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.186929 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd961c7d-d551-4f5b-a08a-07d088947698-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "cd961c7d-d551-4f5b-a08a-07d088947698" (UID: "cd961c7d-d551-4f5b-a08a-07d088947698"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.206535 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd961c7d-d551-4f5b-a08a-07d088947698-config-data" (OuterVolumeSpecName: "config-data") pod "cd961c7d-d551-4f5b-a08a-07d088947698" (UID: "cd961c7d-d551-4f5b-a08a-07d088947698"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.214917 4810 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cd961c7d-d551-4f5b-a08a-07d088947698-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.214947 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd961c7d-d551-4f5b-a08a-07d088947698-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.214959 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd961c7d-d551-4f5b-a08a-07d088947698-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.214972 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdn2x\" (UniqueName: \"kubernetes.io/projected/cd961c7d-d551-4f5b-a08a-07d088947698-kube-api-access-zdn2x\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.214983 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd961c7d-d551-4f5b-a08a-07d088947698-logs\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.296433 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dfcf65577-bd5w2" event={"ID":"6528bdfd-3389-4776-826e-164fc5117682","Type":"ContainerStarted","Data":"8d4fcb3fc70457369ed906639e5d7920cbca2fb8876e0eee30e7e8eccacefa6c"} Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.305649 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-svmgl" event={"ID":"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5","Type":"ContainerStarted","Data":"61fee9f5cc97dc9164d9a8b37259645ec27704b544f8031e79cd8630294aa448"} Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.321965 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"cd961c7d-d551-4f5b-a08a-07d088947698","Type":"ContainerDied","Data":"3ca7fe1f4f8bad9a3a06d89d4141ff28e32b10f6d3445d4b3f2404b6e71c942f"} Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.322173 4810 scope.go:117] "RemoveContainer" containerID="aa9a5e6b8de561c312023bc0224fd25e600c4ac446d6ec5ee19e031c464523e1" Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.322410 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.346916 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6cc67d5fc8-hs8lf" event={"ID":"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0","Type":"ContainerStarted","Data":"ec03377efcba956dd84e47222d7f2fa8900a14d68a543b851a67f801c749c23f"} Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.348150 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6cc67d5fc8-hs8lf" Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.354143 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"2ea855ba-523c-4143-8fe8-b0b1150299d0","Type":"ContainerStarted","Data":"1d01a4c95612c0c4f1b7f9b7042052db9fc19c8db413469f648bf9735bce00e6"} Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.375535 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-svmgl" podStartSLOduration=6.618609006 podStartE2EDuration="51.375517298s" podCreationTimestamp="2026-02-19 15:28:48 +0000 UTC" firstStartedPulling="2026-02-19 15:28:50.967593859 +0000 UTC m=+1160.449623973" lastFinishedPulling="2026-02-19 15:29:35.724502141 +0000 UTC m=+1205.206532265" observedRunningTime="2026-02-19 15:29:39.323075582 +0000 UTC m=+1208.805105706" watchObservedRunningTime="2026-02-19 15:29:39.375517298 +0000 UTC m=+1208.857547422" Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.379133 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6cc67d5fc8-hs8lf" podStartSLOduration=6.379123227 podStartE2EDuration="6.379123227s" podCreationTimestamp="2026-02-19 15:29:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:29:39.369699225 +0000 UTC m=+1208.851729379" watchObservedRunningTime="2026-02-19 15:29:39.379123227 +0000 UTC m=+1208.861153351" Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.380054 4810 generic.go:334] "Generic (PLEG): container finished" podID="3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" containerID="ea64343d53131969232078700626ee5996ba47b778be09a59da7ab42c9aa2631" exitCode=1 Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.380146 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5","Type":"ContainerDied","Data":"ea64343d53131969232078700626ee5996ba47b778be09a59da7ab42c9aa2631"} Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.381102 4810 scope.go:117] "RemoveContainer" containerID="ea64343d53131969232078700626ee5996ba47b778be09a59da7ab42c9aa2631" Feb 19 15:29:39 crc kubenswrapper[4810]: E0219 15:29:39.381346 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(3eb2dccd-c5dc-436f-b7a6-954af7bc51c5)\"" pod="openstack/watcher-decision-engine-0" podUID="3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.381923 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-695bb7cdc6-72zs2" podUID="f0b73197-3c7e-44c3-8a49-35d9e0a40629" containerName="neutron-api" 
containerID="cri-o://fc28d9c929a268b7ca57d72d22325d489071902ea7ff07035df3c9d98d6a728a" gracePeriod=30 Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.382085 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-695bb7cdc6-72zs2" podUID="f0b73197-3c7e-44c3-8a49-35d9e0a40629" containerName="neutron-httpd" containerID="cri-o://dc4301bc9761619133902901d1997ab38671d0d229f4feca664d5a1fa8e30e7a" gracePeriod=30 Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.509308 4810 scope.go:117] "RemoveContainer" containerID="4368969b33587354836e3cdb6c31f7bd645f510dc4e3c262614ef8da31c0eb92" Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.593444 4810 scope.go:117] "RemoveContainer" containerID="f9469eea900aae9e94121b7851a3eacdafc431e0f15d0e8319f5a93a2616b851" Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.918373 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 15:29:39 crc kubenswrapper[4810]: I0219 15:29:39.918460 4810 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 15:29:40 crc kubenswrapper[4810]: I0219 15:29:40.068044 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 15:29:40 crc kubenswrapper[4810]: I0219 15:29:40.325115 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 15:29:40 crc kubenswrapper[4810]: I0219 15:29:40.325220 4810 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 15:29:40 crc kubenswrapper[4810]: I0219 15:29:40.341400 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 15:29:40 crc kubenswrapper[4810]: I0219 15:29:40.437847 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"2ea855ba-523c-4143-8fe8-b0b1150299d0","Type":"ContainerStarted","Data":"9386dd3b59b24a748e770d6384d92f3e8aff8a701badb29067310dec0fb2fbb8"} Feb 19 15:29:40 crc kubenswrapper[4810]: I0219 15:29:40.455617 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dfcf65577-bd5w2" event={"ID":"6528bdfd-3389-4776-826e-164fc5117682","Type":"ContainerStarted","Data":"b8c272c13d1b92aba75222a0570054705e53f0081b56e6b23d6a0c0ea9d19e36"} Feb 19 15:29:40 crc kubenswrapper[4810]: I0219 15:29:40.455670 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dfcf65577-bd5w2" event={"ID":"6528bdfd-3389-4776-826e-164fc5117682","Type":"ContainerStarted","Data":"15e4052ed266d6ebf83cd909256437484aa3ff4ec25a466bae110f8eecff1146"} Feb 19 15:29:40 crc kubenswrapper[4810]: I0219 15:29:40.455711 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6dfcf65577-bd5w2" Feb 19 15:29:40 crc kubenswrapper[4810]: I0219 15:29:40.468799 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=3.468780979 podStartE2EDuration="3.468780979s" podCreationTimestamp="2026-02-19 15:29:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:29:40.463673883 +0000 UTC m=+1209.945703997" watchObservedRunningTime="2026-02-19 15:29:40.468780979 +0000 UTC m=+1209.950811103" Feb 19 15:29:40 crc kubenswrapper[4810]: I0219 15:29:40.479864 
4810 generic.go:334] "Generic (PLEG): container finished" podID="f0b73197-3c7e-44c3-8a49-35d9e0a40629" containerID="dc4301bc9761619133902901d1997ab38671d0d229f4feca664d5a1fa8e30e7a" exitCode=0 Feb 19 15:29:40 crc kubenswrapper[4810]: I0219 15:29:40.480996 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-695bb7cdc6-72zs2" event={"ID":"f0b73197-3c7e-44c3-8a49-35d9e0a40629","Type":"ContainerDied","Data":"dc4301bc9761619133902901d1997ab38671d0d229f4feca664d5a1fa8e30e7a"} Feb 19 15:29:40 crc kubenswrapper[4810]: I0219 15:29:40.490759 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6dfcf65577-bd5w2" podStartSLOduration=3.490743957 podStartE2EDuration="3.490743957s" podCreationTimestamp="2026-02-19 15:29:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:29:40.489013395 +0000 UTC m=+1209.971043519" watchObservedRunningTime="2026-02-19 15:29:40.490743957 +0000 UTC m=+1209.972774081" Feb 19 15:29:42 crc kubenswrapper[4810]: I0219 15:29:42.891162 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Feb 19 15:29:43 crc kubenswrapper[4810]: I0219 15:29:43.527351 4810 generic.go:334] "Generic (PLEG): container finished" podID="2024a783-c3f9-4e57-b00f-52bec164e64e" containerID="ad28bfaa41efd8e4e6c465f81c081a0451a00386412156a5267acdd97840a40b" exitCode=0 Feb 19 15:29:43 crc kubenswrapper[4810]: I0219 15:29:43.527382 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hmc6k" event={"ID":"2024a783-c3f9-4e57-b00f-52bec164e64e","Type":"ContainerDied","Data":"ad28bfaa41efd8e4e6c465f81c081a0451a00386412156a5267acdd97840a40b"} Feb 19 15:29:43 crc kubenswrapper[4810]: I0219 15:29:43.854928 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cf5f86dff-7482l" Feb 19 15:29:43 crc kubenswrapper[4810]: I0219 15:29:43.920503 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f4cfd6f6c-s7m64"] Feb 19 15:29:43 crc kubenswrapper[4810]: I0219 15:29:43.920719 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" podUID="18ca6546-69fd-492d-81c5-bb18c56b045d" containerName="dnsmasq-dns" containerID="cri-o://acf279e70fd332fdcdd2bf83f0303bb99e19cd10482ff8ab44a134fa747add8b" gracePeriod=10 Feb 19 15:29:44 crc kubenswrapper[4810]: I0219 15:29:44.555965 4810 generic.go:334] "Generic (PLEG): container finished" podID="18ca6546-69fd-492d-81c5-bb18c56b045d" containerID="acf279e70fd332fdcdd2bf83f0303bb99e19cd10482ff8ab44a134fa747add8b" exitCode=0 Feb 19 15:29:44 crc kubenswrapper[4810]: I0219 15:29:44.556460 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" event={"ID":"18ca6546-69fd-492d-81c5-bb18c56b045d","Type":"ContainerDied","Data":"acf279e70fd332fdcdd2bf83f0303bb99e19cd10482ff8ab44a134fa747add8b"} Feb 19 15:29:44 crc kubenswrapper[4810]: I0219 15:29:44.612800 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" podUID="18ca6546-69fd-492d-81c5-bb18c56b045d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.158:5353: connect: connection refused" Feb 19 15:29:45 crc kubenswrapper[4810]: I0219 15:29:45.956224 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-hmc6k" Feb 19 15:29:46 crc kubenswrapper[4810]: I0219 15:29:46.076715 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2024a783-c3f9-4e57-b00f-52bec164e64e-db-sync-config-data\") pod \"2024a783-c3f9-4e57-b00f-52bec164e64e\" (UID: \"2024a783-c3f9-4e57-b00f-52bec164e64e\") " Feb 19 15:29:46 crc kubenswrapper[4810]: I0219 15:29:46.076803 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btmxg\" (UniqueName: \"kubernetes.io/projected/2024a783-c3f9-4e57-b00f-52bec164e64e-kube-api-access-btmxg\") pod \"2024a783-c3f9-4e57-b00f-52bec164e64e\" (UID: \"2024a783-c3f9-4e57-b00f-52bec164e64e\") " Feb 19 15:29:46 crc kubenswrapper[4810]: I0219 15:29:46.076876 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2024a783-c3f9-4e57-b00f-52bec164e64e-combined-ca-bundle\") pod \"2024a783-c3f9-4e57-b00f-52bec164e64e\" (UID: \"2024a783-c3f9-4e57-b00f-52bec164e64e\") " Feb 19 15:29:46 crc kubenswrapper[4810]: I0219 15:29:46.087530 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2024a783-c3f9-4e57-b00f-52bec164e64e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2024a783-c3f9-4e57-b00f-52bec164e64e" (UID: "2024a783-c3f9-4e57-b00f-52bec164e64e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:46 crc kubenswrapper[4810]: I0219 15:29:46.101849 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2024a783-c3f9-4e57-b00f-52bec164e64e-kube-api-access-btmxg" (OuterVolumeSpecName: "kube-api-access-btmxg") pod "2024a783-c3f9-4e57-b00f-52bec164e64e" (UID: "2024a783-c3f9-4e57-b00f-52bec164e64e"). InnerVolumeSpecName "kube-api-access-btmxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:29:46 crc kubenswrapper[4810]: I0219 15:29:46.108682 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2024a783-c3f9-4e57-b00f-52bec164e64e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2024a783-c3f9-4e57-b00f-52bec164e64e" (UID: "2024a783-c3f9-4e57-b00f-52bec164e64e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:46 crc kubenswrapper[4810]: I0219 15:29:46.181744 4810 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2024a783-c3f9-4e57-b00f-52bec164e64e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:46 crc kubenswrapper[4810]: I0219 15:29:46.181779 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btmxg\" (UniqueName: \"kubernetes.io/projected/2024a783-c3f9-4e57-b00f-52bec164e64e-kube-api-access-btmxg\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:46 crc kubenswrapper[4810]: I0219 15:29:46.181790 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2024a783-c3f9-4e57-b00f-52bec164e64e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:46 crc kubenswrapper[4810]: I0219 15:29:46.573044 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hmc6k" event={"ID":"2024a783-c3f9-4e57-b00f-52bec164e64e","Type":"ContainerDied","Data":"398a9dcb2e74ac56cd2827ea038790abeb16b5f6b3573a48b306842a166c3f44"} Feb 19 15:29:46 crc kubenswrapper[4810]: I0219 15:29:46.573347 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="398a9dcb2e74ac56cd2827ea038790abeb16b5f6b3573a48b306842a166c3f44" Feb 19 15:29:46 crc kubenswrapper[4810]: I0219 15:29:46.573397 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-hmc6k" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.351400 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b44bbb4dc-dktkl"] Feb 19 15:29:47 crc kubenswrapper[4810]: E0219 15:29:47.351763 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2024a783-c3f9-4e57-b00f-52bec164e64e" containerName="barbican-db-sync" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.351776 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2024a783-c3f9-4e57-b00f-52bec164e64e" containerName="barbican-db-sync" Feb 19 15:29:47 crc kubenswrapper[4810]: E0219 15:29:47.351804 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd961c7d-d551-4f5b-a08a-07d088947698" containerName="watcher-api" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.351811 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd961c7d-d551-4f5b-a08a-07d088947698" containerName="watcher-api" Feb 19 15:29:47 crc kubenswrapper[4810]: E0219 15:29:47.351826 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd961c7d-d551-4f5b-a08a-07d088947698" containerName="watcher-api-log" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.351833 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd961c7d-d551-4f5b-a08a-07d088947698" containerName="watcher-api-log" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.352001 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="2024a783-c3f9-4e57-b00f-52bec164e64e" containerName="barbican-db-sync" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.352029 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd961c7d-d551-4f5b-a08a-07d088947698" containerName="watcher-api-log" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.352040 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd961c7d-d551-4f5b-a08a-07d088947698" containerName="watcher-api" Feb 19 
15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.352988 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b44bbb4dc-dktkl" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.398387 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-58f8775989-n9rgr"] Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.400008 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-58f8775989-n9rgr" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.415957 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b44bbb4dc-dktkl"] Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.416368 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.416564 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-mx2zj" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.420779 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.426523 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-75f99f68b4-d7hj4"] Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.428088 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-75f99f68b4-d7hj4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.431112 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.466810 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-58f8775989-n9rgr"] Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.466854 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-75f99f68b4-d7hj4"] Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.518135 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f277c31b-ff97-4f3b-aec3-c5cfe9293d60-logs\") pod \"barbican-worker-58f8775989-n9rgr\" (UID: \"f277c31b-ff97-4f3b-aec3-c5cfe9293d60\") " pod="openstack/barbican-worker-58f8775989-n9rgr" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.518194 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-dns-swift-storage-0\") pod \"dnsmasq-dns-5b44bbb4dc-dktkl\" (UID: \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\") " pod="openstack/dnsmasq-dns-5b44bbb4dc-dktkl" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.518356 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f277c31b-ff97-4f3b-aec3-c5cfe9293d60-config-data-custom\") pod \"barbican-worker-58f8775989-n9rgr\" (UID: \"f277c31b-ff97-4f3b-aec3-c5cfe9293d60\") " pod="openstack/barbican-worker-58f8775989-n9rgr" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.519730 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f277c31b-ff97-4f3b-aec3-c5cfe9293d60-config-data\") pod \"barbican-worker-58f8775989-n9rgr\" (UID: \"f277c31b-ff97-4f3b-aec3-c5cfe9293d60\") " pod="openstack/barbican-worker-58f8775989-n9rgr" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.519833 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-dns-svc\") pod \"dnsmasq-dns-5b44bbb4dc-dktkl\" (UID: \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\") " pod="openstack/dnsmasq-dns-5b44bbb4dc-dktkl" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.519853 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-ovsdbserver-nb\") pod \"dnsmasq-dns-5b44bbb4dc-dktkl\" (UID: \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\") " pod="openstack/dnsmasq-dns-5b44bbb4dc-dktkl" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.519908 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-config\") pod \"dnsmasq-dns-5b44bbb4dc-dktkl\" (UID: \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\") " pod="openstack/dnsmasq-dns-5b44bbb4dc-dktkl" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.519952 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-ovsdbserver-sb\") pod \"dnsmasq-dns-5b44bbb4dc-dktkl\" (UID: \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\") " pod="openstack/dnsmasq-dns-5b44bbb4dc-dktkl" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.520013 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m85jt\" (UniqueName: \"kubernetes.io/projected/b1f44651-e4eb-4cce-a493-9dd9b491b22a-kube-api-access-m85jt\") pod \"dnsmasq-dns-5b44bbb4dc-dktkl\" (UID: \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\") " pod="openstack/dnsmasq-dns-5b44bbb4dc-dktkl" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.520055 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f277c31b-ff97-4f3b-aec3-c5cfe9293d60-combined-ca-bundle\") pod \"barbican-worker-58f8775989-n9rgr\" (UID: \"f277c31b-ff97-4f3b-aec3-c5cfe9293d60\") " pod="openstack/barbican-worker-58f8775989-n9rgr" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.520104 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvx44\" (UniqueName: \"kubernetes.io/projected/f277c31b-ff97-4f3b-aec3-c5cfe9293d60-kube-api-access-rvx44\") pod \"barbican-worker-58f8775989-n9rgr\" (UID: \"f277c31b-ff97-4f3b-aec3-c5cfe9293d60\") " pod="openstack/barbican-worker-58f8775989-n9rgr" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.582528 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-79fc56bc44-tfjh4"] Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.584264 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-79fc56bc44-tfjh4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.588261 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.603388 4810 generic.go:334] "Generic (PLEG): container finished" podID="848dfe9d-05f4-4ba9-919e-23e9a7ae63d5" containerID="61fee9f5cc97dc9164d9a8b37259645ec27704b544f8031e79cd8630294aa448" exitCode=0 Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.603434 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-svmgl" event={"ID":"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5","Type":"ContainerDied","Data":"61fee9f5cc97dc9164d9a8b37259645ec27704b544f8031e79cd8630294aa448"} Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.621485 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-config\") pod \"dnsmasq-dns-5b44bbb4dc-dktkl\" (UID: \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\") " pod="openstack/dnsmasq-dns-5b44bbb4dc-dktkl" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.621537 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c008ffcd-bb96-47dd-a311-fdc58f6d8918-config-data\") pod \"barbican-keystone-listener-75f99f68b4-d7hj4\" (UID: \"c008ffcd-bb96-47dd-a311-fdc58f6d8918\") " pod="openstack/barbican-keystone-listener-75f99f68b4-d7hj4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.621570 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-ovsdbserver-sb\") pod \"dnsmasq-dns-5b44bbb4dc-dktkl\" (UID: \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\") " pod="openstack/dnsmasq-dns-5b44bbb4dc-dktkl" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.621612 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m85jt\" (UniqueName: \"kubernetes.io/projected/b1f44651-e4eb-4cce-a493-9dd9b491b22a-kube-api-access-m85jt\") pod \"dnsmasq-dns-5b44bbb4dc-dktkl\" (UID: \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\") " pod="openstack/dnsmasq-dns-5b44bbb4dc-dktkl" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.621644 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f277c31b-ff97-4f3b-aec3-c5cfe9293d60-combined-ca-bundle\") pod \"barbican-worker-58f8775989-n9rgr\" (UID: \"f277c31b-ff97-4f3b-aec3-c5cfe9293d60\") " pod="openstack/barbican-worker-58f8775989-n9rgr" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.621678 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvx44\" (UniqueName: \"kubernetes.io/projected/f277c31b-ff97-4f3b-aec3-c5cfe9293d60-kube-api-access-rvx44\") pod \"barbican-worker-58f8775989-n9rgr\" (UID: \"f277c31b-ff97-4f3b-aec3-c5cfe9293d60\") " pod="openstack/barbican-worker-58f8775989-n9rgr" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.621718 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c008ffcd-bb96-47dd-a311-fdc58f6d8918-config-data-custom\") pod \"barbican-keystone-listener-75f99f68b4-d7hj4\" (UID: 
\"c008ffcd-bb96-47dd-a311-fdc58f6d8918\") " pod="openstack/barbican-keystone-listener-75f99f68b4-d7hj4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.621748 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f277c31b-ff97-4f3b-aec3-c5cfe9293d60-logs\") pod \"barbican-worker-58f8775989-n9rgr\" (UID: \"f277c31b-ff97-4f3b-aec3-c5cfe9293d60\") " pod="openstack/barbican-worker-58f8775989-n9rgr" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.621786 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-dns-swift-storage-0\") pod \"dnsmasq-dns-5b44bbb4dc-dktkl\" (UID: \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\") " pod="openstack/dnsmasq-dns-5b44bbb4dc-dktkl" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.621834 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kgxx\" (UniqueName: \"kubernetes.io/projected/c008ffcd-bb96-47dd-a311-fdc58f6d8918-kube-api-access-4kgxx\") pod \"barbican-keystone-listener-75f99f68b4-d7hj4\" (UID: \"c008ffcd-bb96-47dd-a311-fdc58f6d8918\") " pod="openstack/barbican-keystone-listener-75f99f68b4-d7hj4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.621913 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f277c31b-ff97-4f3b-aec3-c5cfe9293d60-config-data-custom\") pod \"barbican-worker-58f8775989-n9rgr\" (UID: \"f277c31b-ff97-4f3b-aec3-c5cfe9293d60\") " pod="openstack/barbican-worker-58f8775989-n9rgr" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.621937 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f277c31b-ff97-4f3b-aec3-c5cfe9293d60-config-data\") pod \"barbican-worker-58f8775989-n9rgr\" (UID: \"f277c31b-ff97-4f3b-aec3-c5cfe9293d60\") " pod="openstack/barbican-worker-58f8775989-n9rgr" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.621965 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c008ffcd-bb96-47dd-a311-fdc58f6d8918-combined-ca-bundle\") pod \"barbican-keystone-listener-75f99f68b4-d7hj4\" (UID: \"c008ffcd-bb96-47dd-a311-fdc58f6d8918\") " pod="openstack/barbican-keystone-listener-75f99f68b4-d7hj4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.621994 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c008ffcd-bb96-47dd-a311-fdc58f6d8918-logs\") pod \"barbican-keystone-listener-75f99f68b4-d7hj4\" (UID: \"c008ffcd-bb96-47dd-a311-fdc58f6d8918\") " pod="openstack/barbican-keystone-listener-75f99f68b4-d7hj4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.622092 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-dns-svc\") pod \"dnsmasq-dns-5b44bbb4dc-dktkl\" (UID: \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\") " pod="openstack/dnsmasq-dns-5b44bbb4dc-dktkl" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.622115 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-ovsdbserver-nb\") pod \"dnsmasq-dns-5b44bbb4dc-dktkl\" (UID: \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\") " pod="openstack/dnsmasq-dns-5b44bbb4dc-dktkl" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.623143 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-ovsdbserver-nb\") pod \"dnsmasq-dns-5b44bbb4dc-dktkl\" (UID: \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\") " pod="openstack/dnsmasq-dns-5b44bbb4dc-dktkl" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.623829 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-dns-swift-storage-0\") pod \"dnsmasq-dns-5b44bbb4dc-dktkl\" (UID: \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\") " pod="openstack/dnsmasq-dns-5b44bbb4dc-dktkl" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.624618 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-config\") pod \"dnsmasq-dns-5b44bbb4dc-dktkl\" (UID: \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\") " pod="openstack/dnsmasq-dns-5b44bbb4dc-dktkl" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.625248 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-ovsdbserver-sb\") pod \"dnsmasq-dns-5b44bbb4dc-dktkl\" (UID: \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\") " pod="openstack/dnsmasq-dns-5b44bbb4dc-dktkl" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.631870 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f277c31b-ff97-4f3b-aec3-c5cfe9293d60-config-data-custom\") pod \"barbican-worker-58f8775989-n9rgr\" (UID: \"f277c31b-ff97-4f3b-aec3-c5cfe9293d60\") " pod="openstack/barbican-worker-58f8775989-n9rgr" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.635976 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-79fc56bc44-tfjh4"] Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.636033 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f277c31b-ff97-4f3b-aec3-c5cfe9293d60-logs\") pod \"barbican-worker-58f8775989-n9rgr\" (UID: \"f277c31b-ff97-4f3b-aec3-c5cfe9293d60\") " pod="openstack/barbican-worker-58f8775989-n9rgr" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.637559 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-dns-svc\") pod \"dnsmasq-dns-5b44bbb4dc-dktkl\" (UID: \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\") " pod="openstack/dnsmasq-dns-5b44bbb4dc-dktkl" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.639477 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f277c31b-ff97-4f3b-aec3-c5cfe9293d60-combined-ca-bundle\") pod \"barbican-worker-58f8775989-n9rgr\" (UID: \"f277c31b-ff97-4f3b-aec3-c5cfe9293d60\") " pod="openstack/barbican-worker-58f8775989-n9rgr" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.643425 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f277c31b-ff97-4f3b-aec3-c5cfe9293d60-config-data\") pod \"barbican-worker-58f8775989-n9rgr\" (UID: \"f277c31b-ff97-4f3b-aec3-c5cfe9293d60\") " pod="openstack/barbican-worker-58f8775989-n9rgr" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.646186 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m85jt\" (UniqueName: \"kubernetes.io/projected/b1f44651-e4eb-4cce-a493-9dd9b491b22a-kube-api-access-m85jt\") pod \"dnsmasq-dns-5b44bbb4dc-dktkl\" (UID: \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\") " pod="openstack/dnsmasq-dns-5b44bbb4dc-dktkl" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.656776 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvx44\" (UniqueName: \"kubernetes.io/projected/f277c31b-ff97-4f3b-aec3-c5cfe9293d60-kube-api-access-rvx44\") pod \"barbican-worker-58f8775989-n9rgr\" (UID: \"f277c31b-ff97-4f3b-aec3-c5cfe9293d60\") " pod="openstack/barbican-worker-58f8775989-n9rgr" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.699401 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b44bbb4dc-dktkl" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.724014 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c008ffcd-bb96-47dd-a311-fdc58f6d8918-config-data\") pod \"barbican-keystone-listener-75f99f68b4-d7hj4\" (UID: \"c008ffcd-bb96-47dd-a311-fdc58f6d8918\") " pod="openstack/barbican-keystone-listener-75f99f68b4-d7hj4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.724116 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/faf01cf3-b74b-46d8-b589-05ea0195ac24-config-data-custom\") pod \"barbican-api-79fc56bc44-tfjh4\" (UID: \"faf01cf3-b74b-46d8-b589-05ea0195ac24\") " pod="openstack/barbican-api-79fc56bc44-tfjh4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.724146 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faf01cf3-b74b-46d8-b589-05ea0195ac24-logs\") pod \"barbican-api-79fc56bc44-tfjh4\" (UID: \"faf01cf3-b74b-46d8-b589-05ea0195ac24\") " pod="openstack/barbican-api-79fc56bc44-tfjh4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.724166 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c008ffcd-bb96-47dd-a311-fdc58f6d8918-config-data-custom\") pod \"barbican-keystone-listener-75f99f68b4-d7hj4\" (UID: \"c008ffcd-bb96-47dd-a311-fdc58f6d8918\") " pod="openstack/barbican-keystone-listener-75f99f68b4-d7hj4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.724209 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kgxx\" (UniqueName: \"kubernetes.io/projected/c008ffcd-bb96-47dd-a311-fdc58f6d8918-kube-api-access-4kgxx\") pod \"barbican-keystone-listener-75f99f68b4-d7hj4\" (UID: \"c008ffcd-bb96-47dd-a311-fdc58f6d8918\") " pod="openstack/barbican-keystone-listener-75f99f68b4-d7hj4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.724232 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlkqn\" (UniqueName: 
\"kubernetes.io/projected/faf01cf3-b74b-46d8-b589-05ea0195ac24-kube-api-access-dlkqn\") pod \"barbican-api-79fc56bc44-tfjh4\" (UID: \"faf01cf3-b74b-46d8-b589-05ea0195ac24\") " pod="openstack/barbican-api-79fc56bc44-tfjh4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.724253 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faf01cf3-b74b-46d8-b589-05ea0195ac24-config-data\") pod \"barbican-api-79fc56bc44-tfjh4\" (UID: \"faf01cf3-b74b-46d8-b589-05ea0195ac24\") " pod="openstack/barbican-api-79fc56bc44-tfjh4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.724286 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faf01cf3-b74b-46d8-b589-05ea0195ac24-combined-ca-bundle\") pod \"barbican-api-79fc56bc44-tfjh4\" (UID: \"faf01cf3-b74b-46d8-b589-05ea0195ac24\") " pod="openstack/barbican-api-79fc56bc44-tfjh4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.724335 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c008ffcd-bb96-47dd-a311-fdc58f6d8918-combined-ca-bundle\") pod \"barbican-keystone-listener-75f99f68b4-d7hj4\" (UID: \"c008ffcd-bb96-47dd-a311-fdc58f6d8918\") " pod="openstack/barbican-keystone-listener-75f99f68b4-d7hj4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.724357 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c008ffcd-bb96-47dd-a311-fdc58f6d8918-logs\") pod \"barbican-keystone-listener-75f99f68b4-d7hj4\" (UID: \"c008ffcd-bb96-47dd-a311-fdc58f6d8918\") " pod="openstack/barbican-keystone-listener-75f99f68b4-d7hj4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.724759 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c008ffcd-bb96-47dd-a311-fdc58f6d8918-logs\") pod \"barbican-keystone-listener-75f99f68b4-d7hj4\" (UID: \"c008ffcd-bb96-47dd-a311-fdc58f6d8918\") " pod="openstack/barbican-keystone-listener-75f99f68b4-d7hj4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.727883 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c008ffcd-bb96-47dd-a311-fdc58f6d8918-combined-ca-bundle\") pod \"barbican-keystone-listener-75f99f68b4-d7hj4\" (UID: \"c008ffcd-bb96-47dd-a311-fdc58f6d8918\") " pod="openstack/barbican-keystone-listener-75f99f68b4-d7hj4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.729652 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c008ffcd-bb96-47dd-a311-fdc58f6d8918-config-data\") pod \"barbican-keystone-listener-75f99f68b4-d7hj4\" (UID: \"c008ffcd-bb96-47dd-a311-fdc58f6d8918\") " pod="openstack/barbican-keystone-listener-75f99f68b4-d7hj4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.731030 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c008ffcd-bb96-47dd-a311-fdc58f6d8918-config-data-custom\") pod \"barbican-keystone-listener-75f99f68b4-d7hj4\" (UID: \"c008ffcd-bb96-47dd-a311-fdc58f6d8918\") " pod="openstack/barbican-keystone-listener-75f99f68b4-d7hj4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.740342 4810 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kgxx\" (UniqueName: \"kubernetes.io/projected/c008ffcd-bb96-47dd-a311-fdc58f6d8918-kube-api-access-4kgxx\") pod \"barbican-keystone-listener-75f99f68b4-d7hj4\" (UID: \"c008ffcd-bb96-47dd-a311-fdc58f6d8918\") " pod="openstack/barbican-keystone-listener-75f99f68b4-d7hj4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.761352 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-58f8775989-n9rgr" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.790619 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-75f99f68b4-d7hj4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.826715 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/faf01cf3-b74b-46d8-b589-05ea0195ac24-config-data-custom\") pod \"barbican-api-79fc56bc44-tfjh4\" (UID: \"faf01cf3-b74b-46d8-b589-05ea0195ac24\") " pod="openstack/barbican-api-79fc56bc44-tfjh4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.826773 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faf01cf3-b74b-46d8-b589-05ea0195ac24-logs\") pod \"barbican-api-79fc56bc44-tfjh4\" (UID: \"faf01cf3-b74b-46d8-b589-05ea0195ac24\") " pod="openstack/barbican-api-79fc56bc44-tfjh4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.826872 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlkqn\" (UniqueName: \"kubernetes.io/projected/faf01cf3-b74b-46d8-b589-05ea0195ac24-kube-api-access-dlkqn\") pod \"barbican-api-79fc56bc44-tfjh4\" (UID: \"faf01cf3-b74b-46d8-b589-05ea0195ac24\") " pod="openstack/barbican-api-79fc56bc44-tfjh4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.826933 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faf01cf3-b74b-46d8-b589-05ea0195ac24-config-data\") pod \"barbican-api-79fc56bc44-tfjh4\" (UID: \"faf01cf3-b74b-46d8-b589-05ea0195ac24\") " pod="openstack/barbican-api-79fc56bc44-tfjh4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.826989 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faf01cf3-b74b-46d8-b589-05ea0195ac24-combined-ca-bundle\") pod \"barbican-api-79fc56bc44-tfjh4\" (UID: \"faf01cf3-b74b-46d8-b589-05ea0195ac24\") " pod="openstack/barbican-api-79fc56bc44-tfjh4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.827313 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faf01cf3-b74b-46d8-b589-05ea0195ac24-logs\") pod \"barbican-api-79fc56bc44-tfjh4\" (UID: \"faf01cf3-b74b-46d8-b589-05ea0195ac24\") " pod="openstack/barbican-api-79fc56bc44-tfjh4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.832879 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/faf01cf3-b74b-46d8-b589-05ea0195ac24-config-data-custom\") pod \"barbican-api-79fc56bc44-tfjh4\" (UID: \"faf01cf3-b74b-46d8-b589-05ea0195ac24\") " pod="openstack/barbican-api-79fc56bc44-tfjh4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.833502 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faf01cf3-b74b-46d8-b589-05ea0195ac24-combined-ca-bundle\") pod \"barbican-api-79fc56bc44-tfjh4\" (UID: \"faf01cf3-b74b-46d8-b589-05ea0195ac24\") " pod="openstack/barbican-api-79fc56bc44-tfjh4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.833645 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faf01cf3-b74b-46d8-b589-05ea0195ac24-config-data\") pod \"barbican-api-79fc56bc44-tfjh4\" (UID: \"faf01cf3-b74b-46d8-b589-05ea0195ac24\") " pod="openstack/barbican-api-79fc56bc44-tfjh4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.842534 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlkqn\" (UniqueName: \"kubernetes.io/projected/faf01cf3-b74b-46d8-b589-05ea0195ac24-kube-api-access-dlkqn\") pod \"barbican-api-79fc56bc44-tfjh4\" (UID: \"faf01cf3-b74b-46d8-b589-05ea0195ac24\") " pod="openstack/barbican-api-79fc56bc44-tfjh4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.891773 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.906981 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-79fc56bc44-tfjh4" Feb 19 15:29:47 crc kubenswrapper[4810]: I0219 15:29:47.926722 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.153773 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.341922 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-ovsdbserver-nb\") pod \"18ca6546-69fd-492d-81c5-bb18c56b045d\" (UID: \"18ca6546-69fd-492d-81c5-bb18c56b045d\") " Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.342035 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nljdr\" (UniqueName: \"kubernetes.io/projected/18ca6546-69fd-492d-81c5-bb18c56b045d-kube-api-access-nljdr\") pod \"18ca6546-69fd-492d-81c5-bb18c56b045d\" (UID: \"18ca6546-69fd-492d-81c5-bb18c56b045d\") " Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.342130 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-ovsdbserver-sb\") pod \"18ca6546-69fd-492d-81c5-bb18c56b045d\" (UID: \"18ca6546-69fd-492d-81c5-bb18c56b045d\") " Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.342159 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-dns-swift-storage-0\") pod \"18ca6546-69fd-492d-81c5-bb18c56b045d\" (UID: \"18ca6546-69fd-492d-81c5-bb18c56b045d\") " Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.342231 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-dns-svc\") pod \"18ca6546-69fd-492d-81c5-bb18c56b045d\" (UID: 
\"18ca6546-69fd-492d-81c5-bb18c56b045d\") " Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.342301 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-config\") pod \"18ca6546-69fd-492d-81c5-bb18c56b045d\" (UID: \"18ca6546-69fd-492d-81c5-bb18c56b045d\") " Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.351619 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18ca6546-69fd-492d-81c5-bb18c56b045d-kube-api-access-nljdr" (OuterVolumeSpecName: "kube-api-access-nljdr") pod "18ca6546-69fd-492d-81c5-bb18c56b045d" (UID: "18ca6546-69fd-492d-81c5-bb18c56b045d"). InnerVolumeSpecName "kube-api-access-nljdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.410452 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "18ca6546-69fd-492d-81c5-bb18c56b045d" (UID: "18ca6546-69fd-492d-81c5-bb18c56b045d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.423019 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-config" (OuterVolumeSpecName: "config") pod "18ca6546-69fd-492d-81c5-bb18c56b045d" (UID: "18ca6546-69fd-492d-81c5-bb18c56b045d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.444129 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.444156 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.444166 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nljdr\" (UniqueName: \"kubernetes.io/projected/18ca6546-69fd-492d-81c5-bb18c56b045d-kube-api-access-nljdr\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.473015 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "18ca6546-69fd-492d-81c5-bb18c56b045d" (UID: "18ca6546-69fd-492d-81c5-bb18c56b045d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.496041 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "18ca6546-69fd-492d-81c5-bb18c56b045d" (UID: "18ca6546-69fd-492d-81c5-bb18c56b045d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.496762 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "18ca6546-69fd-492d-81c5-bb18c56b045d" (UID: "18ca6546-69fd-492d-81c5-bb18c56b045d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.545543 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.545568 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.545578 4810 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/18ca6546-69fd-492d-81c5-bb18c56b045d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:48 crc kubenswrapper[4810]: E0219 15:29:48.548078 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="b8cd44d0-7395-44e1-9112-9e8bb4198b93" Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.614233 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" event={"ID":"18ca6546-69fd-492d-81c5-bb18c56b045d","Type":"ContainerDied","Data":"3e6a529b000841e709c2e1d05c4d119a28c8f20c2ece39574181d8df78a6c626"} Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.614279 4810 scope.go:117] "RemoveContainer" containerID="acf279e70fd332fdcdd2bf83f0303bb99e19cd10482ff8ab44a134fa747add8b" Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.614403 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f4cfd6f6c-s7m64" Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.622417 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b8cd44d0-7395-44e1-9112-9e8bb4198b93" containerName="ceilometer-notification-agent" containerID="cri-o://48bd8312dc5f2e91c1a4d6b015bb83960b232d7ff3a764add13cbac66bd0441f" gracePeriod=30 Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.622704 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8cd44d0-7395-44e1-9112-9e8bb4198b93","Type":"ContainerStarted","Data":"f669c7fcc5250fb7338c38806127e385c8e63f008e071b7e746e08ad63a54a09"} Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.623900 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.627552 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b8cd44d0-7395-44e1-9112-9e8bb4198b93" containerName="proxy-httpd" containerID="cri-o://f669c7fcc5250fb7338c38806127e385c8e63f008e071b7e746e08ad63a54a09" gracePeriod=30 Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.627642 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b8cd44d0-7395-44e1-9112-9e8bb4198b93" containerName="sg-core" containerID="cri-o://1b6e83705ca6e6c238d2cc26ae7440d08d4c7c41779dea1e88d4da0c7c6c4ca7" gracePeriod=30 Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.704103 4810 scope.go:117] "RemoveContainer" containerID="7bc85bb988b4afd58694705b3dae68dca31213574b00f92c71f4c77c5edfdf98" Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.704298 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.801023 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-79fc56bc44-tfjh4"] Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.835243 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b44bbb4dc-dktkl"] Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.847365 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f4cfd6f6c-s7m64"] Feb 19 15:29:48 crc kubenswrapper[4810]: I0219 15:29:48.880029 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f4cfd6f6c-s7m64"] Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.125860 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.126246 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.126261 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.127123 4810 scope.go:117] "RemoveContainer" containerID="ea64343d53131969232078700626ee5996ba47b778be09a59da7ab42c9aa2631" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.303777 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-58f8775989-n9rgr"] Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.337153 4810 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-75f99f68b4-d7hj4"] Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.476082 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18ca6546-69fd-492d-81c5-bb18c56b045d" path="/var/lib/kubelet/pods/18ca6546-69fd-492d-81c5-bb18c56b045d/volumes" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.506000 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-svmgl" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.584457 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-config-data\") pod \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\" (UID: \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\") " Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.584780 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-db-sync-config-data\") pod \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\" (UID: \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\") " Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.584808 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-etc-machine-id\") pod \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\" (UID: \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\") " Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.584823 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-scripts\") pod \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\" (UID: \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\") " Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.584853 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqnf7\" (UniqueName: \"kubernetes.io/projected/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-kube-api-access-cqnf7\") pod \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\" (UID: \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\") " Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.584950 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "848dfe9d-05f4-4ba9-919e-23e9a7ae63d5" (UID: "848dfe9d-05f4-4ba9-919e-23e9a7ae63d5"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.584981 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-combined-ca-bundle\") pod \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\" (UID: \"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5\") " Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.585507 4810 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.591907 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-scripts" (OuterVolumeSpecName: "scripts") pod "848dfe9d-05f4-4ba9-919e-23e9a7ae63d5" (UID: "848dfe9d-05f4-4ba9-919e-23e9a7ae63d5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.594671 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-kube-api-access-cqnf7" (OuterVolumeSpecName: "kube-api-access-cqnf7") pod "848dfe9d-05f4-4ba9-919e-23e9a7ae63d5" (UID: "848dfe9d-05f4-4ba9-919e-23e9a7ae63d5"). InnerVolumeSpecName "kube-api-access-cqnf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.594699 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "848dfe9d-05f4-4ba9-919e-23e9a7ae63d5" (UID: "848dfe9d-05f4-4ba9-919e-23e9a7ae63d5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.623455 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "848dfe9d-05f4-4ba9-919e-23e9a7ae63d5" (UID: "848dfe9d-05f4-4ba9-919e-23e9a7ae63d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.654976 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-58f8775989-n9rgr" event={"ID":"f277c31b-ff97-4f3b-aec3-c5cfe9293d60","Type":"ContainerStarted","Data":"a883cebc3822bfccdfab0bd32dc52bbbd5c9613c2189b603992822c3157d2886"} Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.657967 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-config-data" (OuterVolumeSpecName: "config-data") pod "848dfe9d-05f4-4ba9-919e-23e9a7ae63d5" (UID: "848dfe9d-05f4-4ba9-919e-23e9a7ae63d5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.666789 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75f99f68b4-d7hj4" event={"ID":"c008ffcd-bb96-47dd-a311-fdc58f6d8918","Type":"ContainerStarted","Data":"3e52e508c6501d7dabbd530b368b25c699055934d491accb25c6d0cceddd787e"} Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.690968 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5","Type":"ContainerStarted","Data":"a8232fee767eaee3fa026223e20673e25fa6d954ddb913e148bf6e0e435de416"} Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.695270 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.695565 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.695578 4810 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.695589 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.695599 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqnf7\" (UniqueName: \"kubernetes.io/projected/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5-kube-api-access-cqnf7\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.721619 4810 generic.go:334] "Generic (PLEG): container finished" podID="b1f44651-e4eb-4cce-a493-9dd9b491b22a" containerID="b192a1bf8058da9cccb4f9c8a3c59226e8d40284e5443eca3683d4b0120ebdd4" exitCode=0 Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.721642 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b44bbb4dc-dktkl" event={"ID":"b1f44651-e4eb-4cce-a493-9dd9b491b22a","Type":"ContainerDied","Data":"b192a1bf8058da9cccb4f9c8a3c59226e8d40284e5443eca3683d4b0120ebdd4"} Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.751523 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b44bbb4dc-dktkl" event={"ID":"b1f44651-e4eb-4cce-a493-9dd9b491b22a","Type":"ContainerStarted","Data":"d03f7f66a9a3b08a44840b959b3f1bdcaeb2f46d803e33b39c92670ff95677e9"} Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.760926 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79fc56bc44-tfjh4" event={"ID":"faf01cf3-b74b-46d8-b589-05ea0195ac24","Type":"ContainerStarted","Data":"800222f8de7a26806971959afe00821e5af0eb096530f9abfdb9f64bd62612cb"} Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.760979 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79fc56bc44-tfjh4" event={"ID":"faf01cf3-b74b-46d8-b589-05ea0195ac24","Type":"ContainerStarted","Data":"6c0cc91325e6d2565c8ddd9b1eede9d65e56a0efbeee132cbd5a27f02f0ed7ef"} Feb 
19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.760995 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79fc56bc44-tfjh4" event={"ID":"faf01cf3-b74b-46d8-b589-05ea0195ac24","Type":"ContainerStarted","Data":"9a79a3684682be966eafd1448e2f56c188e39ccc6a243e2540b5912c1bbf9c6c"} Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.761271 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-79fc56bc44-tfjh4" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.762124 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-79fc56bc44-tfjh4" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.765089 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-svmgl" event={"ID":"848dfe9d-05f4-4ba9-919e-23e9a7ae63d5","Type":"ContainerDied","Data":"ce811eb96390a8f8365be6dee0926b85f5365cb868957ed22165ebbf7d343712"} Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.765145 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce811eb96390a8f8365be6dee0926b85f5365cb868957ed22165ebbf7d343712" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.765260 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-svmgl" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.774135 4810 generic.go:334] "Generic (PLEG): container finished" podID="b8cd44d0-7395-44e1-9112-9e8bb4198b93" containerID="1b6e83705ca6e6c238d2cc26ae7440d08d4c7c41779dea1e88d4da0c7c6c4ca7" exitCode=2 Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.774174 4810 generic.go:334] "Generic (PLEG): container finished" podID="b8cd44d0-7395-44e1-9112-9e8bb4198b93" containerID="48bd8312dc5f2e91c1a4d6b015bb83960b232d7ff3a764add13cbac66bd0441f" exitCode=0 Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.774245 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8cd44d0-7395-44e1-9112-9e8bb4198b93","Type":"ContainerDied","Data":"1b6e83705ca6e6c238d2cc26ae7440d08d4c7c41779dea1e88d4da0c7c6c4ca7"} Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.774305 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8cd44d0-7395-44e1-9112-9e8bb4198b93","Type":"ContainerDied","Data":"48bd8312dc5f2e91c1a4d6b015bb83960b232d7ff3a764add13cbac66bd0441f"} Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.933420 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 15:29:49 crc kubenswrapper[4810]: E0219 15:29:49.933881 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="848dfe9d-05f4-4ba9-919e-23e9a7ae63d5" containerName="cinder-db-sync" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.933900 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="848dfe9d-05f4-4ba9-919e-23e9a7ae63d5" containerName="cinder-db-sync" Feb 19 15:29:49 crc kubenswrapper[4810]: E0219 15:29:49.933921 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ca6546-69fd-492d-81c5-bb18c56b045d" containerName="init" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.933930 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ca6546-69fd-492d-81c5-bb18c56b045d" containerName="init" Feb 19 15:29:49 crc kubenswrapper[4810]: E0219 15:29:49.933946 4810 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="18ca6546-69fd-492d-81c5-bb18c56b045d" containerName="dnsmasq-dns" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.933954 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ca6546-69fd-492d-81c5-bb18c56b045d" containerName="dnsmasq-dns" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.934141 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="18ca6546-69fd-492d-81c5-bb18c56b045d" containerName="dnsmasq-dns" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.934163 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="848dfe9d-05f4-4ba9-919e-23e9a7ae63d5" containerName="cinder-db-sync" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.935365 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.942227 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.942446 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.942589 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.942699 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nxm2z" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.967654 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-79fc56bc44-tfjh4" podStartSLOduration=2.967636306 podStartE2EDuration="2.967636306s" podCreationTimestamp="2026-02-19 15:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:29:49.890548885 +0000 UTC m=+1219.372579009" watchObservedRunningTime="2026-02-19 15:29:49.967636306 +0000 UTC m=+1219.449666430" Feb 19 15:29:49 crc kubenswrapper[4810]: I0219 15:29:49.973025 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.034394 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b44bbb4dc-dktkl"] Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.051602 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3bd9969-3750-460b-95cd-8c52d2e44d82-scripts\") pod \"cinder-scheduler-0\" (UID: \"e3bd9969-3750-460b-95cd-8c52d2e44d82\") " pod="openstack/cinder-scheduler-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.051645 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3bd9969-3750-460b-95cd-8c52d2e44d82-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e3bd9969-3750-460b-95cd-8c52d2e44d82\") " pod="openstack/cinder-scheduler-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.051672 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3bd9969-3750-460b-95cd-8c52d2e44d82-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e3bd9969-3750-460b-95cd-8c52d2e44d82\") " pod="openstack/cinder-scheduler-0" 
Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.051712 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3bd9969-3750-460b-95cd-8c52d2e44d82-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e3bd9969-3750-460b-95cd-8c52d2e44d82\") " pod="openstack/cinder-scheduler-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.051729 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3bd9969-3750-460b-95cd-8c52d2e44d82-config-data\") pod \"cinder-scheduler-0\" (UID: \"e3bd9969-3750-460b-95cd-8c52d2e44d82\") " pod="openstack/cinder-scheduler-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.051809 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw97l\" (UniqueName: \"kubernetes.io/projected/e3bd9969-3750-460b-95cd-8c52d2e44d82-kube-api-access-xw97l\") pod \"cinder-scheduler-0\" (UID: \"e3bd9969-3750-460b-95cd-8c52d2e44d82\") " pod="openstack/cinder-scheduler-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.113282 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c58b86477-9tbw7"] Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.114990 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.122210 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c58b86477-9tbw7"] Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.153441 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw97l\" (UniqueName: \"kubernetes.io/projected/e3bd9969-3750-460b-95cd-8c52d2e44d82-kube-api-access-xw97l\") pod \"cinder-scheduler-0\" (UID: \"e3bd9969-3750-460b-95cd-8c52d2e44d82\") " pod="openstack/cinder-scheduler-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.153529 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3bd9969-3750-460b-95cd-8c52d2e44d82-scripts\") pod \"cinder-scheduler-0\" (UID: \"e3bd9969-3750-460b-95cd-8c52d2e44d82\") " pod="openstack/cinder-scheduler-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.153547 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3bd9969-3750-460b-95cd-8c52d2e44d82-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e3bd9969-3750-460b-95cd-8c52d2e44d82\") " pod="openstack/cinder-scheduler-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.153568 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3bd9969-3750-460b-95cd-8c52d2e44d82-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e3bd9969-3750-460b-95cd-8c52d2e44d82\") " pod="openstack/cinder-scheduler-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.153603 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3bd9969-3750-460b-95cd-8c52d2e44d82-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e3bd9969-3750-460b-95cd-8c52d2e44d82\") " pod="openstack/cinder-scheduler-0" Feb 19 15:29:50 crc 
kubenswrapper[4810]: I0219 15:29:50.153620 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3bd9969-3750-460b-95cd-8c52d2e44d82-config-data\") pod \"cinder-scheduler-0\" (UID: \"e3bd9969-3750-460b-95cd-8c52d2e44d82\") " pod="openstack/cinder-scheduler-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.157484 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3bd9969-3750-460b-95cd-8c52d2e44d82-config-data\") pod \"cinder-scheduler-0\" (UID: \"e3bd9969-3750-460b-95cd-8c52d2e44d82\") " pod="openstack/cinder-scheduler-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.160025 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3bd9969-3750-460b-95cd-8c52d2e44d82-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e3bd9969-3750-460b-95cd-8c52d2e44d82\") " pod="openstack/cinder-scheduler-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.161264 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3bd9969-3750-460b-95cd-8c52d2e44d82-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e3bd9969-3750-460b-95cd-8c52d2e44d82\") " pod="openstack/cinder-scheduler-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.162630 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3bd9969-3750-460b-95cd-8c52d2e44d82-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e3bd9969-3750-460b-95cd-8c52d2e44d82\") " pod="openstack/cinder-scheduler-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.166232 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3bd9969-3750-460b-95cd-8c52d2e44d82-scripts\") pod \"cinder-scheduler-0\" (UID: \"e3bd9969-3750-460b-95cd-8c52d2e44d82\") " pod="openstack/cinder-scheduler-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.191357 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw97l\" (UniqueName: \"kubernetes.io/projected/e3bd9969-3750-460b-95cd-8c52d2e44d82-kube-api-access-xw97l\") pod \"cinder-scheduler-0\" (UID: \"e3bd9969-3750-460b-95cd-8c52d2e44d82\") " pod="openstack/cinder-scheduler-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.211562 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.213143 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.215842 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.255027 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-ovsdbserver-nb\") pod \"dnsmasq-dns-6c58b86477-9tbw7\" (UID: \"4a48946e-058c-4395-bbad-5effb50b2228\") " pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.255071 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-ovsdbserver-sb\") pod \"dnsmasq-dns-6c58b86477-9tbw7\" (UID: \"4a48946e-058c-4395-bbad-5effb50b2228\") " pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.255140 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-dns-swift-storage-0\") pod \"dnsmasq-dns-6c58b86477-9tbw7\" (UID: \"4a48946e-058c-4395-bbad-5effb50b2228\") " pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.255173 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-dns-svc\") pod \"dnsmasq-dns-6c58b86477-9tbw7\" (UID: \"4a48946e-058c-4395-bbad-5effb50b2228\") " pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.255211 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-config\") pod \"dnsmasq-dns-6c58b86477-9tbw7\" (UID: \"4a48946e-058c-4395-bbad-5effb50b2228\") " pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.255229 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2hfc\" (UniqueName: \"kubernetes.io/projected/4a48946e-058c-4395-bbad-5effb50b2228-kube-api-access-v2hfc\") pod \"dnsmasq-dns-6c58b86477-9tbw7\" (UID: \"4a48946e-058c-4395-bbad-5effb50b2228\") " pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.260462 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 15:29:50 crc kubenswrapper[4810]: E0219 15:29:50.358315 4810 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/99d74501f5843fd610be3a49ec1837e9345737eecaaac0265ae9e4dfbf8c5c14/diff" to get inode usage: stat /var/lib/containers/storage/overlay/99d74501f5843fd610be3a49ec1837e9345737eecaaac0265ae9e4dfbf8c5c14/diff: no such file or directory, extraDiskErr: Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.359508 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f5db557a-0b89-4a02-b1b2-19bc205acee8-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " pod="openstack/cinder-api-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.359549 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5db557a-0b89-4a02-b1b2-19bc205acee8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " pod="openstack/cinder-api-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.359590 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-dns-swift-storage-0\") pod \"dnsmasq-dns-6c58b86477-9tbw7\" (UID: \"4a48946e-058c-4395-bbad-5effb50b2228\") " pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.359616 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-dns-svc\") pod \"dnsmasq-dns-6c58b86477-9tbw7\" (UID: \"4a48946e-058c-4395-bbad-5effb50b2228\") " pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.359643 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7vrh\" (UniqueName: \"kubernetes.io/projected/f5db557a-0b89-4a02-b1b2-19bc205acee8-kube-api-access-s7vrh\") pod \"cinder-api-0\" (UID: \"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " pod="openstack/cinder-api-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.359670 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-config\") pod \"dnsmasq-dns-6c58b86477-9tbw7\" (UID: \"4a48946e-058c-4395-bbad-5effb50b2228\") " pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.359686 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2hfc\" (UniqueName: \"kubernetes.io/projected/4a48946e-058c-4395-bbad-5effb50b2228-kube-api-access-v2hfc\") pod \"dnsmasq-dns-6c58b86477-9tbw7\" (UID: \"4a48946e-058c-4395-bbad-5effb50b2228\") " pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.359709 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5db557a-0b89-4a02-b1b2-19bc205acee8-logs\") pod \"cinder-api-0\" (UID: \"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " pod="openstack/cinder-api-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.359753 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5db557a-0b89-4a02-b1b2-19bc205acee8-config-data\") pod \"cinder-api-0\" (UID: \"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " pod="openstack/cinder-api-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.359771 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5db557a-0b89-4a02-b1b2-19bc205acee8-config-data-custom\") pod \"cinder-api-0\" (UID: \"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " pod="openstack/cinder-api-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 
15:29:50.359792 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-ovsdbserver-nb\") pod \"dnsmasq-dns-6c58b86477-9tbw7\" (UID: \"4a48946e-058c-4395-bbad-5effb50b2228\") " pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.359811 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5db557a-0b89-4a02-b1b2-19bc205acee8-scripts\") pod \"cinder-api-0\" (UID: \"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " pod="openstack/cinder-api-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.359827 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-ovsdbserver-sb\") pod \"dnsmasq-dns-6c58b86477-9tbw7\" (UID: \"4a48946e-058c-4395-bbad-5effb50b2228\") " pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.360731 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-dns-swift-storage-0\") pod \"dnsmasq-dns-6c58b86477-9tbw7\" (UID: \"4a48946e-058c-4395-bbad-5effb50b2228\") " pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.361435 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-dns-svc\") pod \"dnsmasq-dns-6c58b86477-9tbw7\" (UID: \"4a48946e-058c-4395-bbad-5effb50b2228\") " pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.361893 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-ovsdbserver-sb\") pod \"dnsmasq-dns-6c58b86477-9tbw7\" (UID: \"4a48946e-058c-4395-bbad-5effb50b2228\") " pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.362068 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-config\") pod \"dnsmasq-dns-6c58b86477-9tbw7\" (UID: \"4a48946e-058c-4395-bbad-5effb50b2228\") " pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.362606 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-ovsdbserver-nb\") pod \"dnsmasq-dns-6c58b86477-9tbw7\" (UID: \"4a48946e-058c-4395-bbad-5effb50b2228\") " pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.380130 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2hfc\" (UniqueName: \"kubernetes.io/projected/4a48946e-058c-4395-bbad-5effb50b2228-kube-api-access-v2hfc\") pod \"dnsmasq-dns-6c58b86477-9tbw7\" (UID: \"4a48946e-058c-4395-bbad-5effb50b2228\") " pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.402090 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.461742 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5db557a-0b89-4a02-b1b2-19bc205acee8-scripts\") pod \"cinder-api-0\" (UID: \"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " pod="openstack/cinder-api-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.461821 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f5db557a-0b89-4a02-b1b2-19bc205acee8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " pod="openstack/cinder-api-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.461848 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5db557a-0b89-4a02-b1b2-19bc205acee8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " pod="openstack/cinder-api-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.461934 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7vrh\" (UniqueName: \"kubernetes.io/projected/f5db557a-0b89-4a02-b1b2-19bc205acee8-kube-api-access-s7vrh\") pod \"cinder-api-0\" (UID: \"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " pod="openstack/cinder-api-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.461979 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5db557a-0b89-4a02-b1b2-19bc205acee8-logs\") pod \"cinder-api-0\" (UID: \"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " pod="openstack/cinder-api-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.462016 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f5db557a-0b89-4a02-b1b2-19bc205acee8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " pod="openstack/cinder-api-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.462027 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5db557a-0b89-4a02-b1b2-19bc205acee8-config-data\") pod \"cinder-api-0\" (UID: \"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " pod="openstack/cinder-api-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.462118 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5db557a-0b89-4a02-b1b2-19bc205acee8-config-data-custom\") pod \"cinder-api-0\" (UID: \"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " pod="openstack/cinder-api-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.462611 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5db557a-0b89-4a02-b1b2-19bc205acee8-logs\") pod \"cinder-api-0\" (UID: \"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " pod="openstack/cinder-api-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.465296 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.485069 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5db557a-0b89-4a02-b1b2-19bc205acee8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " pod="openstack/cinder-api-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.486213 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5db557a-0b89-4a02-b1b2-19bc205acee8-config-data\") pod \"cinder-api-0\" (UID: \"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " pod="openstack/cinder-api-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.487525 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5db557a-0b89-4a02-b1b2-19bc205acee8-scripts\") pod \"cinder-api-0\" (UID: \"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " pod="openstack/cinder-api-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.487792 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5db557a-0b89-4a02-b1b2-19bc205acee8-config-data-custom\") pod \"cinder-api-0\" (UID: \"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " pod="openstack/cinder-api-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.497588 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7vrh\" (UniqueName: \"kubernetes.io/projected/f5db557a-0b89-4a02-b1b2-19bc205acee8-kube-api-access-s7vrh\") pod \"cinder-api-0\" (UID: \"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " pod="openstack/cinder-api-0" Feb 19 15:29:50 crc kubenswrapper[4810]: I0219 15:29:50.540313 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 15:29:50 crc kubenswrapper[4810]: E0219 15:29:50.573521 4810 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Feb 19 15:29:50 crc kubenswrapper[4810]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/b1f44651-e4eb-4cce-a493-9dd9b491b22a/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 19 15:29:50 crc kubenswrapper[4810]: > podSandboxID="d03f7f66a9a3b08a44840b959b3f1bdcaeb2f46d803e33b39c92670ff95677e9" Feb 19 15:29:50 crc kubenswrapper[4810]: E0219 15:29:50.573787 4810 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 19 15:29:50 crc kubenswrapper[4810]: container &Container{Name:dnsmasq-dns,Image:38.102.83.159:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n4h578h588h57ch687h559h5f4h5ffh659h594h667h666h588h679hdh56h55hfhbh5f7h67fh5c4h645hd9h684h689h555h549h646h58bh546h664q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m85jt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5b44bbb4dc-dktkl_openstack(b1f44651-e4eb-4cce-a493-9dd9b491b22a): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/b1f44651-e4eb-4cce-a493-9dd9b491b22a/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 19 15:29:50 crc kubenswrapper[4810]: > logger="UnhandledError" Feb 19 15:29:50 crc kubenswrapper[4810]: E0219 15:29:50.575178 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/b1f44651-e4eb-4cce-a493-9dd9b491b22a/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5b44bbb4dc-dktkl" podUID="b1f44651-e4eb-4cce-a493-9dd9b491b22a" Feb 19 15:29:51 crc kubenswrapper[4810]: I0219 15:29:51.031728 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 15:29:51 crc kubenswrapper[4810]: I0219 15:29:51.215251 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c58b86477-9tbw7"] Feb 19 15:29:51 crc kubenswrapper[4810]: I0219 15:29:51.318668 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-869f57798-ngdtl" Feb 19 15:29:51 crc kubenswrapper[4810]: I0219 15:29:51.435350 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 15:29:51 crc kubenswrapper[4810]: I0219 15:29:51.747029 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5f56498b8d-9gwmf" Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.202126 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b44bbb4dc-dktkl" Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.233998 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-dns-svc\") pod \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\" (UID: \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\") " Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.234528 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-ovsdbserver-nb\") pod \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\" (UID: \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\") " Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.234700 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-config\") pod \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\" (UID: \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\") " Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.234875 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m85jt\" (UniqueName: \"kubernetes.io/projected/b1f44651-e4eb-4cce-a493-9dd9b491b22a-kube-api-access-m85jt\") pod \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\" (UID: \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\") " Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.234936 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-dns-swift-storage-0\") pod \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\" (UID: \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\") " Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.235057 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-ovsdbserver-sb\") pod \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\" (UID: \"b1f44651-e4eb-4cce-a493-9dd9b491b22a\") " Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.254434 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1f44651-e4eb-4cce-a493-9dd9b491b22a-kube-api-access-m85jt" (OuterVolumeSpecName: "kube-api-access-m85jt") pod "b1f44651-e4eb-4cce-a493-9dd9b491b22a" (UID: "b1f44651-e4eb-4cce-a493-9dd9b491b22a"). InnerVolumeSpecName "kube-api-access-m85jt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.326693 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b1f44651-e4eb-4cce-a493-9dd9b491b22a" (UID: "b1f44651-e4eb-4cce-a493-9dd9b491b22a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.328857 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b1f44651-e4eb-4cce-a493-9dd9b491b22a" (UID: "b1f44651-e4eb-4cce-a493-9dd9b491b22a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.337556 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m85jt\" (UniqueName: \"kubernetes.io/projected/b1f44651-e4eb-4cce-a493-9dd9b491b22a-kube-api-access-m85jt\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.337591 4810 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.337602 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.338945 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b1f44651-e4eb-4cce-a493-9dd9b491b22a" (UID: "b1f44651-e4eb-4cce-a493-9dd9b491b22a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.348870 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-config" (OuterVolumeSpecName: "config") pod "b1f44651-e4eb-4cce-a493-9dd9b491b22a" (UID: "b1f44651-e4eb-4cce-a493-9dd9b491b22a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.367634 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b1f44651-e4eb-4cce-a493-9dd9b491b22a" (UID: "b1f44651-e4eb-4cce-a493-9dd9b491b22a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.439593 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.439684 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.439706 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f44651-e4eb-4cce-a493-9dd9b491b22a-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.819310 4810 generic.go:334] "Generic (PLEG): container finished" podID="08eca88c-a4b4-461b-8568-ebbf54645272" containerID="11bb73290ec186744ef4e88375d87c281860032daa137cc8bd4779ab70117f2b" exitCode=137 Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.819384 4810 generic.go:334] "Generic (PLEG): container finished" podID="08eca88c-a4b4-461b-8568-ebbf54645272" containerID="cc512cef3a5d05465ad16d8e6be38874fe910640b0911d914b7146b795d387d4" exitCode=137 Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.819458 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-668f7d7fb5-l5kpq" event={"ID":"08eca88c-a4b4-461b-8568-ebbf54645272","Type":"ContainerDied","Data":"11bb73290ec186744ef4e88375d87c281860032daa137cc8bd4779ab70117f2b"} Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.819501 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-668f7d7fb5-l5kpq" event={"ID":"08eca88c-a4b4-461b-8568-ebbf54645272","Type":"ContainerDied","Data":"cc512cef3a5d05465ad16d8e6be38874fe910640b0911d914b7146b795d387d4"} Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.820822 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" event={"ID":"4a48946e-058c-4395-bbad-5effb50b2228","Type":"ContainerStarted","Data":"bec8edc98672f19301835694ed5c49318c10a4c5634dfa9bc2728f6b7541a7a3"} Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.821789 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e3bd9969-3750-460b-95cd-8c52d2e44d82","Type":"ContainerStarted","Data":"3b1b3f3010d8c20d2061253491ade333f69730534532a92038800b9cbbc0aede"} Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.823709 4810 generic.go:334] "Generic (PLEG): container finished" podID="c39b1dd9-9e73-4cca-aea6-e228f1ba5942" containerID="267e2a2aa6b67d8303b6404df33bfa4941b4f403604fb949cf5ec932e82ab1b7" exitCode=137 Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.823748 4810 generic.go:334] "Generic (PLEG): container finished" podID="c39b1dd9-9e73-4cca-aea6-e228f1ba5942" containerID="09ef2e9038e53c0d3694b2f3cb1b25543038065797746360e5b00060a157df5b" exitCode=137 Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.823807 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-849c785789-5xrh2" event={"ID":"c39b1dd9-9e73-4cca-aea6-e228f1ba5942","Type":"ContainerDied","Data":"267e2a2aa6b67d8303b6404df33bfa4941b4f403604fb949cf5ec932e82ab1b7"} Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.823856 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-849c785789-5xrh2" event={"ID":"c39b1dd9-9e73-4cca-aea6-e228f1ba5942","Type":"ContainerDied","Data":"09ef2e9038e53c0d3694b2f3cb1b25543038065797746360e5b00060a157df5b"} Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.825695 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b44bbb4dc-dktkl" event={"ID":"b1f44651-e4eb-4cce-a493-9dd9b491b22a","Type":"ContainerDied","Data":"d03f7f66a9a3b08a44840b959b3f1bdcaeb2f46d803e33b39c92670ff95677e9"} Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.825738 4810 scope.go:117] "RemoveContainer" containerID="b192a1bf8058da9cccb4f9c8a3c59226e8d40284e5443eca3683d4b0120ebdd4" Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.825766 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b44bbb4dc-dktkl" Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.826897 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f5db557a-0b89-4a02-b1b2-19bc205acee8","Type":"ContainerStarted","Data":"cb5c2099652d015df95731d45206e3f87ff0597889d9ca17b5c94ba96cec083c"} Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.914096 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b44bbb4dc-dktkl"] Feb 19 15:29:52 crc kubenswrapper[4810]: I0219 15:29:52.924656 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b44bbb4dc-dktkl"] Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.493145 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1f44651-e4eb-4cce-a493-9dd9b491b22a" path="/var/lib/kubelet/pods/b1f44651-e4eb-4cce-a493-9dd9b491b22a/volumes" Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.565494 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-668f7d7fb5-l5kpq" Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.658875 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-849c785789-5xrh2" Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.675891 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08eca88c-a4b4-461b-8568-ebbf54645272-logs\") pod \"08eca88c-a4b4-461b-8568-ebbf54645272\" (UID: \"08eca88c-a4b4-461b-8568-ebbf54645272\") " Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.675972 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/08eca88c-a4b4-461b-8568-ebbf54645272-horizon-secret-key\") pod \"08eca88c-a4b4-461b-8568-ebbf54645272\" (UID: \"08eca88c-a4b4-461b-8568-ebbf54645272\") " Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.676049 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb5qb\" (UniqueName: \"kubernetes.io/projected/08eca88c-a4b4-461b-8568-ebbf54645272-kube-api-access-wb5qb\") pod \"08eca88c-a4b4-461b-8568-ebbf54645272\" (UID: \"08eca88c-a4b4-461b-8568-ebbf54645272\") " Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.676178 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08eca88c-a4b4-461b-8568-ebbf54645272-scripts\") pod \"08eca88c-a4b4-461b-8568-ebbf54645272\" (UID: \"08eca88c-a4b4-461b-8568-ebbf54645272\") " Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.676248 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/08eca88c-a4b4-461b-8568-ebbf54645272-config-data\") pod \"08eca88c-a4b4-461b-8568-ebbf54645272\" (UID: \"08eca88c-a4b4-461b-8568-ebbf54645272\") " Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.683959 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08eca88c-a4b4-461b-8568-ebbf54645272-logs" (OuterVolumeSpecName: "logs") pod "08eca88c-a4b4-461b-8568-ebbf54645272" (UID: "08eca88c-a4b4-461b-8568-ebbf54645272"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.703922 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08eca88c-a4b4-461b-8568-ebbf54645272-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "08eca88c-a4b4-461b-8568-ebbf54645272" (UID: "08eca88c-a4b4-461b-8568-ebbf54645272"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.706677 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08eca88c-a4b4-461b-8568-ebbf54645272-kube-api-access-wb5qb" (OuterVolumeSpecName: "kube-api-access-wb5qb") pod "08eca88c-a4b4-461b-8568-ebbf54645272" (UID: "08eca88c-a4b4-461b-8568-ebbf54645272"). InnerVolumeSpecName "kube-api-access-wb5qb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.730227 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-869f57798-ngdtl" Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.777807 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-logs\") pod \"c39b1dd9-9e73-4cca-aea6-e228f1ba5942\" (UID: \"c39b1dd9-9e73-4cca-aea6-e228f1ba5942\") " Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.778262 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-horizon-secret-key\") pod \"c39b1dd9-9e73-4cca-aea6-e228f1ba5942\" (UID: \"c39b1dd9-9e73-4cca-aea6-e228f1ba5942\") " Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.778452 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-config-data\") pod \"c39b1dd9-9e73-4cca-aea6-e228f1ba5942\" (UID: \"c39b1dd9-9e73-4cca-aea6-e228f1ba5942\") " Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.778496 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcbv5\" (UniqueName: \"kubernetes.io/projected/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-kube-api-access-pcbv5\") pod \"c39b1dd9-9e73-4cca-aea6-e228f1ba5942\" (UID: \"c39b1dd9-9e73-4cca-aea6-e228f1ba5942\") " Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.778557 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-scripts\") pod \"c39b1dd9-9e73-4cca-aea6-e228f1ba5942\" (UID: \"c39b1dd9-9e73-4cca-aea6-e228f1ba5942\") " Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.779048 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08eca88c-a4b4-461b-8568-ebbf54645272-logs\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.779061 4810 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/08eca88c-a4b4-461b-8568-ebbf54645272-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.779071 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb5qb\" (UniqueName: \"kubernetes.io/projected/08eca88c-a4b4-461b-8568-ebbf54645272-kube-api-access-wb5qb\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.780264 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-logs" (OuterVolumeSpecName: "logs") pod "c39b1dd9-9e73-4cca-aea6-e228f1ba5942" (UID: "c39b1dd9-9e73-4cca-aea6-e228f1ba5942"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.806821 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-kube-api-access-pcbv5" (OuterVolumeSpecName: "kube-api-access-pcbv5") pod "c39b1dd9-9e73-4cca-aea6-e228f1ba5942" (UID: "c39b1dd9-9e73-4cca-aea6-e228f1ba5942"). InnerVolumeSpecName "kube-api-access-pcbv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.813495 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c39b1dd9-9e73-4cca-aea6-e228f1ba5942" (UID: "c39b1dd9-9e73-4cca-aea6-e228f1ba5942"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.820844 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08eca88c-a4b4-461b-8568-ebbf54645272-scripts" (OuterVolumeSpecName: "scripts") pod "08eca88c-a4b4-461b-8568-ebbf54645272" (UID: "08eca88c-a4b4-461b-8568-ebbf54645272"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.840432 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-scripts" (OuterVolumeSpecName: "scripts") pod "c39b1dd9-9e73-4cca-aea6-e228f1ba5942" (UID: "c39b1dd9-9e73-4cca-aea6-e228f1ba5942"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.846722 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5f56498b8d-9gwmf" Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.858533 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08eca88c-a4b4-461b-8568-ebbf54645272-config-data" (OuterVolumeSpecName: "config-data") pod "08eca88c-a4b4-461b-8568-ebbf54645272" (UID: "08eca88c-a4b4-461b-8568-ebbf54645272"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.882608 4810 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.882656 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08eca88c-a4b4-461b-8568-ebbf54645272-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.882666 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/08eca88c-a4b4-461b-8568-ebbf54645272-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.882676 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcbv5\" (UniqueName: \"kubernetes.io/projected/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-kube-api-access-pcbv5\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.882687 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.882694 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-logs\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.897395 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.927950 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-849c785789-5xrh2" Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.929225 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-849c785789-5xrh2" event={"ID":"c39b1dd9-9e73-4cca-aea6-e228f1ba5942","Type":"ContainerDied","Data":"f7e61fd52ad6569907f8a84cbc32aef547486ffda35028466938ada8d5e3aa10"} Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.929316 4810 scope.go:117] "RemoveContainer" containerID="267e2a2aa6b67d8303b6404df33bfa4941b4f403604fb949cf5ec932e82ab1b7" Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.936728 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-869f57798-ngdtl"] Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.940926 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-config-data" (OuterVolumeSpecName: "config-data") pod "c39b1dd9-9e73-4cca-aea6-e228f1ba5942" (UID: "c39b1dd9-9e73-4cca-aea6-e228f1ba5942"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.984431 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c39b1dd9-9e73-4cca-aea6-e228f1ba5942-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.993083 4810 generic.go:334] "Generic (PLEG): container finished" podID="3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" containerID="a8232fee767eaee3fa026223e20673e25fa6d954ddb913e148bf6e0e435de416" exitCode=1 Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.993180 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5","Type":"ContainerDied","Data":"a8232fee767eaee3fa026223e20673e25fa6d954ddb913e148bf6e0e435de416"} Feb 19 15:29:53 crc kubenswrapper[4810]: I0219 15:29:53.994431 4810 scope.go:117] "RemoveContainer" containerID="a8232fee767eaee3fa026223e20673e25fa6d954ddb913e148bf6e0e435de416" Feb 19 15:29:53 crc kubenswrapper[4810]: E0219 15:29:53.994910 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(3eb2dccd-c5dc-436f-b7a6-954af7bc51c5)\"" pod="openstack/watcher-decision-engine-0" podUID="3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.033391 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-668f7d7fb5-l5kpq" event={"ID":"08eca88c-a4b4-461b-8568-ebbf54645272","Type":"ContainerDied","Data":"b936d9f6773d6dc856145fa11df03aba298425ab6b8a943cdd257e7a943da84e"} Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.033589 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-668f7d7fb5-l5kpq" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.045825 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" event={"ID":"4a48946e-058c-4395-bbad-5effb50b2228","Type":"ContainerStarted","Data":"a0f55d5dfd4c1951d245770b89cf22415d60ecb97cf8c05e857fc4583af61f68"} Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.080253 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-869f57798-ngdtl" podUID="58c845f2-0069-4ee5-9d4b-b5871e078926" containerName="horizon-log" containerID="cri-o://b7473c6c07a1c77d67ffe62af3e5c262ab61dca816caf8aab0acb14dc5b23ebd" gracePeriod=30 Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.080435 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75f99f68b4-d7hj4" event={"ID":"c008ffcd-bb96-47dd-a311-fdc58f6d8918","Type":"ContainerStarted","Data":"86e12660d14ec5ce962dfc203a055048d75b58212e920b0f18356f78c799be3a"} Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.080635 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-869f57798-ngdtl" podUID="58c845f2-0069-4ee5-9d4b-b5871e078926" containerName="horizon" containerID="cri-o://5ab1bae28a55f588686fefd9b6e6ee98c22d6657796a662570ae5cd62319bd13" gracePeriod=30 Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.171389 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-668f7d7fb5-l5kpq"] Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.250096 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-668f7d7fb5-l5kpq"] Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.280515 4810 scope.go:117] "RemoveContainer" containerID="09ef2e9038e53c0d3694b2f3cb1b25543038065797746360e5b00060a157df5b" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.384014 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-849c785789-5xrh2"] Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.398281 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-849c785789-5xrh2"] Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.406038 4810 scope.go:117] "RemoveContainer" containerID="ea64343d53131969232078700626ee5996ba47b778be09a59da7ab42c9aa2631" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.530817 4810 scope.go:117] "RemoveContainer" containerID="11bb73290ec186744ef4e88375d87c281860032daa137cc8bd4779ab70117f2b" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.582865 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6b886df68b-htd57"] Feb 19 15:29:54 crc kubenswrapper[4810]: E0219 15:29:54.583318 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c39b1dd9-9e73-4cca-aea6-e228f1ba5942" containerName="horizon" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.583346 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c39b1dd9-9e73-4cca-aea6-e228f1ba5942" containerName="horizon" Feb 19 15:29:54 crc kubenswrapper[4810]: E0219 15:29:54.583365 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c39b1dd9-9e73-4cca-aea6-e228f1ba5942" containerName="horizon-log" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.583372 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c39b1dd9-9e73-4cca-aea6-e228f1ba5942" containerName="horizon-log" Feb 19 15:29:54 crc 
kubenswrapper[4810]: E0219 15:29:54.583383 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08eca88c-a4b4-461b-8568-ebbf54645272" containerName="horizon-log" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.583389 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="08eca88c-a4b4-461b-8568-ebbf54645272" containerName="horizon-log" Feb 19 15:29:54 crc kubenswrapper[4810]: E0219 15:29:54.583404 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08eca88c-a4b4-461b-8568-ebbf54645272" containerName="horizon" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.583411 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="08eca88c-a4b4-461b-8568-ebbf54645272" containerName="horizon" Feb 19 15:29:54 crc kubenswrapper[4810]: E0219 15:29:54.583419 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1f44651-e4eb-4cce-a493-9dd9b491b22a" containerName="init" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.583425 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1f44651-e4eb-4cce-a493-9dd9b491b22a" containerName="init" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.583585 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="08eca88c-a4b4-461b-8568-ebbf54645272" containerName="horizon-log" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.583595 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c39b1dd9-9e73-4cca-aea6-e228f1ba5942" containerName="horizon-log" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.583618 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="08eca88c-a4b4-461b-8568-ebbf54645272" containerName="horizon" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.583627 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c39b1dd9-9e73-4cca-aea6-e228f1ba5942" containerName="horizon" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.583639 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1f44651-e4eb-4cce-a493-9dd9b491b22a" containerName="init" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.584623 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6b886df68b-htd57" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.590067 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.590084 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.614421 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d391303-b5ee-4f63-8035-12f123f35e65-config-data-custom\") pod \"barbican-api-6b886df68b-htd57\" (UID: \"8d391303-b5ee-4f63-8035-12f123f35e65\") " pod="openstack/barbican-api-6b886df68b-htd57" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.614482 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d391303-b5ee-4f63-8035-12f123f35e65-logs\") pod \"barbican-api-6b886df68b-htd57\" (UID: \"8d391303-b5ee-4f63-8035-12f123f35e65\") " pod="openstack/barbican-api-6b886df68b-htd57" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.614534 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt94n\" (UniqueName: \"kubernetes.io/projected/8d391303-b5ee-4f63-8035-12f123f35e65-kube-api-access-jt94n\") pod \"barbican-api-6b886df68b-htd57\" (UID: \"8d391303-b5ee-4f63-8035-12f123f35e65\") " pod="openstack/barbican-api-6b886df68b-htd57" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.614566 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d391303-b5ee-4f63-8035-12f123f35e65-internal-tls-certs\") pod \"barbican-api-6b886df68b-htd57\" (UID: \"8d391303-b5ee-4f63-8035-12f123f35e65\") " pod="openstack/barbican-api-6b886df68b-htd57" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.615216 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d391303-b5ee-4f63-8035-12f123f35e65-combined-ca-bundle\") pod \"barbican-api-6b886df68b-htd57\" (UID: \"8d391303-b5ee-4f63-8035-12f123f35e65\") " pod="openstack/barbican-api-6b886df68b-htd57" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.615271 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d391303-b5ee-4f63-8035-12f123f35e65-public-tls-certs\") pod \"barbican-api-6b886df68b-htd57\" (UID: \"8d391303-b5ee-4f63-8035-12f123f35e65\") " pod="openstack/barbican-api-6b886df68b-htd57" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.615391 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d391303-b5ee-4f63-8035-12f123f35e65-config-data\") pod \"barbican-api-6b886df68b-htd57\" (UID: \"8d391303-b5ee-4f63-8035-12f123f35e65\") " pod="openstack/barbican-api-6b886df68b-htd57" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.625011 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b886df68b-htd57"] Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.716870 4810 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d391303-b5ee-4f63-8035-12f123f35e65-logs\") pod \"barbican-api-6b886df68b-htd57\" (UID: \"8d391303-b5ee-4f63-8035-12f123f35e65\") " pod="openstack/barbican-api-6b886df68b-htd57" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.716996 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt94n\" (UniqueName: \"kubernetes.io/projected/8d391303-b5ee-4f63-8035-12f123f35e65-kube-api-access-jt94n\") pod \"barbican-api-6b886df68b-htd57\" (UID: \"8d391303-b5ee-4f63-8035-12f123f35e65\") " pod="openstack/barbican-api-6b886df68b-htd57" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.717059 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d391303-b5ee-4f63-8035-12f123f35e65-internal-tls-certs\") pod \"barbican-api-6b886df68b-htd57\" (UID: \"8d391303-b5ee-4f63-8035-12f123f35e65\") " pod="openstack/barbican-api-6b886df68b-htd57" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.717143 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d391303-b5ee-4f63-8035-12f123f35e65-combined-ca-bundle\") pod \"barbican-api-6b886df68b-htd57\" (UID: \"8d391303-b5ee-4f63-8035-12f123f35e65\") " pod="openstack/barbican-api-6b886df68b-htd57" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.717164 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d391303-b5ee-4f63-8035-12f123f35e65-public-tls-certs\") pod \"barbican-api-6b886df68b-htd57\" (UID: \"8d391303-b5ee-4f63-8035-12f123f35e65\") " pod="openstack/barbican-api-6b886df68b-htd57" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.717258 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d391303-b5ee-4f63-8035-12f123f35e65-config-data\") pod \"barbican-api-6b886df68b-htd57\" (UID: \"8d391303-b5ee-4f63-8035-12f123f35e65\") " pod="openstack/barbican-api-6b886df68b-htd57" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.717308 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d391303-b5ee-4f63-8035-12f123f35e65-config-data-custom\") pod \"barbican-api-6b886df68b-htd57\" (UID: \"8d391303-b5ee-4f63-8035-12f123f35e65\") " pod="openstack/barbican-api-6b886df68b-htd57" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.717517 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d391303-b5ee-4f63-8035-12f123f35e65-logs\") pod \"barbican-api-6b886df68b-htd57\" (UID: \"8d391303-b5ee-4f63-8035-12f123f35e65\") " pod="openstack/barbican-api-6b886df68b-htd57" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.726599 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d391303-b5ee-4f63-8035-12f123f35e65-config-data-custom\") pod \"barbican-api-6b886df68b-htd57\" (UID: \"8d391303-b5ee-4f63-8035-12f123f35e65\") " pod="openstack/barbican-api-6b886df68b-htd57" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.728692 4810 scope.go:117] "RemoveContainer" 
containerID="cc512cef3a5d05465ad16d8e6be38874fe910640b0911d914b7146b795d387d4" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.729565 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d391303-b5ee-4f63-8035-12f123f35e65-config-data\") pod \"barbican-api-6b886df68b-htd57\" (UID: \"8d391303-b5ee-4f63-8035-12f123f35e65\") " pod="openstack/barbican-api-6b886df68b-htd57" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.734308 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d391303-b5ee-4f63-8035-12f123f35e65-public-tls-certs\") pod \"barbican-api-6b886df68b-htd57\" (UID: \"8d391303-b5ee-4f63-8035-12f123f35e65\") " pod="openstack/barbican-api-6b886df68b-htd57" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.737566 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d391303-b5ee-4f63-8035-12f123f35e65-combined-ca-bundle\") pod \"barbican-api-6b886df68b-htd57\" (UID: \"8d391303-b5ee-4f63-8035-12f123f35e65\") " pod="openstack/barbican-api-6b886df68b-htd57" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.738929 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d391303-b5ee-4f63-8035-12f123f35e65-internal-tls-certs\") pod \"barbican-api-6b886df68b-htd57\" (UID: \"8d391303-b5ee-4f63-8035-12f123f35e65\") " pod="openstack/barbican-api-6b886df68b-htd57" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.747284 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt94n\" (UniqueName: \"kubernetes.io/projected/8d391303-b5ee-4f63-8035-12f123f35e65-kube-api-access-jt94n\") pod \"barbican-api-6b886df68b-htd57\" (UID: \"8d391303-b5ee-4f63-8035-12f123f35e65\") " pod="openstack/barbican-api-6b886df68b-htd57" Feb 19 15:29:54 crc kubenswrapper[4810]: I0219 15:29:54.913828 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6b886df68b-htd57" Feb 19 15:29:55 crc kubenswrapper[4810]: I0219 15:29:55.124137 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f5db557a-0b89-4a02-b1b2-19bc205acee8","Type":"ContainerStarted","Data":"41f32629cee0c0ae9efe4c3d3ac560c20dce8a8e902c7f6325c5e1b2d1d8b995"} Feb 19 15:29:55 crc kubenswrapper[4810]: I0219 15:29:55.129156 4810 generic.go:334] "Generic (PLEG): container finished" podID="4a48946e-058c-4395-bbad-5effb50b2228" containerID="a0f55d5dfd4c1951d245770b89cf22415d60ecb97cf8c05e857fc4583af61f68" exitCode=0 Feb 19 15:29:55 crc kubenswrapper[4810]: I0219 15:29:55.129229 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" event={"ID":"4a48946e-058c-4395-bbad-5effb50b2228","Type":"ContainerDied","Data":"a0f55d5dfd4c1951d245770b89cf22415d60ecb97cf8c05e857fc4583af61f68"} Feb 19 15:29:55 crc kubenswrapper[4810]: I0219 15:29:55.129701 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" event={"ID":"4a48946e-058c-4395-bbad-5effb50b2228","Type":"ContainerStarted","Data":"c249e977cefa0135aa004a3d9624e2b5787cc21f20239233e79602a670cf0acb"} Feb 19 15:29:55 crc kubenswrapper[4810]: I0219 15:29:55.131794 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" Feb 19 15:29:55 crc kubenswrapper[4810]: I0219 15:29:55.168132 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-58f8775989-n9rgr" event={"ID":"f277c31b-ff97-4f3b-aec3-c5cfe9293d60","Type":"ContainerStarted","Data":"9735b3da2b6a95af5e1e5f4aa3e9e29ae0064236f04c1a8806891de53deba27d"} Feb 19 15:29:55 crc kubenswrapper[4810]: I0219 15:29:55.168192 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-58f8775989-n9rgr" event={"ID":"f277c31b-ff97-4f3b-aec3-c5cfe9293d60","Type":"ContainerStarted","Data":"8c8893831d66e071f297f357257d2bf8c082ee53a2e2b3bc9428bbd0780f1134"} Feb 19 15:29:55 crc kubenswrapper[4810]: I0219 15:29:55.180290 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" podStartSLOduration=6.180268984 podStartE2EDuration="6.180268984s" podCreationTimestamp="2026-02-19 15:29:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:29:55.156926211 +0000 UTC m=+1224.638956335" watchObservedRunningTime="2026-02-19 15:29:55.180268984 +0000 UTC m=+1224.662299108" Feb 19 15:29:55 crc kubenswrapper[4810]: I0219 15:29:55.180782 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e3bd9969-3750-460b-95cd-8c52d2e44d82","Type":"ContainerStarted","Data":"1088aad748bf16b0aa6aaa6ca8f83095344671dc633f1afff11c7c6c03c2f490"} Feb 19 15:29:55 crc kubenswrapper[4810]: I0219 15:29:55.187406 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75f99f68b4-d7hj4" event={"ID":"c008ffcd-bb96-47dd-a311-fdc58f6d8918","Type":"ContainerStarted","Data":"ac2646d5c8b3c7f1a0ce7400b5ee60d1b6c83b39d160cd582242a516f408c1bb"} Feb 19 15:29:55 crc kubenswrapper[4810]: I0219 15:29:55.193637 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-58f8775989-n9rgr" podStartSLOduration=4.269471553 podStartE2EDuration="8.193622631s" 
podCreationTimestamp="2026-02-19 15:29:47 +0000 UTC" firstStartedPulling="2026-02-19 15:29:49.378695368 +0000 UTC m=+1218.860725492" lastFinishedPulling="2026-02-19 15:29:53.302846446 +0000 UTC m=+1222.784876570" observedRunningTime="2026-02-19 15:29:55.189796077 +0000 UTC m=+1224.671826201" watchObservedRunningTime="2026-02-19 15:29:55.193622631 +0000 UTC m=+1224.675652745" Feb 19 15:29:55 crc kubenswrapper[4810]: I0219 15:29:55.219876 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-75f99f68b4-d7hj4" podStartSLOduration=4.346722758 podStartE2EDuration="8.219841524s" podCreationTimestamp="2026-02-19 15:29:47 +0000 UTC" firstStartedPulling="2026-02-19 15:29:49.378432031 +0000 UTC m=+1218.860462155" lastFinishedPulling="2026-02-19 15:29:53.251550787 +0000 UTC m=+1222.733580921" observedRunningTime="2026-02-19 15:29:55.207589014 +0000 UTC m=+1224.689619138" watchObservedRunningTime="2026-02-19 15:29:55.219841524 +0000 UTC m=+1224.701871648" Feb 19 15:29:55 crc kubenswrapper[4810]: I0219 15:29:55.481976 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08eca88c-a4b4-461b-8568-ebbf54645272" path="/var/lib/kubelet/pods/08eca88c-a4b4-461b-8568-ebbf54645272/volumes" Feb 19 15:29:55 crc kubenswrapper[4810]: I0219 15:29:55.482730 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c39b1dd9-9e73-4cca-aea6-e228f1ba5942" path="/var/lib/kubelet/pods/c39b1dd9-9e73-4cca-aea6-e228f1ba5942/volumes" Feb 19 15:29:55 crc kubenswrapper[4810]: I0219 15:29:55.506972 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b886df68b-htd57"] Feb 19 15:29:56 crc kubenswrapper[4810]: I0219 15:29:56.215045 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e3bd9969-3750-460b-95cd-8c52d2e44d82","Type":"ContainerStarted","Data":"2a43f3753096df053129dc1d2079d46040848bff1f282b81bb80d958a8043c75"} Feb 19 15:29:56 crc kubenswrapper[4810]: I0219 15:29:56.217436 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f5db557a-0b89-4a02-b1b2-19bc205acee8","Type":"ContainerStarted","Data":"32d2af278e824de267fbeedcdd57bbf5777084a34214d092b327d9209c8d753a"} Feb 19 15:29:56 crc kubenswrapper[4810]: I0219 15:29:56.217558 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 19 15:29:56 crc kubenswrapper[4810]: I0219 15:29:56.217574 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f5db557a-0b89-4a02-b1b2-19bc205acee8" containerName="cinder-api-log" containerID="cri-o://41f32629cee0c0ae9efe4c3d3ac560c20dce8a8e902c7f6325c5e1b2d1d8b995" gracePeriod=30 Feb 19 15:29:56 crc kubenswrapper[4810]: I0219 15:29:56.217610 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f5db557a-0b89-4a02-b1b2-19bc205acee8" containerName="cinder-api" containerID="cri-o://32d2af278e824de267fbeedcdd57bbf5777084a34214d092b327d9209c8d753a" gracePeriod=30 Feb 19 15:29:56 crc kubenswrapper[4810]: I0219 15:29:56.226141 4810 generic.go:334] "Generic (PLEG): container finished" podID="58c845f2-0069-4ee5-9d4b-b5871e078926" containerID="5ab1bae28a55f588686fefd9b6e6ee98c22d6657796a662570ae5cd62319bd13" exitCode=0 Feb 19 15:29:56 crc kubenswrapper[4810]: I0219 15:29:56.226230 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-869f57798-ngdtl" 
event={"ID":"58c845f2-0069-4ee5-9d4b-b5871e078926","Type":"ContainerDied","Data":"5ab1bae28a55f588686fefd9b6e6ee98c22d6657796a662570ae5cd62319bd13"} Feb 19 15:29:56 crc kubenswrapper[4810]: I0219 15:29:56.234865 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b886df68b-htd57" event={"ID":"8d391303-b5ee-4f63-8035-12f123f35e65","Type":"ContainerStarted","Data":"05c95f5046182804c63d2c91ebf19854d994a7b67e23dae465b4554b045ae33e"} Feb 19 15:29:56 crc kubenswrapper[4810]: I0219 15:29:56.234907 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b886df68b-htd57" event={"ID":"8d391303-b5ee-4f63-8035-12f123f35e65","Type":"ContainerStarted","Data":"50dda5b91868f3e5a95a6dd804fa072c5a73ba040b22717cef0334d8204978c9"} Feb 19 15:29:56 crc kubenswrapper[4810]: I0219 15:29:56.234921 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b886df68b-htd57" event={"ID":"8d391303-b5ee-4f63-8035-12f123f35e65","Type":"ContainerStarted","Data":"46fac789a8550e65b9a76f431a80e79197eed3298754c9cf7500b7e0a945007e"} Feb 19 15:29:56 crc kubenswrapper[4810]: I0219 15:29:56.263173 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.025992918 podStartE2EDuration="7.263153858s" podCreationTimestamp="2026-02-19 15:29:49 +0000 UTC" firstStartedPulling="2026-02-19 15:29:52.111355676 +0000 UTC m=+1221.593385800" lastFinishedPulling="2026-02-19 15:29:53.348516626 +0000 UTC m=+1222.830546740" observedRunningTime="2026-02-19 15:29:56.256358032 +0000 UTC m=+1225.738388156" watchObservedRunningTime="2026-02-19 15:29:56.263153858 +0000 UTC m=+1225.745183982" Feb 19 15:29:56 crc kubenswrapper[4810]: I0219 15:29:56.284979 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6b886df68b-htd57" podStartSLOduration=2.2849578729999998 podStartE2EDuration="2.284957873s" podCreationTimestamp="2026-02-19 15:29:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:29:56.279901239 +0000 UTC m=+1225.761931363" watchObservedRunningTime="2026-02-19 15:29:56.284957873 +0000 UTC m=+1225.766987997" Feb 19 15:29:56 crc kubenswrapper[4810]: I0219 15:29:56.310460 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.310441609 podStartE2EDuration="6.310441609s" podCreationTimestamp="2026-02-19 15:29:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:29:56.30073366 +0000 UTC m=+1225.782763784" watchObservedRunningTime="2026-02-19 15:29:56.310441609 +0000 UTC m=+1225.792471733" Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.281696 4810 generic.go:334] "Generic (PLEG): container finished" podID="f5db557a-0b89-4a02-b1b2-19bc205acee8" containerID="32d2af278e824de267fbeedcdd57bbf5777084a34214d092b327d9209c8d753a" exitCode=0 Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.282275 4810 generic.go:334] "Generic (PLEG): container finished" podID="f5db557a-0b89-4a02-b1b2-19bc205acee8" containerID="41f32629cee0c0ae9efe4c3d3ac560c20dce8a8e902c7f6325c5e1b2d1d8b995" exitCode=143 Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.283685 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"f5db557a-0b89-4a02-b1b2-19bc205acee8","Type":"ContainerDied","Data":"32d2af278e824de267fbeedcdd57bbf5777084a34214d092b327d9209c8d753a"} Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.283721 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f5db557a-0b89-4a02-b1b2-19bc205acee8","Type":"ContainerDied","Data":"41f32629cee0c0ae9efe4c3d3ac560c20dce8a8e902c7f6325c5e1b2d1d8b995"} Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.285264 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b886df68b-htd57" Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.285307 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b886df68b-htd57" Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.629677 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.685362 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5db557a-0b89-4a02-b1b2-19bc205acee8-combined-ca-bundle\") pod \"f5db557a-0b89-4a02-b1b2-19bc205acee8\" (UID: \"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.685851 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f5db557a-0b89-4a02-b1b2-19bc205acee8-etc-machine-id\") pod \"f5db557a-0b89-4a02-b1b2-19bc205acee8\" (UID: \"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.685904 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5db557a-0b89-4a02-b1b2-19bc205acee8-config-data-custom\") pod \"f5db557a-0b89-4a02-b1b2-19bc205acee8\" (UID: \"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.685928 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5db557a-0b89-4a02-b1b2-19bc205acee8-logs\") pod \"f5db557a-0b89-4a02-b1b2-19bc205acee8\" (UID: \"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.685974 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5db557a-0b89-4a02-b1b2-19bc205acee8-scripts\") pod \"f5db557a-0b89-4a02-b1b2-19bc205acee8\" (UID: \"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.686022 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5db557a-0b89-4a02-b1b2-19bc205acee8-config-data\") pod \"f5db557a-0b89-4a02-b1b2-19bc205acee8\" (UID: \"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.686055 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7vrh\" (UniqueName: \"kubernetes.io/projected/f5db557a-0b89-4a02-b1b2-19bc205acee8-kube-api-access-s7vrh\") pod \"f5db557a-0b89-4a02-b1b2-19bc205acee8\" (UID: \"f5db557a-0b89-4a02-b1b2-19bc205acee8\") " Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.686401 4810 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/host-path/f5db557a-0b89-4a02-b1b2-19bc205acee8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f5db557a-0b89-4a02-b1b2-19bc205acee8" (UID: "f5db557a-0b89-4a02-b1b2-19bc205acee8"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.686438 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5db557a-0b89-4a02-b1b2-19bc205acee8-logs" (OuterVolumeSpecName: "logs") pod "f5db557a-0b89-4a02-b1b2-19bc205acee8" (UID: "f5db557a-0b89-4a02-b1b2-19bc205acee8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.686712 4810 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f5db557a-0b89-4a02-b1b2-19bc205acee8-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.686732 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5db557a-0b89-4a02-b1b2-19bc205acee8-logs\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.691861 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5db557a-0b89-4a02-b1b2-19bc205acee8-kube-api-access-s7vrh" (OuterVolumeSpecName: "kube-api-access-s7vrh") pod "f5db557a-0b89-4a02-b1b2-19bc205acee8" (UID: "f5db557a-0b89-4a02-b1b2-19bc205acee8"). InnerVolumeSpecName "kube-api-access-s7vrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.695676 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5db557a-0b89-4a02-b1b2-19bc205acee8-scripts" (OuterVolumeSpecName: "scripts") pod "f5db557a-0b89-4a02-b1b2-19bc205acee8" (UID: "f5db557a-0b89-4a02-b1b2-19bc205acee8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.701155 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5db557a-0b89-4a02-b1b2-19bc205acee8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f5db557a-0b89-4a02-b1b2-19bc205acee8" (UID: "f5db557a-0b89-4a02-b1b2-19bc205acee8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.740624 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5db557a-0b89-4a02-b1b2-19bc205acee8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5db557a-0b89-4a02-b1b2-19bc205acee8" (UID: "f5db557a-0b89-4a02-b1b2-19bc205acee8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.758140 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5db557a-0b89-4a02-b1b2-19bc205acee8-config-data" (OuterVolumeSpecName: "config-data") pod "f5db557a-0b89-4a02-b1b2-19bc205acee8" (UID: "f5db557a-0b89-4a02-b1b2-19bc205acee8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.788485 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5db557a-0b89-4a02-b1b2-19bc205acee8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.788515 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5db557a-0b89-4a02-b1b2-19bc205acee8-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.788525 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5db557a-0b89-4a02-b1b2-19bc205acee8-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.788544 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5db557a-0b89-4a02-b1b2-19bc205acee8-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:57 crc kubenswrapper[4810]: I0219 15:29:57.788553 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7vrh\" (UniqueName: \"kubernetes.io/projected/f5db557a-0b89-4a02-b1b2-19bc205acee8-kube-api-access-s7vrh\") on node \"crc\" DevicePath \"\"" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.162670 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-869f57798-ngdtl" podUID="58c845f2-0069-4ee5-9d4b-b5871e078926" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.165:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.165:8443: connect: connection refused" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.294999 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f5db557a-0b89-4a02-b1b2-19bc205acee8","Type":"ContainerDied","Data":"cb5c2099652d015df95731d45206e3f87ff0597889d9ca17b5c94ba96cec083c"} Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.295060 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.295067 4810 scope.go:117] "RemoveContainer" containerID="32d2af278e824de267fbeedcdd57bbf5777084a34214d092b327d9209c8d753a" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.333374 4810 scope.go:117] "RemoveContainer" containerID="41f32629cee0c0ae9efe4c3d3ac560c20dce8a8e902c7f6325c5e1b2d1d8b995" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.340463 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.359068 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.374123 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 19 15:29:58 crc kubenswrapper[4810]: E0219 15:29:58.374580 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5db557a-0b89-4a02-b1b2-19bc205acee8" containerName="cinder-api" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.374597 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5db557a-0b89-4a02-b1b2-19bc205acee8" containerName="cinder-api" Feb 19 15:29:58 crc kubenswrapper[4810]: E0219 15:29:58.374614 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5db557a-0b89-4a02-b1b2-19bc205acee8" containerName="cinder-api-log" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.374624 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5db557a-0b89-4a02-b1b2-19bc205acee8" containerName="cinder-api-log" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.374844 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5db557a-0b89-4a02-b1b2-19bc205acee8" containerName="cinder-api-log" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.374866 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5db557a-0b89-4a02-b1b2-19bc205acee8" containerName="cinder-api" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.375881 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.378378 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.378595 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.378717 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.387787 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.403682 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1723b820-73ac-49f3-8716-283bf2c05925-config-data\") pod \"cinder-api-0\" (UID: \"1723b820-73ac-49f3-8716-283bf2c05925\") " pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.403752 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1723b820-73ac-49f3-8716-283bf2c05925-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1723b820-73ac-49f3-8716-283bf2c05925\") " pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.403789 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twsrv\" (UniqueName: \"kubernetes.io/projected/1723b820-73ac-49f3-8716-283bf2c05925-kube-api-access-twsrv\") pod \"cinder-api-0\" (UID: \"1723b820-73ac-49f3-8716-283bf2c05925\") " pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.403822 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1723b820-73ac-49f3-8716-283bf2c05925-logs\") pod \"cinder-api-0\" (UID: \"1723b820-73ac-49f3-8716-283bf2c05925\") " pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.403864 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1723b820-73ac-49f3-8716-283bf2c05925-scripts\") pod \"cinder-api-0\" (UID: \"1723b820-73ac-49f3-8716-283bf2c05925\") " pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.403979 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1723b820-73ac-49f3-8716-283bf2c05925-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1723b820-73ac-49f3-8716-283bf2c05925\") " pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.404087 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1723b820-73ac-49f3-8716-283bf2c05925-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1723b820-73ac-49f3-8716-283bf2c05925\") " pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.408657 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/1723b820-73ac-49f3-8716-283bf2c05925-config-data-custom\") pod \"cinder-api-0\" (UID: \"1723b820-73ac-49f3-8716-283bf2c05925\") " pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.408871 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1723b820-73ac-49f3-8716-283bf2c05925-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1723b820-73ac-49f3-8716-283bf2c05925\") " pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.510906 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1723b820-73ac-49f3-8716-283bf2c05925-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1723b820-73ac-49f3-8716-283bf2c05925\") " pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.511246 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1723b820-73ac-49f3-8716-283bf2c05925-config-data-custom\") pod \"cinder-api-0\" (UID: \"1723b820-73ac-49f3-8716-283bf2c05925\") " pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.511306 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1723b820-73ac-49f3-8716-283bf2c05925-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1723b820-73ac-49f3-8716-283bf2c05925\") " pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.511359 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1723b820-73ac-49f3-8716-283bf2c05925-config-data\") pod \"cinder-api-0\" (UID: \"1723b820-73ac-49f3-8716-283bf2c05925\") " pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.511385 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1723b820-73ac-49f3-8716-283bf2c05925-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1723b820-73ac-49f3-8716-283bf2c05925\") " pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.511405 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twsrv\" (UniqueName: \"kubernetes.io/projected/1723b820-73ac-49f3-8716-283bf2c05925-kube-api-access-twsrv\") pod \"cinder-api-0\" (UID: \"1723b820-73ac-49f3-8716-283bf2c05925\") " pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.511426 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1723b820-73ac-49f3-8716-283bf2c05925-logs\") pod \"cinder-api-0\" (UID: \"1723b820-73ac-49f3-8716-283bf2c05925\") " pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.511449 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1723b820-73ac-49f3-8716-283bf2c05925-scripts\") pod \"cinder-api-0\" (UID: \"1723b820-73ac-49f3-8716-283bf2c05925\") " pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.511493 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1723b820-73ac-49f3-8716-283bf2c05925-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1723b820-73ac-49f3-8716-283bf2c05925\") " pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.511598 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1723b820-73ac-49f3-8716-283bf2c05925-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1723b820-73ac-49f3-8716-283bf2c05925\") " pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.512047 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1723b820-73ac-49f3-8716-283bf2c05925-logs\") pod \"cinder-api-0\" (UID: \"1723b820-73ac-49f3-8716-283bf2c05925\") " pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.519002 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1723b820-73ac-49f3-8716-283bf2c05925-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1723b820-73ac-49f3-8716-283bf2c05925\") " pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.520729 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1723b820-73ac-49f3-8716-283bf2c05925-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1723b820-73ac-49f3-8716-283bf2c05925\") " pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.523167 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1723b820-73ac-49f3-8716-283bf2c05925-config-data-custom\") pod \"cinder-api-0\" (UID: \"1723b820-73ac-49f3-8716-283bf2c05925\") " pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.528273 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1723b820-73ac-49f3-8716-283bf2c05925-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1723b820-73ac-49f3-8716-283bf2c05925\") " pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.531088 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1723b820-73ac-49f3-8716-283bf2c05925-scripts\") pod \"cinder-api-0\" (UID: \"1723b820-73ac-49f3-8716-283bf2c05925\") " pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.542732 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twsrv\" (UniqueName: \"kubernetes.io/projected/1723b820-73ac-49f3-8716-283bf2c05925-kube-api-access-twsrv\") pod \"cinder-api-0\" (UID: \"1723b820-73ac-49f3-8716-283bf2c05925\") " pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.543748 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1723b820-73ac-49f3-8716-283bf2c05925-config-data\") pod \"cinder-api-0\" (UID: \"1723b820-73ac-49f3-8716-283bf2c05925\") " pod="openstack/cinder-api-0" Feb 19 15:29:58 crc kubenswrapper[4810]: I0219 15:29:58.712080 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 15:29:59 crc kubenswrapper[4810]: I0219 15:29:59.124991 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 19 15:29:59 crc kubenswrapper[4810]: I0219 15:29:59.125402 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 19 15:29:59 crc kubenswrapper[4810]: I0219 15:29:59.126128 4810 scope.go:117] "RemoveContainer" containerID="a8232fee767eaee3fa026223e20673e25fa6d954ddb913e148bf6e0e435de416" Feb 19 15:29:59 crc kubenswrapper[4810]: E0219 15:29:59.126594 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(3eb2dccd-c5dc-436f-b7a6-954af7bc51c5)\"" pod="openstack/watcher-decision-engine-0" podUID="3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" Feb 19 15:29:59 crc kubenswrapper[4810]: I0219 15:29:59.209004 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 15:29:59 crc kubenswrapper[4810]: I0219 15:29:59.293020 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-79fc56bc44-tfjh4" Feb 19 15:29:59 crc kubenswrapper[4810]: I0219 15:29:59.331318 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1723b820-73ac-49f3-8716-283bf2c05925","Type":"ContainerStarted","Data":"b04502868b0924f6e769d8268d1b745040c99463576aa2faf7bef62c805bcf9a"} Feb 19 15:29:59 crc kubenswrapper[4810]: I0219 15:29:59.343703 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-79fc56bc44-tfjh4" Feb 19 15:29:59 crc kubenswrapper[4810]: I0219 15:29:59.459494 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5db557a-0b89-4a02-b1b2-19bc205acee8" path="/var/lib/kubelet/pods/f5db557a-0b89-4a02-b1b2-19bc205acee8/volumes" Feb 19 15:30:00 crc kubenswrapper[4810]: I0219 15:30:00.152165 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525250-rls78"] Feb 19 15:30:00 crc kubenswrapper[4810]: I0219 15:30:00.155062 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525250-rls78" Feb 19 15:30:00 crc kubenswrapper[4810]: I0219 15:30:00.157917 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 15:30:00 crc kubenswrapper[4810]: I0219 15:30:00.158453 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 15:30:00 crc kubenswrapper[4810]: I0219 15:30:00.205564 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525250-rls78"] Feb 19 15:30:00 crc kubenswrapper[4810]: I0219 15:30:00.281963 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40500a46-a16b-4282-86e4-1d99277d7c7a-secret-volume\") pod \"collect-profiles-29525250-rls78\" (UID: \"40500a46-a16b-4282-86e4-1d99277d7c7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525250-rls78" Feb 19 15:30:00 crc kubenswrapper[4810]: I0219 15:30:00.282075 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40500a46-a16b-4282-86e4-1d99277d7c7a-config-volume\") pod \"collect-profiles-29525250-rls78\" (UID: \"40500a46-a16b-4282-86e4-1d99277d7c7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525250-rls78" Feb 19 15:30:00 crc kubenswrapper[4810]: I0219 15:30:00.282136 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzchp\" (UniqueName: \"kubernetes.io/projected/40500a46-a16b-4282-86e4-1d99277d7c7a-kube-api-access-kzchp\") pod \"collect-profiles-29525250-rls78\" (UID: \"40500a46-a16b-4282-86e4-1d99277d7c7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525250-rls78" Feb 19 15:30:00 crc kubenswrapper[4810]: I0219 15:30:00.351881 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1723b820-73ac-49f3-8716-283bf2c05925","Type":"ContainerStarted","Data":"7f71476a89f9d71d7cfac23d4d3704a81b53747a76723ed5636896cdb0c1f8fc"} Feb 19 15:30:00 crc kubenswrapper[4810]: I0219 15:30:00.383839 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40500a46-a16b-4282-86e4-1d99277d7c7a-config-volume\") pod \"collect-profiles-29525250-rls78\" (UID: \"40500a46-a16b-4282-86e4-1d99277d7c7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525250-rls78" Feb 19 15:30:00 crc kubenswrapper[4810]: I0219 15:30:00.383918 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzchp\" (UniqueName: \"kubernetes.io/projected/40500a46-a16b-4282-86e4-1d99277d7c7a-kube-api-access-kzchp\") pod \"collect-profiles-29525250-rls78\" (UID: \"40500a46-a16b-4282-86e4-1d99277d7c7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525250-rls78" Feb 19 15:30:00 crc kubenswrapper[4810]: I0219 15:30:00.383997 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40500a46-a16b-4282-86e4-1d99277d7c7a-secret-volume\") pod \"collect-profiles-29525250-rls78\" (UID: \"40500a46-a16b-4282-86e4-1d99277d7c7a\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525250-rls78" Feb 19 15:30:00 crc kubenswrapper[4810]: I0219 15:30:00.385457 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40500a46-a16b-4282-86e4-1d99277d7c7a-config-volume\") pod \"collect-profiles-29525250-rls78\" (UID: \"40500a46-a16b-4282-86e4-1d99277d7c7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525250-rls78" Feb 19 15:30:00 crc kubenswrapper[4810]: I0219 15:30:00.389977 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40500a46-a16b-4282-86e4-1d99277d7c7a-secret-volume\") pod \"collect-profiles-29525250-rls78\" (UID: \"40500a46-a16b-4282-86e4-1d99277d7c7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525250-rls78" Feb 19 15:30:00 crc kubenswrapper[4810]: I0219 15:30:00.402917 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 19 15:30:00 crc kubenswrapper[4810]: I0219 15:30:00.404307 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzchp\" (UniqueName: \"kubernetes.io/projected/40500a46-a16b-4282-86e4-1d99277d7c7a-kube-api-access-kzchp\") pod \"collect-profiles-29525250-rls78\" (UID: \"40500a46-a16b-4282-86e4-1d99277d7c7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525250-rls78" Feb 19 15:30:00 crc kubenswrapper[4810]: I0219 15:30:00.468476 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" Feb 19 15:30:00 crc kubenswrapper[4810]: I0219 15:30:00.524767 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525250-rls78" Feb 19 15:30:00 crc kubenswrapper[4810]: I0219 15:30:00.527195 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cf5f86dff-7482l"] Feb 19 15:30:00 crc kubenswrapper[4810]: I0219 15:30:00.527440 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cf5f86dff-7482l" podUID="4f9534ee-827a-49fb-8588-a5a8a494be3c" containerName="dnsmasq-dns" containerID="cri-o://f7ecab250fb259bafddfe5958b677b33308b00c3f83a99f4416f709cb8747963" gracePeriod=10 Feb 19 15:30:00 crc kubenswrapper[4810]: I0219 15:30:00.654373 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.171148 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cf5f86dff-7482l" Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.212669 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525250-rls78"] Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.311938 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhf9w\" (UniqueName: \"kubernetes.io/projected/4f9534ee-827a-49fb-8588-a5a8a494be3c-kube-api-access-vhf9w\") pod \"4f9534ee-827a-49fb-8588-a5a8a494be3c\" (UID: \"4f9534ee-827a-49fb-8588-a5a8a494be3c\") " Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.312050 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-config\") pod \"4f9534ee-827a-49fb-8588-a5a8a494be3c\" (UID: \"4f9534ee-827a-49fb-8588-a5a8a494be3c\") " Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.312097 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-dns-svc\") pod \"4f9534ee-827a-49fb-8588-a5a8a494be3c\" (UID: \"4f9534ee-827a-49fb-8588-a5a8a494be3c\") " Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.312162 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-dns-swift-storage-0\") pod \"4f9534ee-827a-49fb-8588-a5a8a494be3c\" (UID: \"4f9534ee-827a-49fb-8588-a5a8a494be3c\") " Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.312209 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-ovsdbserver-nb\") pod \"4f9534ee-827a-49fb-8588-a5a8a494be3c\" (UID: \"4f9534ee-827a-49fb-8588-a5a8a494be3c\") " Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.312272 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-ovsdbserver-sb\") pod \"4f9534ee-827a-49fb-8588-a5a8a494be3c\" (UID: \"4f9534ee-827a-49fb-8588-a5a8a494be3c\") " Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.405594 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f9534ee-827a-49fb-8588-a5a8a494be3c-kube-api-access-vhf9w" (OuterVolumeSpecName: "kube-api-access-vhf9w") pod "4f9534ee-827a-49fb-8588-a5a8a494be3c" (UID: "4f9534ee-827a-49fb-8588-a5a8a494be3c"). InnerVolumeSpecName "kube-api-access-vhf9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.414516 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhf9w\" (UniqueName: \"kubernetes.io/projected/4f9534ee-827a-49fb-8588-a5a8a494be3c-kube-api-access-vhf9w\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.414930 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525250-rls78" event={"ID":"40500a46-a16b-4282-86e4-1d99277d7c7a","Type":"ContainerStarted","Data":"679424eab96e65817895723298fd29af9f44246bfd841f6fc36de1e432cd3878"} Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.428158 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-config" (OuterVolumeSpecName: "config") pod "4f9534ee-827a-49fb-8588-a5a8a494be3c" (UID: "4f9534ee-827a-49fb-8588-a5a8a494be3c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.428298 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1723b820-73ac-49f3-8716-283bf2c05925","Type":"ContainerStarted","Data":"d54f1f3426e34264224ad10ed68b4ec9ced57d19030414035e1145729b72fcf1"} Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.429704 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.434541 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4f9534ee-827a-49fb-8588-a5a8a494be3c" (UID: "4f9534ee-827a-49fb-8588-a5a8a494be3c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.449516 4810 generic.go:334] "Generic (PLEG): container finished" podID="4f9534ee-827a-49fb-8588-a5a8a494be3c" containerID="f7ecab250fb259bafddfe5958b677b33308b00c3f83a99f4416f709cb8747963" exitCode=0 Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.454254 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cf5f86dff-7482l" Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.473028 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.473010668 podStartE2EDuration="3.473010668s" podCreationTimestamp="2026-02-19 15:29:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:30:01.455810526 +0000 UTC m=+1230.937840660" watchObservedRunningTime="2026-02-19 15:30:01.473010668 +0000 UTC m=+1230.955040792" Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.489751 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cf5f86dff-7482l" event={"ID":"4f9534ee-827a-49fb-8588-a5a8a494be3c","Type":"ContainerDied","Data":"f7ecab250fb259bafddfe5958b677b33308b00c3f83a99f4416f709cb8747963"} Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.489793 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cf5f86dff-7482l" event={"ID":"4f9534ee-827a-49fb-8588-a5a8a494be3c","Type":"ContainerDied","Data":"32fbc60f0dab31f943ed4d231a260fb910bacf46dc6517badaa0cc870b972e03"} Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.489810 4810 scope.go:117] "RemoveContainer" containerID="f7ecab250fb259bafddfe5958b677b33308b00c3f83a99f4416f709cb8747963" Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.516524 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.516724 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.522381 4810 scope.go:117] "RemoveContainer" containerID="77f0991b435da4c5e7f87592511070821c3d2ea7637aa55b69e68f2a981e4095" Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.537684 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.556443 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4f9534ee-827a-49fb-8588-a5a8a494be3c" (UID: "4f9534ee-827a-49fb-8588-a5a8a494be3c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.570538 4810 scope.go:117] "RemoveContainer" containerID="f7ecab250fb259bafddfe5958b677b33308b00c3f83a99f4416f709cb8747963" Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.570658 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4f9534ee-827a-49fb-8588-a5a8a494be3c" (UID: "4f9534ee-827a-49fb-8588-a5a8a494be3c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.570788 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4f9534ee-827a-49fb-8588-a5a8a494be3c" (UID: "4f9534ee-827a-49fb-8588-a5a8a494be3c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:30:01 crc kubenswrapper[4810]: E0219 15:30:01.571901 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7ecab250fb259bafddfe5958b677b33308b00c3f83a99f4416f709cb8747963\": container with ID starting with f7ecab250fb259bafddfe5958b677b33308b00c3f83a99f4416f709cb8747963 not found: ID does not exist" containerID="f7ecab250fb259bafddfe5958b677b33308b00c3f83a99f4416f709cb8747963" Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.571948 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7ecab250fb259bafddfe5958b677b33308b00c3f83a99f4416f709cb8747963"} err="failed to get container status \"f7ecab250fb259bafddfe5958b677b33308b00c3f83a99f4416f709cb8747963\": rpc error: code = NotFound desc = could not find container \"f7ecab250fb259bafddfe5958b677b33308b00c3f83a99f4416f709cb8747963\": container with ID starting with f7ecab250fb259bafddfe5958b677b33308b00c3f83a99f4416f709cb8747963 not found: ID does not exist" Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.571979 4810 scope.go:117] "RemoveContainer" containerID="77f0991b435da4c5e7f87592511070821c3d2ea7637aa55b69e68f2a981e4095" Feb 19 15:30:01 crc kubenswrapper[4810]: E0219 15:30:01.572805 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77f0991b435da4c5e7f87592511070821c3d2ea7637aa55b69e68f2a981e4095\": container with ID starting with 77f0991b435da4c5e7f87592511070821c3d2ea7637aa55b69e68f2a981e4095 not found: ID does not exist" containerID="77f0991b435da4c5e7f87592511070821c3d2ea7637aa55b69e68f2a981e4095" Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.572846 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77f0991b435da4c5e7f87592511070821c3d2ea7637aa55b69e68f2a981e4095"} err="failed to get container status \"77f0991b435da4c5e7f87592511070821c3d2ea7637aa55b69e68f2a981e4095\": rpc error: code = NotFound desc = could not find container \"77f0991b435da4c5e7f87592511070821c3d2ea7637aa55b69e68f2a981e4095\": container with ID starting with 77f0991b435da4c5e7f87592511070821c3d2ea7637aa55b69e68f2a981e4095 not found: ID does not exist" Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.618189 4810 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.618475 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.618486 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/4f9534ee-827a-49fb-8588-a5a8a494be3c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.786352 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cf5f86dff-7482l"] Feb 19 15:30:01 crc kubenswrapper[4810]: I0219 15:30:01.794331 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cf5f86dff-7482l"] Feb 19 15:30:02 crc kubenswrapper[4810]: I0219 15:30:02.460841 4810 generic.go:334] "Generic (PLEG): container finished" podID="40500a46-a16b-4282-86e4-1d99277d7c7a" containerID="b50c5c7b6b7301c6c8992f22517d1d5b4a8f3065b75aef28f8faaa61ad94fd27" exitCode=0 Feb 19 15:30:02 crc kubenswrapper[4810]: I0219 15:30:02.461021 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e3bd9969-3750-460b-95cd-8c52d2e44d82" containerName="cinder-scheduler" containerID="cri-o://1088aad748bf16b0aa6aaa6ca8f83095344671dc633f1afff11c7c6c03c2f490" gracePeriod=30 Feb 19 15:30:02 crc kubenswrapper[4810]: I0219 15:30:02.461087 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525250-rls78" event={"ID":"40500a46-a16b-4282-86e4-1d99277d7c7a","Type":"ContainerDied","Data":"b50c5c7b6b7301c6c8992f22517d1d5b4a8f3065b75aef28f8faaa61ad94fd27"} Feb 19 15:30:02 crc kubenswrapper[4810]: I0219 15:30:02.462113 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e3bd9969-3750-460b-95cd-8c52d2e44d82" containerName="probe" containerID="cri-o://2a43f3753096df053129dc1d2079d46040848bff1f282b81bb80d958a8043c75" gracePeriod=30 Feb 19 15:30:03 crc kubenswrapper[4810]: I0219 15:30:03.450873 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f9534ee-827a-49fb-8588-a5a8a494be3c" path="/var/lib/kubelet/pods/4f9534ee-827a-49fb-8588-a5a8a494be3c/volumes" Feb 19 15:30:03 crc kubenswrapper[4810]: I0219 15:30:03.474210 4810 generic.go:334] "Generic (PLEG): container finished" podID="e3bd9969-3750-460b-95cd-8c52d2e44d82" containerID="2a43f3753096df053129dc1d2079d46040848bff1f282b81bb80d958a8043c75" exitCode=0 Feb 19 15:30:03 crc kubenswrapper[4810]: I0219 15:30:03.474438 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e3bd9969-3750-460b-95cd-8c52d2e44d82","Type":"ContainerDied","Data":"2a43f3753096df053129dc1d2079d46040848bff1f282b81bb80d958a8043c75"} Feb 19 15:30:03 crc kubenswrapper[4810]: I0219 15:30:03.848148 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-695bb7cdc6-72zs2" podUID="f0b73197-3c7e-44c3-8a49-35d9e0a40629" containerName="neutron-httpd" probeResult="failure" output="Get \"http://10.217.0.174:9696/\": dial tcp 10.217.0.174:9696: connect: connection refused" Feb 19 15:30:03 crc kubenswrapper[4810]: I0219 15:30:03.853981 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525250-rls78" Feb 19 15:30:03 crc kubenswrapper[4810]: I0219 15:30:03.961744 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40500a46-a16b-4282-86e4-1d99277d7c7a-config-volume\") pod \"40500a46-a16b-4282-86e4-1d99277d7c7a\" (UID: \"40500a46-a16b-4282-86e4-1d99277d7c7a\") " Feb 19 15:30:03 crc kubenswrapper[4810]: I0219 15:30:03.961972 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40500a46-a16b-4282-86e4-1d99277d7c7a-secret-volume\") pod \"40500a46-a16b-4282-86e4-1d99277d7c7a\" (UID: \"40500a46-a16b-4282-86e4-1d99277d7c7a\") " Feb 19 15:30:03 crc kubenswrapper[4810]: I0219 15:30:03.962247 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzchp\" (UniqueName: \"kubernetes.io/projected/40500a46-a16b-4282-86e4-1d99277d7c7a-kube-api-access-kzchp\") pod \"40500a46-a16b-4282-86e4-1d99277d7c7a\" (UID: \"40500a46-a16b-4282-86e4-1d99277d7c7a\") " Feb 19 15:30:03 crc kubenswrapper[4810]: I0219 15:30:03.963148 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40500a46-a16b-4282-86e4-1d99277d7c7a-config-volume" (OuterVolumeSpecName: "config-volume") pod "40500a46-a16b-4282-86e4-1d99277d7c7a" (UID: "40500a46-a16b-4282-86e4-1d99277d7c7a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:30:03 crc kubenswrapper[4810]: I0219 15:30:03.973525 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40500a46-a16b-4282-86e4-1d99277d7c7a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "40500a46-a16b-4282-86e4-1d99277d7c7a" (UID: "40500a46-a16b-4282-86e4-1d99277d7c7a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:03 crc kubenswrapper[4810]: I0219 15:30:03.973575 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40500a46-a16b-4282-86e4-1d99277d7c7a-kube-api-access-kzchp" (OuterVolumeSpecName: "kube-api-access-kzchp") pod "40500a46-a16b-4282-86e4-1d99277d7c7a" (UID: "40500a46-a16b-4282-86e4-1d99277d7c7a"). InnerVolumeSpecName "kube-api-access-kzchp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:30:04 crc kubenswrapper[4810]: I0219 15:30:04.064043 4810 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40500a46-a16b-4282-86e4-1d99277d7c7a-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:04 crc kubenswrapper[4810]: I0219 15:30:04.064083 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzchp\" (UniqueName: \"kubernetes.io/projected/40500a46-a16b-4282-86e4-1d99277d7c7a-kube-api-access-kzchp\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:04 crc kubenswrapper[4810]: I0219 15:30:04.064097 4810 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40500a46-a16b-4282-86e4-1d99277d7c7a-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:04 crc kubenswrapper[4810]: I0219 15:30:04.266276 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6cc67d5fc8-hs8lf" Feb 19 15:30:04 crc kubenswrapper[4810]: I0219 15:30:04.489608 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525250-rls78" event={"ID":"40500a46-a16b-4282-86e4-1d99277d7c7a","Type":"ContainerDied","Data":"679424eab96e65817895723298fd29af9f44246bfd841f6fc36de1e432cd3878"} Feb 19 15:30:04 crc kubenswrapper[4810]: I0219 15:30:04.489649 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="679424eab96e65817895723298fd29af9f44246bfd841f6fc36de1e432cd3878" Feb 19 15:30:04 crc kubenswrapper[4810]: I0219 15:30:04.489726 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525250-rls78" Feb 19 15:30:05 crc kubenswrapper[4810]: I0219 15:30:05.833104 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-68ff886dc8-nntj6" Feb 19 15:30:05 crc kubenswrapper[4810]: I0219 15:30:05.835277 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-68ff886dc8-nntj6" Feb 19 15:30:05 crc kubenswrapper[4810]: I0219 15:30:05.840353 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7555d68ddd-xqj8c" Feb 19 15:30:06 crc kubenswrapper[4810]: I0219 15:30:06.543071 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b886df68b-htd57" Feb 19 15:30:06 crc kubenswrapper[4810]: I0219 15:30:06.736172 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b886df68b-htd57" Feb 19 15:30:06 crc kubenswrapper[4810]: I0219 15:30:06.822082 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-79fc56bc44-tfjh4"] Feb 19 15:30:06 crc kubenswrapper[4810]: I0219 15:30:06.822775 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-79fc56bc44-tfjh4" podUID="faf01cf3-b74b-46d8-b589-05ea0195ac24" containerName="barbican-api-log" containerID="cri-o://6c0cc91325e6d2565c8ddd9b1eede9d65e56a0efbeee132cbd5a27f02f0ed7ef" gracePeriod=30 Feb 19 15:30:06 crc kubenswrapper[4810]: I0219 15:30:06.822912 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-79fc56bc44-tfjh4" podUID="faf01cf3-b74b-46d8-b589-05ea0195ac24" containerName="barbican-api" 
containerID="cri-o://800222f8de7a26806971959afe00821e5af0eb096530f9abfdb9f64bd62612cb" gracePeriod=30 Feb 19 15:30:06 crc kubenswrapper[4810]: I0219 15:30:06.865672 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7555d68ddd-xqj8c" Feb 19 15:30:06 crc kubenswrapper[4810]: I0219 15:30:06.941528 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-68ff886dc8-nntj6"] Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.301742 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6cd8bf58f4-ktsjk" Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.515128 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.538516 4810 generic.go:334] "Generic (PLEG): container finished" podID="e3bd9969-3750-460b-95cd-8c52d2e44d82" containerID="1088aad748bf16b0aa6aaa6ca8f83095344671dc633f1afff11c7c6c03c2f490" exitCode=0 Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.538599 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e3bd9969-3750-460b-95cd-8c52d2e44d82","Type":"ContainerDied","Data":"1088aad748bf16b0aa6aaa6ca8f83095344671dc633f1afff11c7c6c03c2f490"} Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.538630 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e3bd9969-3750-460b-95cd-8c52d2e44d82","Type":"ContainerDied","Data":"3b1b3f3010d8c20d2061253491ade333f69730534532a92038800b9cbbc0aede"} Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.538647 4810 scope.go:117] "RemoveContainer" containerID="2a43f3753096df053129dc1d2079d46040848bff1f282b81bb80d958a8043c75" Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.538771 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.559599 4810 generic.go:334] "Generic (PLEG): container finished" podID="faf01cf3-b74b-46d8-b589-05ea0195ac24" containerID="6c0cc91325e6d2565c8ddd9b1eede9d65e56a0efbeee132cbd5a27f02f0ed7ef" exitCode=143 Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.559841 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-68ff886dc8-nntj6" podUID="e0116ca5-826e-4a77-bc6f-11e89c047af8" containerName="placement-log" containerID="cri-o://0f0fdb4ec7d48ce5efa4d4ce421e22663d31ffdf324056a1695cd08d0b02000f" gracePeriod=30 Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.560191 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-68ff886dc8-nntj6" podUID="e0116ca5-826e-4a77-bc6f-11e89c047af8" containerName="placement-api" containerID="cri-o://4ac6a4c63292ab530356cb491f1c19b551ba50fa93c7f50d74148f2ff94c41a3" gracePeriod=30 Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.560256 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79fc56bc44-tfjh4" event={"ID":"faf01cf3-b74b-46d8-b589-05ea0195ac24","Type":"ContainerDied","Data":"6c0cc91325e6d2565c8ddd9b1eede9d65e56a0efbeee132cbd5a27f02f0ed7ef"} Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.601582 4810 scope.go:117] "RemoveContainer" containerID="1088aad748bf16b0aa6aaa6ca8f83095344671dc633f1afff11c7c6c03c2f490" Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.627550 4810 scope.go:117] "RemoveContainer" containerID="2a43f3753096df053129dc1d2079d46040848bff1f282b81bb80d958a8043c75" Feb 19 15:30:07 crc kubenswrapper[4810]: E0219 15:30:07.628266 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a43f3753096df053129dc1d2079d46040848bff1f282b81bb80d958a8043c75\": container with ID starting with 2a43f3753096df053129dc1d2079d46040848bff1f282b81bb80d958a8043c75 not found: ID does not exist" containerID="2a43f3753096df053129dc1d2079d46040848bff1f282b81bb80d958a8043c75" Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.628320 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a43f3753096df053129dc1d2079d46040848bff1f282b81bb80d958a8043c75"} err="failed to get container status \"2a43f3753096df053129dc1d2079d46040848bff1f282b81bb80d958a8043c75\": rpc error: code = NotFound desc = could not find container \"2a43f3753096df053129dc1d2079d46040848bff1f282b81bb80d958a8043c75\": container with ID starting with 2a43f3753096df053129dc1d2079d46040848bff1f282b81bb80d958a8043c75 not found: ID does not exist" Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.628419 4810 scope.go:117] "RemoveContainer" containerID="1088aad748bf16b0aa6aaa6ca8f83095344671dc633f1afff11c7c6c03c2f490" Feb 19 15:30:07 crc kubenswrapper[4810]: E0219 15:30:07.631584 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1088aad748bf16b0aa6aaa6ca8f83095344671dc633f1afff11c7c6c03c2f490\": container with ID starting with 1088aad748bf16b0aa6aaa6ca8f83095344671dc633f1afff11c7c6c03c2f490 not found: ID does not exist" containerID="1088aad748bf16b0aa6aaa6ca8f83095344671dc633f1afff11c7c6c03c2f490" Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.631629 4810 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1088aad748bf16b0aa6aaa6ca8f83095344671dc633f1afff11c7c6c03c2f490"} err="failed to get container status \"1088aad748bf16b0aa6aaa6ca8f83095344671dc633f1afff11c7c6c03c2f490\": rpc error: code = NotFound desc = could not find container \"1088aad748bf16b0aa6aaa6ca8f83095344671dc633f1afff11c7c6c03c2f490\": container with ID starting with 1088aad748bf16b0aa6aaa6ca8f83095344671dc633f1afff11c7c6c03c2f490 not found: ID does not exist" Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.641117 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3bd9969-3750-460b-95cd-8c52d2e44d82-scripts\") pod \"e3bd9969-3750-460b-95cd-8c52d2e44d82\" (UID: \"e3bd9969-3750-460b-95cd-8c52d2e44d82\") " Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.641203 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3bd9969-3750-460b-95cd-8c52d2e44d82-config-data-custom\") pod \"e3bd9969-3750-460b-95cd-8c52d2e44d82\" (UID: \"e3bd9969-3750-460b-95cd-8c52d2e44d82\") " Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.641225 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3bd9969-3750-460b-95cd-8c52d2e44d82-etc-machine-id\") pod \"e3bd9969-3750-460b-95cd-8c52d2e44d82\" (UID: \"e3bd9969-3750-460b-95cd-8c52d2e44d82\") " Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.641342 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3bd9969-3750-460b-95cd-8c52d2e44d82-combined-ca-bundle\") pod \"e3bd9969-3750-460b-95cd-8c52d2e44d82\" (UID: \"e3bd9969-3750-460b-95cd-8c52d2e44d82\") " Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.641686 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3bd9969-3750-460b-95cd-8c52d2e44d82-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e3bd9969-3750-460b-95cd-8c52d2e44d82" (UID: "e3bd9969-3750-460b-95cd-8c52d2e44d82"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.641765 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3bd9969-3750-460b-95cd-8c52d2e44d82-config-data\") pod \"e3bd9969-3750-460b-95cd-8c52d2e44d82\" (UID: \"e3bd9969-3750-460b-95cd-8c52d2e44d82\") " Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.641812 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xw97l\" (UniqueName: \"kubernetes.io/projected/e3bd9969-3750-460b-95cd-8c52d2e44d82-kube-api-access-xw97l\") pod \"e3bd9969-3750-460b-95cd-8c52d2e44d82\" (UID: \"e3bd9969-3750-460b-95cd-8c52d2e44d82\") " Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.642341 4810 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3bd9969-3750-460b-95cd-8c52d2e44d82-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.657612 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3bd9969-3750-460b-95cd-8c52d2e44d82-kube-api-access-xw97l" (OuterVolumeSpecName: "kube-api-access-xw97l") pod "e3bd9969-3750-460b-95cd-8c52d2e44d82" (UID: "e3bd9969-3750-460b-95cd-8c52d2e44d82"). InnerVolumeSpecName "kube-api-access-xw97l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.657702 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3bd9969-3750-460b-95cd-8c52d2e44d82-scripts" (OuterVolumeSpecName: "scripts") pod "e3bd9969-3750-460b-95cd-8c52d2e44d82" (UID: "e3bd9969-3750-460b-95cd-8c52d2e44d82"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.691605 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3bd9969-3750-460b-95cd-8c52d2e44d82-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e3bd9969-3750-460b-95cd-8c52d2e44d82" (UID: "e3bd9969-3750-460b-95cd-8c52d2e44d82"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.733679 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3bd9969-3750-460b-95cd-8c52d2e44d82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3bd9969-3750-460b-95cd-8c52d2e44d82" (UID: "e3bd9969-3750-460b-95cd-8c52d2e44d82"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.743729 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6dfcf65577-bd5w2" Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.744736 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xw97l\" (UniqueName: \"kubernetes.io/projected/e3bd9969-3750-460b-95cd-8c52d2e44d82-kube-api-access-xw97l\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.744758 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3bd9969-3750-460b-95cd-8c52d2e44d82-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.744767 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3bd9969-3750-460b-95cd-8c52d2e44d82-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.744778 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3bd9969-3750-460b-95cd-8c52d2e44d82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.807655 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6cc67d5fc8-hs8lf"] Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.807903 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6cc67d5fc8-hs8lf" podUID="fb06ec19-dfd7-459f-9bfc-6c3e1619abb0" containerName="neutron-api" containerID="cri-o://4e847a42f9e6fb1b9f7435b4df35d14a3fd82483b8e4fe2e6eb4d640b1450737" gracePeriod=30 Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.809803 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6cc67d5fc8-hs8lf" podUID="fb06ec19-dfd7-459f-9bfc-6c3e1619abb0" containerName="neutron-httpd" containerID="cri-o://ec03377efcba956dd84e47222d7f2fa8900a14d68a543b851a67f801c749c23f" gracePeriod=30 Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.850906 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3bd9969-3750-460b-95cd-8c52d2e44d82-config-data" (OuterVolumeSpecName: "config-data") pod "e3bd9969-3750-460b-95cd-8c52d2e44d82" (UID: "e3bd9969-3750-460b-95cd-8c52d2e44d82"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:07 crc kubenswrapper[4810]: I0219 15:30:07.951516 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3bd9969-3750-460b-95cd-8c52d2e44d82-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.162357 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-869f57798-ngdtl" podUID="58c845f2-0069-4ee5-9d4b-b5871e078926" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.165:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.165:8443: connect: connection refused" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.173452 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.192375 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.208791 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 15:30:08 crc kubenswrapper[4810]: E0219 15:30:08.209229 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40500a46-a16b-4282-86e4-1d99277d7c7a" containerName="collect-profiles" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.209245 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="40500a46-a16b-4282-86e4-1d99277d7c7a" containerName="collect-profiles" Feb 19 15:30:08 crc kubenswrapper[4810]: E0219 15:30:08.209261 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3bd9969-3750-460b-95cd-8c52d2e44d82" containerName="cinder-scheduler" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.209267 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3bd9969-3750-460b-95cd-8c52d2e44d82" containerName="cinder-scheduler" Feb 19 15:30:08 crc kubenswrapper[4810]: E0219 15:30:08.209283 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f9534ee-827a-49fb-8588-a5a8a494be3c" containerName="dnsmasq-dns" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.209289 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f9534ee-827a-49fb-8588-a5a8a494be3c" containerName="dnsmasq-dns" Feb 19 15:30:08 crc kubenswrapper[4810]: E0219 15:30:08.209300 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3bd9969-3750-460b-95cd-8c52d2e44d82" containerName="probe" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.209306 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3bd9969-3750-460b-95cd-8c52d2e44d82" containerName="probe" Feb 19 15:30:08 crc kubenswrapper[4810]: E0219 15:30:08.209315 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f9534ee-827a-49fb-8588-a5a8a494be3c" containerName="init" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.209335 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f9534ee-827a-49fb-8588-a5a8a494be3c" containerName="init" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.209493 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3bd9969-3750-460b-95cd-8c52d2e44d82" containerName="probe" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.209512 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f9534ee-827a-49fb-8588-a5a8a494be3c" containerName="dnsmasq-dns" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 
15:30:08.209525 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3bd9969-3750-460b-95cd-8c52d2e44d82" containerName="cinder-scheduler" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.209542 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="40500a46-a16b-4282-86e4-1d99277d7c7a" containerName="collect-profiles" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.210480 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.214291 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.224818 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.361985 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48d5e3e9-853c-4988-8746-a6f74e1fe209-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"48d5e3e9-853c-4988-8746-a6f74e1fe209\") " pod="openstack/cinder-scheduler-0" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.362065 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48d5e3e9-853c-4988-8746-a6f74e1fe209-config-data\") pod \"cinder-scheduler-0\" (UID: \"48d5e3e9-853c-4988-8746-a6f74e1fe209\") " pod="openstack/cinder-scheduler-0" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.362121 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48d5e3e9-853c-4988-8746-a6f74e1fe209-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"48d5e3e9-853c-4988-8746-a6f74e1fe209\") " pod="openstack/cinder-scheduler-0" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.362140 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slfh7\" (UniqueName: \"kubernetes.io/projected/48d5e3e9-853c-4988-8746-a6f74e1fe209-kube-api-access-slfh7\") pod \"cinder-scheduler-0\" (UID: \"48d5e3e9-853c-4988-8746-a6f74e1fe209\") " pod="openstack/cinder-scheduler-0" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.362169 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48d5e3e9-853c-4988-8746-a6f74e1fe209-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"48d5e3e9-853c-4988-8746-a6f74e1fe209\") " pod="openstack/cinder-scheduler-0" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.362205 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48d5e3e9-853c-4988-8746-a6f74e1fe209-scripts\") pod \"cinder-scheduler-0\" (UID: \"48d5e3e9-853c-4988-8746-a6f74e1fe209\") " pod="openstack/cinder-scheduler-0" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.458876 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-79fc56bc44-tfjh4" podUID="faf01cf3-b74b-46d8-b589-05ea0195ac24" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.182:9311/healthcheck\": read tcp 10.217.0.2:37770->10.217.0.182:9311: 
read: connection reset by peer" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.458888 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-79fc56bc44-tfjh4" podUID="faf01cf3-b74b-46d8-b589-05ea0195ac24" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.182:9311/healthcheck\": read tcp 10.217.0.2:37756->10.217.0.182:9311: read: connection reset by peer" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.465183 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48d5e3e9-853c-4988-8746-a6f74e1fe209-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"48d5e3e9-853c-4988-8746-a6f74e1fe209\") " pod="openstack/cinder-scheduler-0" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.465349 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48d5e3e9-853c-4988-8746-a6f74e1fe209-config-data\") pod \"cinder-scheduler-0\" (UID: \"48d5e3e9-853c-4988-8746-a6f74e1fe209\") " pod="openstack/cinder-scheduler-0" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.465467 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48d5e3e9-853c-4988-8746-a6f74e1fe209-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"48d5e3e9-853c-4988-8746-a6f74e1fe209\") " pod="openstack/cinder-scheduler-0" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.465498 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slfh7\" (UniqueName: \"kubernetes.io/projected/48d5e3e9-853c-4988-8746-a6f74e1fe209-kube-api-access-slfh7\") pod \"cinder-scheduler-0\" (UID: \"48d5e3e9-853c-4988-8746-a6f74e1fe209\") " pod="openstack/cinder-scheduler-0" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.465763 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48d5e3e9-853c-4988-8746-a6f74e1fe209-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"48d5e3e9-853c-4988-8746-a6f74e1fe209\") " pod="openstack/cinder-scheduler-0" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.465843 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48d5e3e9-853c-4988-8746-a6f74e1fe209-scripts\") pod \"cinder-scheduler-0\" (UID: \"48d5e3e9-853c-4988-8746-a6f74e1fe209\") " pod="openstack/cinder-scheduler-0" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.466042 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48d5e3e9-853c-4988-8746-a6f74e1fe209-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"48d5e3e9-853c-4988-8746-a6f74e1fe209\") " pod="openstack/cinder-scheduler-0" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.480869 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48d5e3e9-853c-4988-8746-a6f74e1fe209-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"48d5e3e9-853c-4988-8746-a6f74e1fe209\") " pod="openstack/cinder-scheduler-0" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.485384 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slfh7\" (UniqueName: 
\"kubernetes.io/projected/48d5e3e9-853c-4988-8746-a6f74e1fe209-kube-api-access-slfh7\") pod \"cinder-scheduler-0\" (UID: \"48d5e3e9-853c-4988-8746-a6f74e1fe209\") " pod="openstack/cinder-scheduler-0" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.488944 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48d5e3e9-853c-4988-8746-a6f74e1fe209-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"48d5e3e9-853c-4988-8746-a6f74e1fe209\") " pod="openstack/cinder-scheduler-0" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.492669 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48d5e3e9-853c-4988-8746-a6f74e1fe209-scripts\") pod \"cinder-scheduler-0\" (UID: \"48d5e3e9-853c-4988-8746-a6f74e1fe209\") " pod="openstack/cinder-scheduler-0" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.493497 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48d5e3e9-853c-4988-8746-a6f74e1fe209-config-data\") pod \"cinder-scheduler-0\" (UID: \"48d5e3e9-853c-4988-8746-a6f74e1fe209\") " pod="openstack/cinder-scheduler-0" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.574879 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.587206 4810 generic.go:334] "Generic (PLEG): container finished" podID="faf01cf3-b74b-46d8-b589-05ea0195ac24" containerID="800222f8de7a26806971959afe00821e5af0eb096530f9abfdb9f64bd62612cb" exitCode=0 Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.587282 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79fc56bc44-tfjh4" event={"ID":"faf01cf3-b74b-46d8-b589-05ea0195ac24","Type":"ContainerDied","Data":"800222f8de7a26806971959afe00821e5af0eb096530f9abfdb9f64bd62612cb"} Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.603245 4810 generic.go:334] "Generic (PLEG): container finished" podID="fb06ec19-dfd7-459f-9bfc-6c3e1619abb0" containerID="ec03377efcba956dd84e47222d7f2fa8900a14d68a543b851a67f801c749c23f" exitCode=0 Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.603298 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6cc67d5fc8-hs8lf" event={"ID":"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0","Type":"ContainerDied","Data":"ec03377efcba956dd84e47222d7f2fa8900a14d68a543b851a67f801c749c23f"} Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.604504 4810 generic.go:334] "Generic (PLEG): container finished" podID="e0116ca5-826e-4a77-bc6f-11e89c047af8" containerID="0f0fdb4ec7d48ce5efa4d4ce421e22663d31ffdf324056a1695cd08d0b02000f" exitCode=143 Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.604528 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68ff886dc8-nntj6" event={"ID":"e0116ca5-826e-4a77-bc6f-11e89c047af8","Type":"ContainerDied","Data":"0f0fdb4ec7d48ce5efa4d4ce421e22663d31ffdf324056a1695cd08d0b02000f"} Feb 19 15:30:08 crc kubenswrapper[4810]: I0219 15:30:08.982965 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-79fc56bc44-tfjh4" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.089283 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlkqn\" (UniqueName: \"kubernetes.io/projected/faf01cf3-b74b-46d8-b589-05ea0195ac24-kube-api-access-dlkqn\") pod \"faf01cf3-b74b-46d8-b589-05ea0195ac24\" (UID: \"faf01cf3-b74b-46d8-b589-05ea0195ac24\") " Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.089422 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faf01cf3-b74b-46d8-b589-05ea0195ac24-logs\") pod \"faf01cf3-b74b-46d8-b589-05ea0195ac24\" (UID: \"faf01cf3-b74b-46d8-b589-05ea0195ac24\") " Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.089679 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faf01cf3-b74b-46d8-b589-05ea0195ac24-combined-ca-bundle\") pod \"faf01cf3-b74b-46d8-b589-05ea0195ac24\" (UID: \"faf01cf3-b74b-46d8-b589-05ea0195ac24\") " Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.089732 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faf01cf3-b74b-46d8-b589-05ea0195ac24-config-data\") pod \"faf01cf3-b74b-46d8-b589-05ea0195ac24\" (UID: \"faf01cf3-b74b-46d8-b589-05ea0195ac24\") " Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.089756 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/faf01cf3-b74b-46d8-b589-05ea0195ac24-config-data-custom\") pod \"faf01cf3-b74b-46d8-b589-05ea0195ac24\" (UID: \"faf01cf3-b74b-46d8-b589-05ea0195ac24\") " Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.098768 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faf01cf3-b74b-46d8-b589-05ea0195ac24-kube-api-access-dlkqn" (OuterVolumeSpecName: "kube-api-access-dlkqn") pod "faf01cf3-b74b-46d8-b589-05ea0195ac24" (UID: "faf01cf3-b74b-46d8-b589-05ea0195ac24"). InnerVolumeSpecName "kube-api-access-dlkqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.099679 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faf01cf3-b74b-46d8-b589-05ea0195ac24-logs" (OuterVolumeSpecName: "logs") pod "faf01cf3-b74b-46d8-b589-05ea0195ac24" (UID: "faf01cf3-b74b-46d8-b589-05ea0195ac24"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.123919 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faf01cf3-b74b-46d8-b589-05ea0195ac24-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "faf01cf3-b74b-46d8-b589-05ea0195ac24" (UID: "faf01cf3-b74b-46d8-b589-05ea0195ac24"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.199375 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faf01cf3-b74b-46d8-b589-05ea0195ac24-logs\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.199412 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/faf01cf3-b74b-46d8-b589-05ea0195ac24-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.199427 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlkqn\" (UniqueName: \"kubernetes.io/projected/faf01cf3-b74b-46d8-b589-05ea0195ac24-kube-api-access-dlkqn\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.204733 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faf01cf3-b74b-46d8-b589-05ea0195ac24-config-data" (OuterVolumeSpecName: "config-data") pod "faf01cf3-b74b-46d8-b589-05ea0195ac24" (UID: "faf01cf3-b74b-46d8-b589-05ea0195ac24"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.219408 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faf01cf3-b74b-46d8-b589-05ea0195ac24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "faf01cf3-b74b-46d8-b589-05ea0195ac24" (UID: "faf01cf3-b74b-46d8-b589-05ea0195ac24"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.251539 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-68ff886dc8-nntj6" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.305948 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faf01cf3-b74b-46d8-b589-05ea0195ac24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.305985 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faf01cf3-b74b-46d8-b589-05ea0195ac24-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.358663 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.406934 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-scripts\") pod \"e0116ca5-826e-4a77-bc6f-11e89c047af8\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.407026 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-config-data\") pod \"e0116ca5-826e-4a77-bc6f-11e89c047af8\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.407066 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-internal-tls-certs\") pod \"e0116ca5-826e-4a77-bc6f-11e89c047af8\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.407099 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0116ca5-826e-4a77-bc6f-11e89c047af8-logs\") pod \"e0116ca5-826e-4a77-bc6f-11e89c047af8\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.407138 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-combined-ca-bundle\") pod \"e0116ca5-826e-4a77-bc6f-11e89c047af8\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.407199 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-public-tls-certs\") pod \"e0116ca5-826e-4a77-bc6f-11e89c047af8\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.407289 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ln9c7\" (UniqueName: \"kubernetes.io/projected/e0116ca5-826e-4a77-bc6f-11e89c047af8-kube-api-access-ln9c7\") pod \"e0116ca5-826e-4a77-bc6f-11e89c047af8\" (UID: \"e0116ca5-826e-4a77-bc6f-11e89c047af8\") " Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.408061 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0116ca5-826e-4a77-bc6f-11e89c047af8-logs" (OuterVolumeSpecName: "logs") pod "e0116ca5-826e-4a77-bc6f-11e89c047af8" (UID: "e0116ca5-826e-4a77-bc6f-11e89c047af8"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.415688 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0116ca5-826e-4a77-bc6f-11e89c047af8-kube-api-access-ln9c7" (OuterVolumeSpecName: "kube-api-access-ln9c7") pod "e0116ca5-826e-4a77-bc6f-11e89c047af8" (UID: "e0116ca5-826e-4a77-bc6f-11e89c047af8"). InnerVolumeSpecName "kube-api-access-ln9c7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.426488 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-scripts" (OuterVolumeSpecName: "scripts") pod "e0116ca5-826e-4a77-bc6f-11e89c047af8" (UID: "e0116ca5-826e-4a77-bc6f-11e89c047af8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.465712 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3bd9969-3750-460b-95cd-8c52d2e44d82" path="/var/lib/kubelet/pods/e3bd9969-3750-460b-95cd-8c52d2e44d82/volumes" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.482434 4810 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podcd961c7d-d551-4f5b-a08a-07d088947698"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podcd961c7d-d551-4f5b-a08a-07d088947698] : Timed out while waiting for systemd to remove kubepods-besteffort-podcd961c7d_d551_4f5b_a08a_07d088947698.slice" Feb 19 15:30:09 crc kubenswrapper[4810]: E0219 15:30:09.482481 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podcd961c7d-d551-4f5b-a08a-07d088947698] : unable to destroy cgroup paths for cgroup [kubepods besteffort podcd961c7d-d551-4f5b-a08a-07d088947698] : Timed out while waiting for systemd to remove kubepods-besteffort-podcd961c7d_d551_4f5b_a08a_07d088947698.slice" pod="openstack/watcher-api-0" podUID="cd961c7d-d551-4f5b-a08a-07d088947698" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.510429 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ln9c7\" (UniqueName: \"kubernetes.io/projected/e0116ca5-826e-4a77-bc6f-11e89c047af8-kube-api-access-ln9c7\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.510456 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.510466 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0116ca5-826e-4a77-bc6f-11e89c047af8-logs\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.527647 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0116ca5-826e-4a77-bc6f-11e89c047af8" (UID: "e0116ca5-826e-4a77-bc6f-11e89c047af8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.546801 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-config-data" (OuterVolumeSpecName: "config-data") pod "e0116ca5-826e-4a77-bc6f-11e89c047af8" (UID: "e0116ca5-826e-4a77-bc6f-11e89c047af8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.574420 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e0116ca5-826e-4a77-bc6f-11e89c047af8" (UID: "e0116ca5-826e-4a77-bc6f-11e89c047af8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.599628 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e0116ca5-826e-4a77-bc6f-11e89c047af8" (UID: "e0116ca5-826e-4a77-bc6f-11e89c047af8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.617186 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.617208 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.617218 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.617227 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0116ca5-826e-4a77-bc6f-11e89c047af8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.644041 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-695bb7cdc6-72zs2_f0b73197-3c7e-44c3-8a49-35d9e0a40629/neutron-api/0.log" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.644294 4810 generic.go:334] "Generic (PLEG): container finished" podID="f0b73197-3c7e-44c3-8a49-35d9e0a40629" containerID="fc28d9c929a268b7ca57d72d22325d489071902ea7ff07035df3c9d98d6a728a" exitCode=137 Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.644418 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-695bb7cdc6-72zs2" event={"ID":"f0b73197-3c7e-44c3-8a49-35d9e0a40629","Type":"ContainerDied","Data":"fc28d9c929a268b7ca57d72d22325d489071902ea7ff07035df3c9d98d6a728a"} Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.658918 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79fc56bc44-tfjh4" event={"ID":"faf01cf3-b74b-46d8-b589-05ea0195ac24","Type":"ContainerDied","Data":"9a79a3684682be966eafd1448e2f56c188e39ccc6a243e2540b5912c1bbf9c6c"} Feb 19 
15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.658958 4810 scope.go:117] "RemoveContainer" containerID="800222f8de7a26806971959afe00821e5af0eb096530f9abfdb9f64bd62612cb" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.659060 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-79fc56bc44-tfjh4" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.672116 4810 generic.go:334] "Generic (PLEG): container finished" podID="e0116ca5-826e-4a77-bc6f-11e89c047af8" containerID="4ac6a4c63292ab530356cb491f1c19b551ba50fa93c7f50d74148f2ff94c41a3" exitCode=0 Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.673364 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68ff886dc8-nntj6" event={"ID":"e0116ca5-826e-4a77-bc6f-11e89c047af8","Type":"ContainerDied","Data":"4ac6a4c63292ab530356cb491f1c19b551ba50fa93c7f50d74148f2ff94c41a3"} Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.673504 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68ff886dc8-nntj6" event={"ID":"e0116ca5-826e-4a77-bc6f-11e89c047af8","Type":"ContainerDied","Data":"4f57c96680174420000630479bd05d2a43d222e27a564e7d19f21df9961dd34c"} Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.673646 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-68ff886dc8-nntj6" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.687065 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.687885 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"48d5e3e9-853c-4988-8746-a6f74e1fe209","Type":"ContainerStarted","Data":"40c17b4503d159ceb58fb8b116d2ec739c336b17c2e781d635cec7e728be1677"} Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.692498 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-79fc56bc44-tfjh4"] Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.700297 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-79fc56bc44-tfjh4"] Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.718218 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.742098 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.754564 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Feb 19 15:30:09 crc kubenswrapper[4810]: E0219 15:30:09.754950 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0116ca5-826e-4a77-bc6f-11e89c047af8" containerName="placement-api" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.754962 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0116ca5-826e-4a77-bc6f-11e89c047af8" containerName="placement-api" Feb 19 15:30:09 crc kubenswrapper[4810]: E0219 15:30:09.754976 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faf01cf3-b74b-46d8-b589-05ea0195ac24" containerName="barbican-api" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.754981 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="faf01cf3-b74b-46d8-b589-05ea0195ac24" containerName="barbican-api" Feb 19 15:30:09 crc kubenswrapper[4810]: E0219 15:30:09.754992 4810 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e0116ca5-826e-4a77-bc6f-11e89c047af8" containerName="placement-log" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.754997 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0116ca5-826e-4a77-bc6f-11e89c047af8" containerName="placement-log" Feb 19 15:30:09 crc kubenswrapper[4810]: E0219 15:30:09.755019 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faf01cf3-b74b-46d8-b589-05ea0195ac24" containerName="barbican-api-log" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.755025 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="faf01cf3-b74b-46d8-b589-05ea0195ac24" containerName="barbican-api-log" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.755220 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0116ca5-826e-4a77-bc6f-11e89c047af8" containerName="placement-log" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.755234 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="faf01cf3-b74b-46d8-b589-05ea0195ac24" containerName="barbican-api" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.755244 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="faf01cf3-b74b-46d8-b589-05ea0195ac24" containerName="barbican-api-log" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.755253 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0116ca5-826e-4a77-bc6f-11e89c047af8" containerName="placement-api" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.756185 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.772480 4810 scope.go:117] "RemoveContainer" containerID="6c0cc91325e6d2565c8ddd9b1eede9d65e56a0efbeee132cbd5a27f02f0ed7ef" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.774835 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.793730 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.794355 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.794608 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.813918 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-68ff886dc8-nntj6"] Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.849833 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-68ff886dc8-nntj6"] Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.882936 4810 scope.go:117] "RemoveContainer" containerID="4ac6a4c63292ab530356cb491f1c19b551ba50fa93c7f50d74148f2ff94c41a3" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.940897 4810 scope.go:117] "RemoveContainer" containerID="0f0fdb4ec7d48ce5efa4d4ce421e22663d31ffdf324056a1695cd08d0b02000f" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.942825 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-public-tls-certs\") pod \"watcher-api-0\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " 
pod="openstack/watcher-api-0" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.942919 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " pod="openstack/watcher-api-0" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.942971 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c772672c-c983-42e8-ae77-bfc8484ad555-logs\") pod \"watcher-api-0\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " pod="openstack/watcher-api-0" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.943045 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx6gm\" (UniqueName: \"kubernetes.io/projected/c772672c-c983-42e8-ae77-bfc8484ad555-kube-api-access-vx6gm\") pod \"watcher-api-0\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " pod="openstack/watcher-api-0" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.943122 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " pod="openstack/watcher-api-0" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.943164 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " pod="openstack/watcher-api-0" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.943192 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-config-data\") pod \"watcher-api-0\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " pod="openstack/watcher-api-0" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.977546 4810 scope.go:117] "RemoveContainer" containerID="4ac6a4c63292ab530356cb491f1c19b551ba50fa93c7f50d74148f2ff94c41a3" Feb 19 15:30:09 crc kubenswrapper[4810]: E0219 15:30:09.978590 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ac6a4c63292ab530356cb491f1c19b551ba50fa93c7f50d74148f2ff94c41a3\": container with ID starting with 4ac6a4c63292ab530356cb491f1c19b551ba50fa93c7f50d74148f2ff94c41a3 not found: ID does not exist" containerID="4ac6a4c63292ab530356cb491f1c19b551ba50fa93c7f50d74148f2ff94c41a3" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.978632 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ac6a4c63292ab530356cb491f1c19b551ba50fa93c7f50d74148f2ff94c41a3"} err="failed to get container status \"4ac6a4c63292ab530356cb491f1c19b551ba50fa93c7f50d74148f2ff94c41a3\": rpc error: code = NotFound desc = could not find container \"4ac6a4c63292ab530356cb491f1c19b551ba50fa93c7f50d74148f2ff94c41a3\": container with ID starting with 4ac6a4c63292ab530356cb491f1c19b551ba50fa93c7f50d74148f2ff94c41a3 not found: ID does not exist" Feb 19 15:30:09 
crc kubenswrapper[4810]: I0219 15:30:09.978663 4810 scope.go:117] "RemoveContainer" containerID="0f0fdb4ec7d48ce5efa4d4ce421e22663d31ffdf324056a1695cd08d0b02000f" Feb 19 15:30:09 crc kubenswrapper[4810]: E0219 15:30:09.979292 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f0fdb4ec7d48ce5efa4d4ce421e22663d31ffdf324056a1695cd08d0b02000f\": container with ID starting with 0f0fdb4ec7d48ce5efa4d4ce421e22663d31ffdf324056a1695cd08d0b02000f not found: ID does not exist" containerID="0f0fdb4ec7d48ce5efa4d4ce421e22663d31ffdf324056a1695cd08d0b02000f" Feb 19 15:30:09 crc kubenswrapper[4810]: I0219 15:30:09.979315 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f0fdb4ec7d48ce5efa4d4ce421e22663d31ffdf324056a1695cd08d0b02000f"} err="failed to get container status \"0f0fdb4ec7d48ce5efa4d4ce421e22663d31ffdf324056a1695cd08d0b02000f\": rpc error: code = NotFound desc = could not find container \"0f0fdb4ec7d48ce5efa4d4ce421e22663d31ffdf324056a1695cd08d0b02000f\": container with ID starting with 0f0fdb4ec7d48ce5efa4d4ce421e22663d31ffdf324056a1695cd08d0b02000f not found: ID does not exist" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.045064 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " pod="openstack/watcher-api-0" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.045132 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c772672c-c983-42e8-ae77-bfc8484ad555-logs\") pod \"watcher-api-0\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " pod="openstack/watcher-api-0" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.045167 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx6gm\" (UniqueName: \"kubernetes.io/projected/c772672c-c983-42e8-ae77-bfc8484ad555-kube-api-access-vx6gm\") pod \"watcher-api-0\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " pod="openstack/watcher-api-0" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.045234 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " pod="openstack/watcher-api-0" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.045257 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-config-data\") pod \"watcher-api-0\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " pod="openstack/watcher-api-0" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.045271 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " pod="openstack/watcher-api-0" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.045314 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-public-tls-certs\") pod \"watcher-api-0\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " pod="openstack/watcher-api-0" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.048165 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c772672c-c983-42e8-ae77-bfc8484ad555-logs\") pod \"watcher-api-0\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " pod="openstack/watcher-api-0" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.059146 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " pod="openstack/watcher-api-0" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.059896 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-public-tls-certs\") pod \"watcher-api-0\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " pod="openstack/watcher-api-0" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.064664 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " pod="openstack/watcher-api-0" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.066296 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " pod="openstack/watcher-api-0" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.066883 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx6gm\" (UniqueName: \"kubernetes.io/projected/c772672c-c983-42e8-ae77-bfc8484ad555-kube-api-access-vx6gm\") pod \"watcher-api-0\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " pod="openstack/watcher-api-0" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.068245 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-config-data\") pod \"watcher-api-0\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " pod="openstack/watcher-api-0" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.153702 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-695bb7cdc6-72zs2_f0b73197-3c7e-44c3-8a49-35d9e0a40629/neutron-api/0.log" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.153783 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-695bb7cdc6-72zs2" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.183057 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.249206 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f0b73197-3c7e-44c3-8a49-35d9e0a40629-httpd-config\") pod \"f0b73197-3c7e-44c3-8a49-35d9e0a40629\" (UID: \"f0b73197-3c7e-44c3-8a49-35d9e0a40629\") " Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.249299 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drqw4\" (UniqueName: \"kubernetes.io/projected/f0b73197-3c7e-44c3-8a49-35d9e0a40629-kube-api-access-drqw4\") pod \"f0b73197-3c7e-44c3-8a49-35d9e0a40629\" (UID: \"f0b73197-3c7e-44c3-8a49-35d9e0a40629\") " Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.249338 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0b73197-3c7e-44c3-8a49-35d9e0a40629-ovndb-tls-certs\") pod \"f0b73197-3c7e-44c3-8a49-35d9e0a40629\" (UID: \"f0b73197-3c7e-44c3-8a49-35d9e0a40629\") " Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.249447 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f0b73197-3c7e-44c3-8a49-35d9e0a40629-config\") pod \"f0b73197-3c7e-44c3-8a49-35d9e0a40629\" (UID: \"f0b73197-3c7e-44c3-8a49-35d9e0a40629\") " Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.249604 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0b73197-3c7e-44c3-8a49-35d9e0a40629-combined-ca-bundle\") pod \"f0b73197-3c7e-44c3-8a49-35d9e0a40629\" (UID: \"f0b73197-3c7e-44c3-8a49-35d9e0a40629\") " Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.255512 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0b73197-3c7e-44c3-8a49-35d9e0a40629-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f0b73197-3c7e-44c3-8a49-35d9e0a40629" (UID: "f0b73197-3c7e-44c3-8a49-35d9e0a40629"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.260482 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0b73197-3c7e-44c3-8a49-35d9e0a40629-kube-api-access-drqw4" (OuterVolumeSpecName: "kube-api-access-drqw4") pod "f0b73197-3c7e-44c3-8a49-35d9e0a40629" (UID: "f0b73197-3c7e-44c3-8a49-35d9e0a40629"). InnerVolumeSpecName "kube-api-access-drqw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.323458 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0b73197-3c7e-44c3-8a49-35d9e0a40629-config" (OuterVolumeSpecName: "config") pod "f0b73197-3c7e-44c3-8a49-35d9e0a40629" (UID: "f0b73197-3c7e-44c3-8a49-35d9e0a40629"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.349170 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0b73197-3c7e-44c3-8a49-35d9e0a40629-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0b73197-3c7e-44c3-8a49-35d9e0a40629" (UID: "f0b73197-3c7e-44c3-8a49-35d9e0a40629"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.355751 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0b73197-3c7e-44c3-8a49-35d9e0a40629-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.355781 4810 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f0b73197-3c7e-44c3-8a49-35d9e0a40629-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.355790 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drqw4\" (UniqueName: \"kubernetes.io/projected/f0b73197-3c7e-44c3-8a49-35d9e0a40629-kube-api-access-drqw4\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.355801 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f0b73197-3c7e-44c3-8a49-35d9e0a40629-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.434027 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0b73197-3c7e-44c3-8a49-35d9e0a40629-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f0b73197-3c7e-44c3-8a49-35d9e0a40629" (UID: "f0b73197-3c7e-44c3-8a49-35d9e0a40629"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.457989 4810 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0b73197-3c7e-44c3-8a49-35d9e0a40629-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.720654 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"48d5e3e9-853c-4988-8746-a6f74e1fe209","Type":"ContainerStarted","Data":"5e8701206c646e0d74b8b6626ec698e8e8367005c4d8a1860077b7cc9641a1b4"} Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.736815 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-695bb7cdc6-72zs2_f0b73197-3c7e-44c3-8a49-35d9e0a40629/neutron-api/0.log" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.736879 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-695bb7cdc6-72zs2" event={"ID":"f0b73197-3c7e-44c3-8a49-35d9e0a40629","Type":"ContainerDied","Data":"b18b549a747d3077270169b17318853029444572e9290aac2a4fb288910d92c3"} Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.736912 4810 scope.go:117] "RemoveContainer" containerID="dc4301bc9761619133902901d1997ab38671d0d229f4feca664d5a1fa8e30e7a" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.736979 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-695bb7cdc6-72zs2" Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.752647 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.795059 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-695bb7cdc6-72zs2"] Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.824202 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-695bb7cdc6-72zs2"] Feb 19 15:30:10 crc kubenswrapper[4810]: I0219 15:30:10.824473 4810 scope.go:117] "RemoveContainer" containerID="fc28d9c929a268b7ca57d72d22325d489071902ea7ff07035df3c9d98d6a728a" Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.284762 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 19 15:30:11 crc kubenswrapper[4810]: E0219 15:30:11.285189 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0b73197-3c7e-44c3-8a49-35d9e0a40629" containerName="neutron-api" Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.285202 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0b73197-3c7e-44c3-8a49-35d9e0a40629" containerName="neutron-api" Feb 19 15:30:11 crc kubenswrapper[4810]: E0219 15:30:11.285237 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0b73197-3c7e-44c3-8a49-35d9e0a40629" containerName="neutron-httpd" Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.285243 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0b73197-3c7e-44c3-8a49-35d9e0a40629" containerName="neutron-httpd" Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.285466 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0b73197-3c7e-44c3-8a49-35d9e0a40629" containerName="neutron-httpd" Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.285483 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0b73197-3c7e-44c3-8a49-35d9e0a40629" containerName="neutron-api" Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.286083 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.290127 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.290354 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-znmn2" Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.290479 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.324612 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.392167 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fa26ff13-0dc1-4e6b-a9a8-80177afa17af-openstack-config-secret\") pod \"openstackclient\" (UID: \"fa26ff13-0dc1-4e6b-a9a8-80177afa17af\") " pod="openstack/openstackclient" Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.392219 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fa26ff13-0dc1-4e6b-a9a8-80177afa17af-openstack-config\") pod \"openstackclient\" (UID: \"fa26ff13-0dc1-4e6b-a9a8-80177afa17af\") " pod="openstack/openstackclient" Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.392299 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp9xc\" (UniqueName: \"kubernetes.io/projected/fa26ff13-0dc1-4e6b-a9a8-80177afa17af-kube-api-access-jp9xc\") pod \"openstackclient\" (UID: \"fa26ff13-0dc1-4e6b-a9a8-80177afa17af\") " pod="openstack/openstackclient" Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.392350 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa26ff13-0dc1-4e6b-a9a8-80177afa17af-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fa26ff13-0dc1-4e6b-a9a8-80177afa17af\") " pod="openstack/openstackclient" Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.450565 4810 scope.go:117] "RemoveContainer" containerID="a8232fee767eaee3fa026223e20673e25fa6d954ddb913e148bf6e0e435de416" Feb 19 15:30:11 crc kubenswrapper[4810]: E0219 15:30:11.450901 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(3eb2dccd-c5dc-436f-b7a6-954af7bc51c5)\"" pod="openstack/watcher-decision-engine-0" podUID="3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.456827 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd961c7d-d551-4f5b-a08a-07d088947698" path="/var/lib/kubelet/pods/cd961c7d-d551-4f5b-a08a-07d088947698/volumes" Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.457860 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0116ca5-826e-4a77-bc6f-11e89c047af8" path="/var/lib/kubelet/pods/e0116ca5-826e-4a77-bc6f-11e89c047af8/volumes" Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.458708 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f0b73197-3c7e-44c3-8a49-35d9e0a40629" path="/var/lib/kubelet/pods/f0b73197-3c7e-44c3-8a49-35d9e0a40629/volumes"
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.460099 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faf01cf3-b74b-46d8-b589-05ea0195ac24" path="/var/lib/kubelet/pods/faf01cf3-b74b-46d8-b589-05ea0195ac24/volumes"
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.496255 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp9xc\" (UniqueName: \"kubernetes.io/projected/fa26ff13-0dc1-4e6b-a9a8-80177afa17af-kube-api-access-jp9xc\") pod \"openstackclient\" (UID: \"fa26ff13-0dc1-4e6b-a9a8-80177afa17af\") " pod="openstack/openstackclient"
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.496538 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa26ff13-0dc1-4e6b-a9a8-80177afa17af-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fa26ff13-0dc1-4e6b-a9a8-80177afa17af\") " pod="openstack/openstackclient"
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.496694 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fa26ff13-0dc1-4e6b-a9a8-80177afa17af-openstack-config-secret\") pod \"openstackclient\" (UID: \"fa26ff13-0dc1-4e6b-a9a8-80177afa17af\") " pod="openstack/openstackclient"
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.496783 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fa26ff13-0dc1-4e6b-a9a8-80177afa17af-openstack-config\") pod \"openstackclient\" (UID: \"fa26ff13-0dc1-4e6b-a9a8-80177afa17af\") " pod="openstack/openstackclient"
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.497651 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fa26ff13-0dc1-4e6b-a9a8-80177afa17af-openstack-config\") pod \"openstackclient\" (UID: \"fa26ff13-0dc1-4e6b-a9a8-80177afa17af\") " pod="openstack/openstackclient"
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.501197 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fa26ff13-0dc1-4e6b-a9a8-80177afa17af-openstack-config-secret\") pod \"openstackclient\" (UID: \"fa26ff13-0dc1-4e6b-a9a8-80177afa17af\") " pod="openstack/openstackclient"
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.501588 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa26ff13-0dc1-4e6b-a9a8-80177afa17af-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fa26ff13-0dc1-4e6b-a9a8-80177afa17af\") " pod="openstack/openstackclient"
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.514499 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp9xc\" (UniqueName: \"kubernetes.io/projected/fa26ff13-0dc1-4e6b-a9a8-80177afa17af-kube-api-access-jp9xc\") pod \"openstackclient\" (UID: \"fa26ff13-0dc1-4e6b-a9a8-80177afa17af\") " pod="openstack/openstackclient"
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.619978 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.630874 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.642400 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.666567 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.668282 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.691373 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.806930 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"48d5e3e9-853c-4988-8746-a6f74e1fe209","Type":"ContainerStarted","Data":"eddd4db00bb1996ace3587f0150b3115cfcb9b2033d3154b181b637f959bec85"}
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.813043 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca8eb29b-bb26-446f-8a22-5da13ff9d5fa-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ca8eb29b-bb26-446f-8a22-5da13ff9d5fa\") " pod="openstack/openstackclient"
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.813111 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ca8eb29b-bb26-446f-8a22-5da13ff9d5fa-openstack-config\") pod \"openstackclient\" (UID: \"ca8eb29b-bb26-446f-8a22-5da13ff9d5fa\") " pod="openstack/openstackclient"
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.813251 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrb2g\" (UniqueName: \"kubernetes.io/projected/ca8eb29b-bb26-446f-8a22-5da13ff9d5fa-kube-api-access-mrb2g\") pod \"openstackclient\" (UID: \"ca8eb29b-bb26-446f-8a22-5da13ff9d5fa\") " pod="openstack/openstackclient"
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.813445 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ca8eb29b-bb26-446f-8a22-5da13ff9d5fa-openstack-config-secret\") pod \"openstackclient\" (UID: \"ca8eb29b-bb26-446f-8a22-5da13ff9d5fa\") " pod="openstack/openstackclient"
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.825112 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.836857 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.836835585 podStartE2EDuration="3.836835585s" podCreationTimestamp="2026-02-19 15:30:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:30:11.832092229 +0000 UTC m=+1241.314122353" watchObservedRunningTime="2026-02-19 15:30:11.836835585 +0000 UTC m=+1241.318865719"
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.839302 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c772672c-c983-42e8-ae77-bfc8484ad555","Type":"ContainerStarted","Data":"d9c5f503f212d6aa9b0ddd5ddc6ff440b6c542c6960be28ea86ea203ab7ca92d"}
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.839353 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c772672c-c983-42e8-ae77-bfc8484ad555","Type":"ContainerStarted","Data":"a3586f026120011072c0978ee9277dd85cce4dc7aaed51f25a7bea5aff2ae758"}
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.839363 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c772672c-c983-42e8-ae77-bfc8484ad555","Type":"ContainerStarted","Data":"e3006b677518772d18ede2c0df9a671f7e5d00f39c12f380e32799bbd51a8cab"}
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.840349 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Feb 19 15:30:11 crc kubenswrapper[4810]: E0219 15:30:11.861686 4810 log.go:32] "RunPodSandbox from runtime service failed" err=<
Feb 19 15:30:11 crc kubenswrapper[4810]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_fa26ff13-0dc1-4e6b-a9a8-80177afa17af_0(a09d4e2428d1f78397a2986858e27dcb31ce24f328f9b342286b54d06d27fdce): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a09d4e2428d1f78397a2986858e27dcb31ce24f328f9b342286b54d06d27fdce" Netns:"/var/run/netns/f3c74cd4-edce-430c-ab15-13ba451c1e9b" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=a09d4e2428d1f78397a2986858e27dcb31ce24f328f9b342286b54d06d27fdce;K8S_POD_UID=fa26ff13-0dc1-4e6b-a9a8-80177afa17af" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/fa26ff13-0dc1-4e6b-a9a8-80177afa17af]: expected pod UID "fa26ff13-0dc1-4e6b-a9a8-80177afa17af" but got "ca8eb29b-bb26-446f-8a22-5da13ff9d5fa" from Kube API
Feb 19 15:30:11 crc kubenswrapper[4810]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Feb 19 15:30:11 crc kubenswrapper[4810]: >
Feb 19 15:30:11 crc kubenswrapper[4810]: E0219 15:30:11.861750 4810 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=<
Feb 19 15:30:11 crc kubenswrapper[4810]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_fa26ff13-0dc1-4e6b-a9a8-80177afa17af_0(a09d4e2428d1f78397a2986858e27dcb31ce24f328f9b342286b54d06d27fdce): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a09d4e2428d1f78397a2986858e27dcb31ce24f328f9b342286b54d06d27fdce" Netns:"/var/run/netns/f3c74cd4-edce-430c-ab15-13ba451c1e9b" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=a09d4e2428d1f78397a2986858e27dcb31ce24f328f9b342286b54d06d27fdce;K8S_POD_UID=fa26ff13-0dc1-4e6b-a9a8-80177afa17af" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/fa26ff13-0dc1-4e6b-a9a8-80177afa17af]: expected pod UID "fa26ff13-0dc1-4e6b-a9a8-80177afa17af" but got "ca8eb29b-bb26-446f-8a22-5da13ff9d5fa" from Kube API
Feb 19 15:30:11 crc kubenswrapper[4810]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Feb 19 15:30:11 crc kubenswrapper[4810]: > pod="openstack/openstackclient"
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.888314 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=2.888294578 podStartE2EDuration="2.888294578s" podCreationTimestamp="2026-02-19 15:30:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:30:11.876381175 +0000 UTC m=+1241.358411299" watchObservedRunningTime="2026-02-19 15:30:11.888294578 +0000 UTC m=+1241.370324702"
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.914811 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ca8eb29b-bb26-446f-8a22-5da13ff9d5fa-openstack-config-secret\") pod \"openstackclient\" (UID: \"ca8eb29b-bb26-446f-8a22-5da13ff9d5fa\") " pod="openstack/openstackclient"
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.914880 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca8eb29b-bb26-446f-8a22-5da13ff9d5fa-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ca8eb29b-bb26-446f-8a22-5da13ff9d5fa\") " pod="openstack/openstackclient"
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.914914 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ca8eb29b-bb26-446f-8a22-5da13ff9d5fa-openstack-config\") pod \"openstackclient\" (UID: \"ca8eb29b-bb26-446f-8a22-5da13ff9d5fa\") " pod="openstack/openstackclient"
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.915000 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrb2g\" (UniqueName: \"kubernetes.io/projected/ca8eb29b-bb26-446f-8a22-5da13ff9d5fa-kube-api-access-mrb2g\") pod \"openstackclient\" (UID: \"ca8eb29b-bb26-446f-8a22-5da13ff9d5fa\") " pod="openstack/openstackclient"
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.916482 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ca8eb29b-bb26-446f-8a22-5da13ff9d5fa-openstack-config\") pod \"openstackclient\" (UID: \"ca8eb29b-bb26-446f-8a22-5da13ff9d5fa\") " pod="openstack/openstackclient"
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.920868 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca8eb29b-bb26-446f-8a22-5da13ff9d5fa-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ca8eb29b-bb26-446f-8a22-5da13ff9d5fa\") " pod="openstack/openstackclient"
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.927753 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ca8eb29b-bb26-446f-8a22-5da13ff9d5fa-openstack-config-secret\") pod \"openstackclient\" (UID: \"ca8eb29b-bb26-446f-8a22-5da13ff9d5fa\") " pod="openstack/openstackclient"
Feb 19 15:30:11 crc kubenswrapper[4810]: I0219 15:30:11.940003 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrb2g\" (UniqueName: \"kubernetes.io/projected/ca8eb29b-bb26-446f-8a22-5da13ff9d5fa-kube-api-access-mrb2g\") pod \"openstackclient\" (UID: \"ca8eb29b-bb26-446f-8a22-5da13ff9d5fa\") " pod="openstack/openstackclient"
Feb 19 15:30:12 crc kubenswrapper[4810]: I0219 15:30:12.011709 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 19 15:30:12 crc kubenswrapper[4810]: I0219 15:30:12.567872 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 19 15:30:12 crc kubenswrapper[4810]: I0219 15:30:12.854798 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ca8eb29b-bb26-446f-8a22-5da13ff9d5fa","Type":"ContainerStarted","Data":"f38b0fd50dec54e94b9eec955297a6f01f6925bad6cb7973d5c0603fb8bdee87"}
Feb 19 15:30:12 crc kubenswrapper[4810]: I0219 15:30:12.855100 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 19 15:30:12 crc kubenswrapper[4810]: I0219 15:30:12.859052 4810 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="fa26ff13-0dc1-4e6b-a9a8-80177afa17af" podUID="ca8eb29b-bb26-446f-8a22-5da13ff9d5fa"
Feb 19 15:30:12 crc kubenswrapper[4810]: I0219 15:30:12.864701 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 19 15:30:13 crc kubenswrapper[4810]: I0219 15:30:13.032756 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa26ff13-0dc1-4e6b-a9a8-80177afa17af-combined-ca-bundle\") pod \"fa26ff13-0dc1-4e6b-a9a8-80177afa17af\" (UID: \"fa26ff13-0dc1-4e6b-a9a8-80177afa17af\") "
Feb 19 15:30:13 crc kubenswrapper[4810]: I0219 15:30:13.033018 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fa26ff13-0dc1-4e6b-a9a8-80177afa17af-openstack-config-secret\") pod \"fa26ff13-0dc1-4e6b-a9a8-80177afa17af\" (UID: \"fa26ff13-0dc1-4e6b-a9a8-80177afa17af\") "
Feb 19 15:30:13 crc kubenswrapper[4810]: I0219 15:30:13.033076 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fa26ff13-0dc1-4e6b-a9a8-80177afa17af-openstack-config\") pod \"fa26ff13-0dc1-4e6b-a9a8-80177afa17af\" (UID: \"fa26ff13-0dc1-4e6b-a9a8-80177afa17af\") "
Feb 19 15:30:13 crc kubenswrapper[4810]: I0219 15:30:13.033134 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jp9xc\" (UniqueName: \"kubernetes.io/projected/fa26ff13-0dc1-4e6b-a9a8-80177afa17af-kube-api-access-jp9xc\") pod \"fa26ff13-0dc1-4e6b-a9a8-80177afa17af\" (UID: \"fa26ff13-0dc1-4e6b-a9a8-80177afa17af\") "
Feb 19 15:30:13 crc kubenswrapper[4810]: I0219 15:30:13.033720 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa26ff13-0dc1-4e6b-a9a8-80177afa17af-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "fa26ff13-0dc1-4e6b-a9a8-80177afa17af" (UID: "fa26ff13-0dc1-4e6b-a9a8-80177afa17af"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 15:30:13 crc kubenswrapper[4810]: I0219 15:30:13.039467 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa26ff13-0dc1-4e6b-a9a8-80177afa17af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa26ff13-0dc1-4e6b-a9a8-80177afa17af" (UID: "fa26ff13-0dc1-4e6b-a9a8-80177afa17af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:30:13 crc kubenswrapper[4810]: I0219 15:30:13.052468 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa26ff13-0dc1-4e6b-a9a8-80177afa17af-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "fa26ff13-0dc1-4e6b-a9a8-80177afa17af" (UID: "fa26ff13-0dc1-4e6b-a9a8-80177afa17af"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:30:13 crc kubenswrapper[4810]: I0219 15:30:13.069226 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa26ff13-0dc1-4e6b-a9a8-80177afa17af-kube-api-access-jp9xc" (OuterVolumeSpecName: "kube-api-access-jp9xc") pod "fa26ff13-0dc1-4e6b-a9a8-80177afa17af" (UID: "fa26ff13-0dc1-4e6b-a9a8-80177afa17af"). InnerVolumeSpecName "kube-api-access-jp9xc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:30:13 crc kubenswrapper[4810]: I0219 15:30:13.137384 4810 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fa26ff13-0dc1-4e6b-a9a8-80177afa17af-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Feb 19 15:30:13 crc kubenswrapper[4810]: I0219 15:30:13.137418 4810 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fa26ff13-0dc1-4e6b-a9a8-80177afa17af-openstack-config\") on node \"crc\" DevicePath \"\""
Feb 19 15:30:13 crc kubenswrapper[4810]: I0219 15:30:13.137438 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jp9xc\" (UniqueName: \"kubernetes.io/projected/fa26ff13-0dc1-4e6b-a9a8-80177afa17af-kube-api-access-jp9xc\") on node \"crc\" DevicePath \"\""
Feb 19 15:30:13 crc kubenswrapper[4810]: I0219 15:30:13.137449 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa26ff13-0dc1-4e6b-a9a8-80177afa17af-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 15:30:13 crc kubenswrapper[4810]: I0219 15:30:13.458594 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa26ff13-0dc1-4e6b-a9a8-80177afa17af" path="/var/lib/kubelet/pods/fa26ff13-0dc1-4e6b-a9a8-80177afa17af/volumes"
Feb 19 15:30:13 crc kubenswrapper[4810]: I0219 15:30:13.576236 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Feb 19 15:30:13 crc kubenswrapper[4810]: I0219 15:30:13.861583 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 19 15:30:13 crc kubenswrapper[4810]: I0219 15:30:13.861677 4810 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 19 15:30:13 crc kubenswrapper[4810]: I0219 15:30:13.869807 4810 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="fa26ff13-0dc1-4e6b-a9a8-80177afa17af" podUID="ca8eb29b-bb26-446f-8a22-5da13ff9d5fa"
Feb 19 15:30:14 crc kubenswrapper[4810]: I0219 15:30:14.154381 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Feb 19 15:30:15 crc kubenswrapper[4810]: I0219 15:30:15.183546 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Feb 19 15:30:15 crc kubenswrapper[4810]: I0219 15:30:15.824226 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-78bc5d479f-k79xx"]
Feb 19 15:30:15 crc kubenswrapper[4810]: I0219 15:30:15.827908 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:15 crc kubenswrapper[4810]: I0219 15:30:15.832231 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Feb 19 15:30:15 crc kubenswrapper[4810]: I0219 15:30:15.832253 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Feb 19 15:30:15 crc kubenswrapper[4810]: I0219 15:30:15.832421 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Feb 19 15:30:15 crc kubenswrapper[4810]: I0219 15:30:15.847985 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-78bc5d479f-k79xx"]
Feb 19 15:30:15 crc kubenswrapper[4810]: I0219 15:30:15.910171 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpx4p\" (UniqueName: \"kubernetes.io/projected/9190a865-226b-487c-b0f9-2573f50f0eab-kube-api-access-fpx4p\") pod \"swift-proxy-78bc5d479f-k79xx\" (UID: \"9190a865-226b-487c-b0f9-2573f50f0eab\") " pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:15 crc kubenswrapper[4810]: I0219 15:30:15.910222 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9190a865-226b-487c-b0f9-2573f50f0eab-public-tls-certs\") pod \"swift-proxy-78bc5d479f-k79xx\" (UID: \"9190a865-226b-487c-b0f9-2573f50f0eab\") " pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:15 crc kubenswrapper[4810]: I0219 15:30:15.910623 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9190a865-226b-487c-b0f9-2573f50f0eab-combined-ca-bundle\") pod \"swift-proxy-78bc5d479f-k79xx\" (UID: \"9190a865-226b-487c-b0f9-2573f50f0eab\") " pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:15 crc kubenswrapper[4810]: I0219 15:30:15.910737 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9190a865-226b-487c-b0f9-2573f50f0eab-log-httpd\") pod \"swift-proxy-78bc5d479f-k79xx\" (UID: \"9190a865-226b-487c-b0f9-2573f50f0eab\") " pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:15 crc kubenswrapper[4810]: I0219 15:30:15.910852 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9190a865-226b-487c-b0f9-2573f50f0eab-etc-swift\") pod \"swift-proxy-78bc5d479f-k79xx\" (UID: \"9190a865-226b-487c-b0f9-2573f50f0eab\") " pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:15 crc kubenswrapper[4810]: I0219 15:30:15.911029 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9190a865-226b-487c-b0f9-2573f50f0eab-internal-tls-certs\") pod \"swift-proxy-78bc5d479f-k79xx\" (UID: \"9190a865-226b-487c-b0f9-2573f50f0eab\") " pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:15 crc kubenswrapper[4810]: I0219 15:30:15.911126 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9190a865-226b-487c-b0f9-2573f50f0eab-config-data\") pod \"swift-proxy-78bc5d479f-k79xx\" (UID: \"9190a865-226b-487c-b0f9-2573f50f0eab\") " pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:15 crc kubenswrapper[4810]: I0219 15:30:15.911420 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9190a865-226b-487c-b0f9-2573f50f0eab-run-httpd\") pod \"swift-proxy-78bc5d479f-k79xx\" (UID: \"9190a865-226b-487c-b0f9-2573f50f0eab\") " pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.013208 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9190a865-226b-487c-b0f9-2573f50f0eab-public-tls-certs\") pod \"swift-proxy-78bc5d479f-k79xx\" (UID: \"9190a865-226b-487c-b0f9-2573f50f0eab\") " pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.013270 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9190a865-226b-487c-b0f9-2573f50f0eab-combined-ca-bundle\") pod \"swift-proxy-78bc5d479f-k79xx\" (UID: \"9190a865-226b-487c-b0f9-2573f50f0eab\") " pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.013292 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9190a865-226b-487c-b0f9-2573f50f0eab-log-httpd\") pod \"swift-proxy-78bc5d479f-k79xx\" (UID: \"9190a865-226b-487c-b0f9-2573f50f0eab\") " pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.013318 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9190a865-226b-487c-b0f9-2573f50f0eab-etc-swift\") pod \"swift-proxy-78bc5d479f-k79xx\" (UID: \"9190a865-226b-487c-b0f9-2573f50f0eab\") " pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.013383 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9190a865-226b-487c-b0f9-2573f50f0eab-internal-tls-certs\") pod \"swift-proxy-78bc5d479f-k79xx\" (UID: \"9190a865-226b-487c-b0f9-2573f50f0eab\") " pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.013404 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9190a865-226b-487c-b0f9-2573f50f0eab-config-data\") pod \"swift-proxy-78bc5d479f-k79xx\" (UID: \"9190a865-226b-487c-b0f9-2573f50f0eab\") " pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.013474 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9190a865-226b-487c-b0f9-2573f50f0eab-run-httpd\") pod \"swift-proxy-78bc5d479f-k79xx\" (UID: \"9190a865-226b-487c-b0f9-2573f50f0eab\") " pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.013506 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpx4p\" (UniqueName: \"kubernetes.io/projected/9190a865-226b-487c-b0f9-2573f50f0eab-kube-api-access-fpx4p\") pod \"swift-proxy-78bc5d479f-k79xx\" (UID: \"9190a865-226b-487c-b0f9-2573f50f0eab\") " pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.014010 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9190a865-226b-487c-b0f9-2573f50f0eab-log-httpd\") pod \"swift-proxy-78bc5d479f-k79xx\" (UID: \"9190a865-226b-487c-b0f9-2573f50f0eab\") " pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.015024 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9190a865-226b-487c-b0f9-2573f50f0eab-run-httpd\") pod \"swift-proxy-78bc5d479f-k79xx\" (UID: \"9190a865-226b-487c-b0f9-2573f50f0eab\") " pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.032311 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9190a865-226b-487c-b0f9-2573f50f0eab-internal-tls-certs\") pod \"swift-proxy-78bc5d479f-k79xx\" (UID: \"9190a865-226b-487c-b0f9-2573f50f0eab\") " pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.032355 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9190a865-226b-487c-b0f9-2573f50f0eab-public-tls-certs\") pod \"swift-proxy-78bc5d479f-k79xx\" (UID: \"9190a865-226b-487c-b0f9-2573f50f0eab\") " pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.036344 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9190a865-226b-487c-b0f9-2573f50f0eab-config-data\") pod \"swift-proxy-78bc5d479f-k79xx\" (UID: \"9190a865-226b-487c-b0f9-2573f50f0eab\") " pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.039165 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9190a865-226b-487c-b0f9-2573f50f0eab-combined-ca-bundle\") pod \"swift-proxy-78bc5d479f-k79xx\" (UID: \"9190a865-226b-487c-b0f9-2573f50f0eab\") " pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.040830 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpx4p\" (UniqueName: \"kubernetes.io/projected/9190a865-226b-487c-b0f9-2573f50f0eab-kube-api-access-fpx4p\") pod \"swift-proxy-78bc5d479f-k79xx\" (UID: \"9190a865-226b-487c-b0f9-2573f50f0eab\") " pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.049578 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9190a865-226b-487c-b0f9-2573f50f0eab-etc-swift\") pod \"swift-proxy-78bc5d479f-k79xx\" (UID: \"9190a865-226b-487c-b0f9-2573f50f0eab\") " pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.153027 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.716340 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-78bc5d479f-k79xx"]
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.847658 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6cc67d5fc8-hs8lf"
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.929708 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-78bc5d479f-k79xx" event={"ID":"9190a865-226b-487c-b0f9-2573f50f0eab","Type":"ContainerStarted","Data":"62eeb2ac06b126e5d4a0047f8650016eb7c2ac3c1f1e63aa5c7a49e0c1bf5ec8"}
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.935752 4810 generic.go:334] "Generic (PLEG): container finished" podID="fb06ec19-dfd7-459f-9bfc-6c3e1619abb0" containerID="4e847a42f9e6fb1b9f7435b4df35d14a3fd82483b8e4fe2e6eb4d640b1450737" exitCode=0
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.935799 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6cc67d5fc8-hs8lf" event={"ID":"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0","Type":"ContainerDied","Data":"4e847a42f9e6fb1b9f7435b4df35d14a3fd82483b8e4fe2e6eb4d640b1450737"}
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.935844 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6cc67d5fc8-hs8lf" event={"ID":"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0","Type":"ContainerDied","Data":"d1b99914aff75d6854dd18c44e4702b0f89c549bb21d23b94afcb4182e4386df"}
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.935865 4810 scope.go:117] "RemoveContainer" containerID="ec03377efcba956dd84e47222d7f2fa8900a14d68a543b851a67f801c749c23f"
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.936076 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6cc67d5fc8-hs8lf"
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.936403 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk7nj\" (UniqueName: \"kubernetes.io/projected/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-kube-api-access-lk7nj\") pod \"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0\" (UID: \"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0\") "
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.936590 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-config\") pod \"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0\" (UID: \"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0\") "
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.936632 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-httpd-config\") pod \"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0\" (UID: \"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0\") "
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.936653 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-combined-ca-bundle\") pod \"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0\" (UID: \"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0\") "
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.936679 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-ovndb-tls-certs\") pod \"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0\" (UID: \"fb06ec19-dfd7-459f-9bfc-6c3e1619abb0\") "
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.944033 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "fb06ec19-dfd7-459f-9bfc-6c3e1619abb0" (UID: "fb06ec19-dfd7-459f-9bfc-6c3e1619abb0"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.947348 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-kube-api-access-lk7nj" (OuterVolumeSpecName: "kube-api-access-lk7nj") pod "fb06ec19-dfd7-459f-9bfc-6c3e1619abb0" (UID: "fb06ec19-dfd7-459f-9bfc-6c3e1619abb0"). InnerVolumeSpecName "kube-api-access-lk7nj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.963891 4810 scope.go:117] "RemoveContainer" containerID="4e847a42f9e6fb1b9f7435b4df35d14a3fd82483b8e4fe2e6eb4d640b1450737"
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.991035 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb06ec19-dfd7-459f-9bfc-6c3e1619abb0" (UID: "fb06ec19-dfd7-459f-9bfc-6c3e1619abb0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:30:16 crc kubenswrapper[4810]: I0219 15:30:16.997582 4810 scope.go:117] "RemoveContainer" containerID="ec03377efcba956dd84e47222d7f2fa8900a14d68a543b851a67f801c749c23f"
Feb 19 15:30:17 crc kubenswrapper[4810]: E0219 15:30:17.001434 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec03377efcba956dd84e47222d7f2fa8900a14d68a543b851a67f801c749c23f\": container with ID starting with ec03377efcba956dd84e47222d7f2fa8900a14d68a543b851a67f801c749c23f not found: ID does not exist" containerID="ec03377efcba956dd84e47222d7f2fa8900a14d68a543b851a67f801c749c23f"
Feb 19 15:30:17 crc kubenswrapper[4810]: I0219 15:30:17.001464 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec03377efcba956dd84e47222d7f2fa8900a14d68a543b851a67f801c749c23f"} err="failed to get container status \"ec03377efcba956dd84e47222d7f2fa8900a14d68a543b851a67f801c749c23f\": rpc error: code = NotFound desc = could not find container \"ec03377efcba956dd84e47222d7f2fa8900a14d68a543b851a67f801c749c23f\": container with ID starting with ec03377efcba956dd84e47222d7f2fa8900a14d68a543b851a67f801c749c23f not found: ID does not exist"
Feb 19 15:30:17 crc kubenswrapper[4810]: I0219 15:30:17.001484 4810 scope.go:117] "RemoveContainer" containerID="4e847a42f9e6fb1b9f7435b4df35d14a3fd82483b8e4fe2e6eb4d640b1450737"
Feb 19 15:30:17 crc kubenswrapper[4810]: E0219 15:30:17.002317 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e847a42f9e6fb1b9f7435b4df35d14a3fd82483b8e4fe2e6eb4d640b1450737\": container with ID starting with 4e847a42f9e6fb1b9f7435b4df35d14a3fd82483b8e4fe2e6eb4d640b1450737 not found: ID does not exist" containerID="4e847a42f9e6fb1b9f7435b4df35d14a3fd82483b8e4fe2e6eb4d640b1450737"
Feb 19 15:30:17 crc kubenswrapper[4810]: I0219 15:30:17.002383 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e847a42f9e6fb1b9f7435b4df35d14a3fd82483b8e4fe2e6eb4d640b1450737"} err="failed to get container status \"4e847a42f9e6fb1b9f7435b4df35d14a3fd82483b8e4fe2e6eb4d640b1450737\": rpc error: code = NotFound desc = could not find container \"4e847a42f9e6fb1b9f7435b4df35d14a3fd82483b8e4fe2e6eb4d640b1450737\": container with ID starting with 4e847a42f9e6fb1b9f7435b4df35d14a3fd82483b8e4fe2e6eb4d640b1450737 not found: ID does not exist"
Feb 19 15:30:17 crc kubenswrapper[4810]: I0219 15:30:17.008990 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-config" (OuterVolumeSpecName: "config") pod "fb06ec19-dfd7-459f-9bfc-6c3e1619abb0" (UID: "fb06ec19-dfd7-459f-9bfc-6c3e1619abb0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:30:17 crc kubenswrapper[4810]: I0219 15:30:17.042597 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk7nj\" (UniqueName: \"kubernetes.io/projected/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-kube-api-access-lk7nj\") on node \"crc\" DevicePath \"\""
Feb 19 15:30:17 crc kubenswrapper[4810]: I0219 15:30:17.042628 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-config\") on node \"crc\" DevicePath \"\""
Feb 19 15:30:17 crc kubenswrapper[4810]: I0219 15:30:17.042639 4810 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-httpd-config\") on node \"crc\" DevicePath \"\""
Feb 19 15:30:17 crc kubenswrapper[4810]: I0219 15:30:17.042649 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 15:30:17 crc kubenswrapper[4810]: I0219 15:30:17.054293 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "fb06ec19-dfd7-459f-9bfc-6c3e1619abb0" (UID: "fb06ec19-dfd7-459f-9bfc-6c3e1619abb0"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:30:17 crc kubenswrapper[4810]: I0219 15:30:17.143958 4810 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 15:30:17 crc kubenswrapper[4810]: I0219 15:30:17.272370 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6cc67d5fc8-hs8lf"]
Feb 19 15:30:17 crc kubenswrapper[4810]: I0219 15:30:17.282439 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6cc67d5fc8-hs8lf"]
Feb 19 15:30:17 crc kubenswrapper[4810]: I0219 15:30:17.457807 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb06ec19-dfd7-459f-9bfc-6c3e1619abb0" path="/var/lib/kubelet/pods/fb06ec19-dfd7-459f-9bfc-6c3e1619abb0/volumes"
Feb 19 15:30:17 crc kubenswrapper[4810]: I0219 15:30:17.947089 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-78bc5d479f-k79xx" event={"ID":"9190a865-226b-487c-b0f9-2573f50f0eab","Type":"ContainerStarted","Data":"f8ed6d381d40b2e3ef03d951a46de99b3c3bafe1807814ddeddf5352cea41872"}
Feb 19 15:30:17 crc kubenswrapper[4810]: I0219 15:30:17.947158 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-78bc5d479f-k79xx" event={"ID":"9190a865-226b-487c-b0f9-2573f50f0eab","Type":"ContainerStarted","Data":"e2fc9c32a96f65c2fbdc2383696138996d9691b72d3892e004a27f696c08ac35"}
Feb 19 15:30:17 crc kubenswrapper[4810]: I0219 15:30:17.948738 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:17 crc kubenswrapper[4810]: I0219 15:30:17.948765 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:17 crc kubenswrapper[4810]: I0219 15:30:17.976240 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-78bc5d479f-k79xx" podStartSLOduration=2.976223467 podStartE2EDuration="2.976223467s" podCreationTimestamp="2026-02-19 15:30:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:30:17.969494382 +0000 UTC m=+1247.451524536" watchObservedRunningTime="2026-02-19 15:30:17.976223467 +0000 UTC m=+1247.458253591"
Feb 19 15:30:18 crc kubenswrapper[4810]: I0219 15:30:18.163230 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-869f57798-ngdtl" podUID="58c845f2-0069-4ee5-9d4b-b5871e078926" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.165:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.165:8443: connect: connection refused"
Feb 19 15:30:18 crc kubenswrapper[4810]: I0219 15:30:18.163714 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-869f57798-ngdtl"
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.753643 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0116ca5_826e_4a77_bc6f_11e89c047af8.slice/crio-4f57c96680174420000630479bd05d2a43d222e27a564e7d19f21df9961dd34c": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0116ca5_826e_4a77_bc6f_11e89c047af8.slice/crio-4f57c96680174420000630479bd05d2a43d222e27a564e7d19f21df9961dd34c: no such file or directory
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.753966 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f9534ee_827a_49fb_8588_a5a8a494be3c.slice/crio-32fbc60f0dab31f943ed4d231a260fb910bacf46dc6517badaa0cc870b972e03": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f9534ee_827a_49fb_8588_a5a8a494be3c.slice/crio-32fbc60f0dab31f943ed4d231a260fb910bacf46dc6517badaa0cc870b972e03: no such file or directory
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.753989 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0116ca5_826e_4a77_bc6f_11e89c047af8.slice/crio-conmon-0f0fdb4ec7d48ce5efa4d4ce421e22663d31ffdf324056a1695cd08d0b02000f.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0116ca5_826e_4a77_bc6f_11e89c047af8.slice/crio-conmon-0f0fdb4ec7d48ce5efa4d4ce421e22663d31ffdf324056a1695cd08d0b02000f.scope: no such file or directory
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.754007 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0b73197_3c7e_44c3_8a49_35d9e0a40629.slice/crio-b18b549a747d3077270169b17318853029444572e9290aac2a4fb288910d92c3": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0b73197_3c7e_44c3_8a49_35d9e0a40629.slice/crio-b18b549a747d3077270169b17318853029444572e9290aac2a4fb288910d92c3: no such file or directory
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.754035 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0116ca5_826e_4a77_bc6f_11e89c047af8.slice/crio-0f0fdb4ec7d48ce5efa4d4ce421e22663d31ffdf324056a1695cd08d0b02000f.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0116ca5_826e_4a77_bc6f_11e89c047af8.slice/crio-0f0fdb4ec7d48ce5efa4d4ce421e22663d31ffdf324056a1695cd08d0b02000f.scope: no such file or directory
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.754054 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f9534ee_827a_49fb_8588_a5a8a494be3c.slice/crio-conmon-77f0991b435da4c5e7f87592511070821c3d2ea7637aa55b69e68f2a981e4095.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f9534ee_827a_49fb_8588_a5a8a494be3c.slice/crio-conmon-77f0991b435da4c5e7f87592511070821c3d2ea7637aa55b69e68f2a981e4095.scope: no such file or directory
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.754073 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f9534ee_827a_49fb_8588_a5a8a494be3c.slice/crio-77f0991b435da4c5e7f87592511070821c3d2ea7637aa55b69e68f2a981e4095.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f9534ee_827a_49fb_8588_a5a8a494be3c.slice/crio-77f0991b435da4c5e7f87592511070821c3d2ea7637aa55b69e68f2a981e4095.scope: no such file or directory
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.757127 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0b73197_3c7e_44c3_8a49_35d9e0a40629.slice/crio-conmon-fc28d9c929a268b7ca57d72d22325d489071902ea7ff07035df3c9d98d6a728a.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0b73197_3c7e_44c3_8a49_35d9e0a40629.slice/crio-conmon-fc28d9c929a268b7ca57d72d22325d489071902ea7ff07035df3c9d98d6a728a.scope: no such file or directory
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.757164 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0b73197_3c7e_44c3_8a49_35d9e0a40629.slice/crio-fc28d9c929a268b7ca57d72d22325d489071902ea7ff07035df3c9d98d6a728a.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0b73197_3c7e_44c3_8a49_35d9e0a40629.slice/crio-fc28d9c929a268b7ca57d72d22325d489071902ea7ff07035df3c9d98d6a728a.scope: no such file or directory
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.757185 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0116ca5_826e_4a77_bc6f_11e89c047af8.slice/crio-conmon-4ac6a4c63292ab530356cb491f1c19b551ba50fa93c7f50d74148f2ff94c41a3.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0116ca5_826e_4a77_bc6f_11e89c047af8.slice/crio-conmon-4ac6a4c63292ab530356cb491f1c19b551ba50fa93c7f50d74148f2ff94c41a3.scope: no such file or directory
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.757200 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0116ca5_826e_4a77_bc6f_11e89c047af8.slice/crio-4ac6a4c63292ab530356cb491f1c19b551ba50fa93c7f50d74148f2ff94c41a3.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0116ca5_826e_4a77_bc6f_11e89c047af8.slice/crio-4ac6a4c63292ab530356cb491f1c19b551ba50fa93c7f50d74148f2ff94c41a3.scope: no such file or directory
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.757220 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0b73197_3c7e_44c3_8a49_35d9e0a40629.slice/crio-conmon-dc4301bc9761619133902901d1997ab38671d0d229f4feca664d5a1fa8e30e7a.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0b73197_3c7e_44c3_8a49_35d9e0a40629.slice/crio-conmon-dc4301bc9761619133902901d1997ab38671d0d229f4feca664d5a1fa8e30e7a.scope: no such file or directory
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.757238 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f9534ee_827a_49fb_8588_a5a8a494be3c.slice/crio-conmon-f7ecab250fb259bafddfe5958b677b33308b00c3f83a99f4416f709cb8747963.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f9534ee_827a_49fb_8588_a5a8a494be3c.slice/crio-conmon-f7ecab250fb259bafddfe5958b677b33308b00c3f83a99f4416f709cb8747963.scope: no such file or directory
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.757258 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0b73197_3c7e_44c3_8a49_35d9e0a40629.slice/crio-dc4301bc9761619133902901d1997ab38671d0d229f4feca664d5a1fa8e30e7a.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0b73197_3c7e_44c3_8a49_35d9e0a40629.slice/crio-dc4301bc9761619133902901d1997ab38671d0d229f4feca664d5a1fa8e30e7a.scope: no such file or directory
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.757276 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f9534ee_827a_49fb_8588_a5a8a494be3c.slice/crio-f7ecab250fb259bafddfe5958b677b33308b00c3f83a99f4416f709cb8747963.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f9534ee_827a_49fb_8588_a5a8a494be3c.slice/crio-f7ecab250fb259bafddfe5958b677b33308b00c3f83a99f4416f709cb8747963.scope: no such file or directory
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.757529 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb06ec19_dfd7_459f_9bfc_6c3e1619abb0.slice/crio-ec03377efcba956dd84e47222d7f2fa8900a14d68a543b851a67f801c749c23f.scope WatchSource:0}: Error finding container ec03377efcba956dd84e47222d7f2fa8900a14d68a543b851a67f801c749c23f: Status 404 returned error can't find the container with id ec03377efcba956dd84e47222d7f2fa8900a14d68a543b851a67f801c749c23f
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.758451 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb06ec19_dfd7_459f_9bfc_6c3e1619abb0.slice/crio-4e847a42f9e6fb1b9f7435b4df35d14a3fd82483b8e4fe2e6eb4d640b1450737.scope WatchSource:0}: Error finding container 4e847a42f9e6fb1b9f7435b4df35d14a3fd82483b8e4fe2e6eb4d640b1450737: Status 404 returned error can't find the container with id 4e847a42f9e6fb1b9f7435b4df35d14a3fd82483b8e4fe2e6eb4d640b1450737
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.758672 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod848dfe9d_05f4_4ba9_919e_23e9a7ae63d5.slice/crio-conmon-61fee9f5cc97dc9164d9a8b37259645ec27704b544f8031e79cd8630294aa448.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod848dfe9d_05f4_4ba9_919e_23e9a7ae63d5.slice/crio-conmon-61fee9f5cc97dc9164d9a8b37259645ec27704b544f8031e79cd8630294aa448.scope: no such file or directory
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.758705 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod848dfe9d_05f4_4ba9_919e_23e9a7ae63d5.slice/crio-61fee9f5cc97dc9164d9a8b37259645ec27704b544f8031e79cd8630294aa448.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod848dfe9d_05f4_4ba9_919e_23e9a7ae63d5.slice/crio-61fee9f5cc97dc9164d9a8b37259645ec27704b544f8031e79cd8630294aa448.scope: no such file or directory
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.786154 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1f44651_e4eb_4cce_a493_9dd9b491b22a.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1f44651_e4eb_4cce_a493_9dd9b491b22a.slice: no such file or directory
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.811061 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfaf01cf3_b74b_46d8_b589_05ea0195ac24.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfaf01cf3_b74b_46d8_b589_05ea0195ac24.slice: no such file or directory
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.811455 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8cd44d0_7395_44e1_9112_9e8bb4198b93.slice/crio-conmon-f669c7fcc5250fb7338c38806127e385c8e63f008e071b7e746e08ad63a54a09.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8cd44d0_7395_44e1_9112_9e8bb4198b93.slice/crio-conmon-f669c7fcc5250fb7338c38806127e385c8e63f008e071b7e746e08ad63a54a09.scope: no such file or directory
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.811546 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8cd44d0_7395_44e1_9112_9e8bb4198b93.slice/crio-f669c7fcc5250fb7338c38806127e385c8e63f008e071b7e746e08ad63a54a09.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8cd44d0_7395_44e1_9112_9e8bb4198b93.slice/crio-f669c7fcc5250fb7338c38806127e385c8e63f008e071b7e746e08ad63a54a09.scope: no such file or directory
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.812301 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3eb2dccd_c5dc_436f_b7a6_954af7bc51c5.slice/crio-conmon-a8232fee767eaee3fa026223e20673e25fa6d954ddb913e148bf6e0e435de416.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3eb2dccd_c5dc_436f_b7a6_954af7bc51c5.slice/crio-conmon-a8232fee767eaee3fa026223e20673e25fa6d954ddb913e148bf6e0e435de416.scope: no such file or directory
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.812582 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3eb2dccd_c5dc_436f_b7a6_954af7bc51c5.slice/crio-a8232fee767eaee3fa026223e20673e25fa6d954ddb913e148bf6e0e435de416.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3eb2dccd_c5dc_436f_b7a6_954af7bc51c5.slice/crio-a8232fee767eaee3fa026223e20673e25fa6d954ddb913e148bf6e0e435de416.scope: no such file or directory
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.815398 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3bd9969_3750_460b_95cd_8c52d2e44d82.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3bd9969_3750_460b_95cd_8c52d2e44d82.slice: no such file or directory
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.825150 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5db557a_0b89_4a02_b1b2_19bc205acee8.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5db557a_0b89_4a02_b1b2_19bc205acee8.slice: no such file or directory
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.838160 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40500a46_a16b_4282_86e4_1d99277d7c7a.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40500a46_a16b_4282_86e4_1d99277d7c7a.slice: no such file or directory
Feb 19 15:30:18 crc kubenswrapper[4810]: W0219 15:30:18.848260 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa26ff13_0dc1_4e6b_a9a8_80177afa17af.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa26ff13_0dc1_4e6b_a9a8_80177afa17af.slice: no such file or directory
Feb 19 15:30:18 crc kubenswrapper[4810]: I0219 15:30:18.849231 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 19 15:30:18 crc kubenswrapper[4810]: I0219 15:30:18.956804 4810 generic.go:334] "Generic (PLEG): container finished" podID="b8cd44d0-7395-44e1-9112-9e8bb4198b93" containerID="f669c7fcc5250fb7338c38806127e385c8e63f008e071b7e746e08ad63a54a09" exitCode=137
Feb 19 15:30:18 crc kubenswrapper[4810]: I0219 15:30:18.957782 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8cd44d0-7395-44e1-9112-9e8bb4198b93","Type":"ContainerDied","Data":"f669c7fcc5250fb7338c38806127e385c8e63f008e071b7e746e08ad63a54a09"}
Feb 19 15:30:19 crc kubenswrapper[4810]: E0219 15:30:19.040206 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb06ec19_dfd7_459f_9bfc_6c3e1619abb0.slice/crio-conmon-ec03377efcba956dd84e47222d7f2fa8900a14d68a543b851a67f801c749c23f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3eb2dccd_c5dc_436f_b7a6_954af7bc51c5.slice/crio-conmon-ea64343d53131969232078700626ee5996ba47b778be09a59da7ab42c9aa2631.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08eca88c_a4b4_461b_8568_ebbf54645272.slice/crio-cc512cef3a5d05465ad16d8e6be38874fe910640b0911d914b7146b795d387d4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc39b1dd9_9e73_4cca_aea6_e228f1ba5942.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08eca88c_a4b4_461b_8568_ebbf54645272.slice/crio-conmon-11bb73290ec186744ef4e88375d87c281860032daa137cc8bd4779ab70117f2b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2024a783_c3f9_4e57_b00f_52bec164e64e.slice/crio-conmon-ad28bfaa41efd8e4e6c465f81c081a0451a00386412156a5267acdd97840a40b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd961c7d_d551_4f5b_a08a_07d088947698.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2024a783_c3f9_4e57_b00f_52bec164e64e.slice/crio-ad28bfaa41efd8e4e6c465f81c081a0451a00386412156a5267acdd97840a40b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3eb2dccd_c5dc_436f_b7a6_954af7bc51c5.slice/crio-ea64343d53131969232078700626ee5996ba47b778be09a59da7ab42c9aa2631.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8cd44d0_7395_44e1_9112_9e8bb4198b93.slice/crio-conmon-48bd8312dc5f2e91c1a4d6b015bb83960b232d7ff3a764add13cbac66bd0441f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18ca6546_69fd_492d_81c5_bb18c56b045d.slice/crio-conmon-acf279e70fd332fdcdd2bf83f0303bb99e19cd10482ff8ab44a134fa747add8b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc39b1dd9_9e73_4cca_aea6_e228f1ba5942.slice/crio-conmon-267e2a2aa6b67d8303b6404df33bfa4941b4f403604fb949cf5ec932e82ab1b7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb06ec19_dfd7_459f_9bfc_6c3e1619abb0.slice/crio-d1b99914aff75d6854dd18c44e4702b0f89c549bb21d23b94afcb4182e4386df\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f9534ee_827a_49fb_8588_a5a8a494be3c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8cd44d0_7395_44e1_9112_9e8bb4198b93.slice/crio-conmon-1b6e83705ca6e6c238d2cc26ae7440d08d4c7c41779dea1e88d4da0c7c6c4ca7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8cd44d0_7395_44e1_9112_9e8bb4198b93.slice/crio-48bd8312dc5f2e91c1a4d6b015bb83960b232d7ff3a764add13cbac66bd0441f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc39b1dd9_9e73_4cca_aea6_e228f1ba5942.slice/crio-267e2a2aa6b67d8303b6404df33bfa4941b4f403604fb949cf5ec932e82ab1b7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc39b1dd9_9e73_4cca_aea6_e228f1ba5942.slice/crio-conmon-09ef2e9038e53c0d3694b2f3cb1b25543038065797746360e5b00060a157df5b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2024a783_c3f9_4e57_b00f_52bec164e64e.slice/crio-398a9dcb2e74ac56cd2827ea038790abeb16b5f6b3573a48b306842a166c3f44\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0b73197_3c7e_44c3_8a49_35d9e0a40629.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc39b1dd9_9e73_4cca_aea6_e228f1ba5942.slice/crio-f7e61fd52ad6569907f8a84cbc32aef547486ffda35028466938ada8d5e3aa10\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb06ec19_dfd7_459f_9bfc_6c3e1619abb0.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18ca6546_69fd_492d_81c5_bb18c56b045d.slice/crio-acf279e70fd332fdcdd2bf83f0303bb99e19cd10482ff8ab44a134fa747add8b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08eca88c_a4b4_461b_8568_ebbf54645272.slice/crio-11bb73290ec186744ef4e88375d87c281860032daa137cc8bd4779ab70117f2b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0116ca5_826e_4a77_bc6f_11e89c047af8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18ca6546_69fd_492d_81c5_bb18c56b045d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd961c7d_d551_4f5b_a08a_07d088947698.slice/crio-3ca7fe1f4f8bad9a3a06d89d4141ff28e32b10f6d3445d4b3f2404b6e71c942f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08eca88c_a4b4_461b_8568_ebbf54645272.slice/crio-conmon-cc512cef3a5d05465ad16d8e6be38874fe910640b0911d914b7146b795d387d4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8cd44d0_7395_44e1_9112_9e8bb4198b93.slice/crio-1b6e83705ca6e6c238d2cc26ae7440d08d4c7c41779dea1e88d4da0c7c6c4ca7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod848dfe9d_05f4_4ba9_919e_23e9a7ae63d5.slice/crio-ce811eb96390a8f8365be6dee0926b85f5365cb868957ed22165ebbf7d343712\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08eca88c_a4b4_461b_8568_ebbf54645272.slice/crio-b936d9f6773d6dc856145fa11df03aba298425ab6b8a943cdd257e7a943da84e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd961c7d_d551_4f5b_a08a_07d088947698.slice/crio-conmon-aa9a5e6b8de561c312023bc0224fd25e600c4ac446d6ec5ee19e031c464523e1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb06ec19_dfd7_459f_9bfc_6c3e1619abb0.slice/crio-conmon-4e847a42f9e6fb1b9f7435b4df35d14a3fd82483b8e4fe2e6eb4d640b1450737.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd961c7d_d551_4f5b_a08a_07d088947698.slice/crio-aa9a5e6b8de561c312023bc0224fd25e600c4ac446d6ec5ee19e031c464523e1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18ca6546_69fd_492d_81c5_bb18c56b045d.slice/crio-3e6a529b000841e709c2e1d05c4d119a28c8f20c2ece39574181d8df78a6c626\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08eca88c_a4b4_461b_8568_ebbf54645272.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2024a783_c3f9_4e57_b00f_52bec164e64e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc39b1dd9_9e73_4cca_aea6_e228f1ba5942.slice/crio-09ef2e9038e53c0d3694b2f3cb1b25543038065797746360e5b00060a157df5b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58c845f2_0069_4ee5_9d4b_b5871e078926.slice/crio-5ab1bae28a55f588686fefd9b6e6ee98c22d6657796a662570ae5cd62319bd13.scope\": RecentStats: unable to find data in memory cache]"
Feb 19 15:30:19 crc kubenswrapper[4810]: I0219 15:30:19.125184 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Feb 19 15:30:19 crc kubenswrapper[4810]: I0219 15:30:19.125224 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Feb 19 15:30:19 crc kubenswrapper[4810]: I0219 15:30:19.125926 4810 scope.go:117] "RemoveContainer" containerID="a8232fee767eaee3fa026223e20673e25fa6d954ddb913e148bf6e0e435de416"
Feb 19 15:30:20 crc kubenswrapper[4810]: I0219 15:30:20.183453 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0"
Feb 19 15:30:20 crc kubenswrapper[4810]: I0219 15:30:20.194397 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0"
Feb 19 15:30:20 crc kubenswrapper[4810]: I0219 15:30:20.288863 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="b8cd44d0-7395-44e1-9112-9e8bb4198b93" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.163:3000/\": dial tcp 10.217.0.163:3000: connect: connection refused"
Feb 19 15:30:21 crc kubenswrapper[4810]: I0219 15:30:21.156466 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Feb 19 15:30:21 crc kubenswrapper[4810]: I0219 15:30:21.165449 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-78bc5d479f-k79xx"
Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.051924 4810 generic.go:334] "Generic (PLEG): container finished" podID="58c845f2-0069-4ee5-9d4b-b5871e078926" containerID="b7473c6c07a1c77d67ffe62af3e5c262ab61dca816caf8aab0acb14dc5b23ebd" exitCode=137
Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.052100 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-869f57798-ngdtl" event={"ID":"58c845f2-0069-4ee5-9d4b-b5871e078926","Type":"ContainerDied","Data":"b7473c6c07a1c77d67ffe62af3e5c262ab61dca816caf8aab0acb14dc5b23ebd"}
Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.762298 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-869f57798-ngdtl"
Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.797915 4810 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.903480 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58c845f2-0069-4ee5-9d4b-b5871e078926-combined-ca-bundle\") pod \"58c845f2-0069-4ee5-9d4b-b5871e078926\" (UID: \"58c845f2-0069-4ee5-9d4b-b5871e078926\") " Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.903569 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/58c845f2-0069-4ee5-9d4b-b5871e078926-horizon-tls-certs\") pod \"58c845f2-0069-4ee5-9d4b-b5871e078926\" (UID: \"58c845f2-0069-4ee5-9d4b-b5871e078926\") " Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.903607 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jd77w\" (UniqueName: \"kubernetes.io/projected/b8cd44d0-7395-44e1-9112-9e8bb4198b93-kube-api-access-jd77w\") pod \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.903635 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8cd44d0-7395-44e1-9112-9e8bb4198b93-sg-core-conf-yaml\") pod \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.903759 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8cd44d0-7395-44e1-9112-9e8bb4198b93-run-httpd\") pod \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.903790 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8cd44d0-7395-44e1-9112-9e8bb4198b93-log-httpd\") pod \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.903841 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58c845f2-0069-4ee5-9d4b-b5871e078926-config-data\") pod \"58c845f2-0069-4ee5-9d4b-b5871e078926\" (UID: \"58c845f2-0069-4ee5-9d4b-b5871e078926\") " Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.903875 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8cd44d0-7395-44e1-9112-9e8bb4198b93-combined-ca-bundle\") pod \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.903894 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58c845f2-0069-4ee5-9d4b-b5871e078926-logs\") pod \"58c845f2-0069-4ee5-9d4b-b5871e078926\" (UID: \"58c845f2-0069-4ee5-9d4b-b5871e078926\") " Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.903920 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/58c845f2-0069-4ee5-9d4b-b5871e078926-horizon-secret-key\") pod \"58c845f2-0069-4ee5-9d4b-b5871e078926\" (UID: 
\"58c845f2-0069-4ee5-9d4b-b5871e078926\") " Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.903988 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8cd44d0-7395-44e1-9112-9e8bb4198b93-scripts\") pod \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.904048 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8cd44d0-7395-44e1-9112-9e8bb4198b93-config-data\") pod \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\" (UID: \"b8cd44d0-7395-44e1-9112-9e8bb4198b93\") " Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.904089 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/58c845f2-0069-4ee5-9d4b-b5871e078926-scripts\") pod \"58c845f2-0069-4ee5-9d4b-b5871e078926\" (UID: \"58c845f2-0069-4ee5-9d4b-b5871e078926\") " Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.904108 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjb4j\" (UniqueName: \"kubernetes.io/projected/58c845f2-0069-4ee5-9d4b-b5871e078926-kube-api-access-zjb4j\") pod \"58c845f2-0069-4ee5-9d4b-b5871e078926\" (UID: \"58c845f2-0069-4ee5-9d4b-b5871e078926\") " Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.904426 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58c845f2-0069-4ee5-9d4b-b5871e078926-logs" (OuterVolumeSpecName: "logs") pod "58c845f2-0069-4ee5-9d4b-b5871e078926" (UID: "58c845f2-0069-4ee5-9d4b-b5871e078926"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.904463 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8cd44d0-7395-44e1-9112-9e8bb4198b93-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b8cd44d0-7395-44e1-9112-9e8bb4198b93" (UID: "b8cd44d0-7395-44e1-9112-9e8bb4198b93"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.904614 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8cd44d0-7395-44e1-9112-9e8bb4198b93-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b8cd44d0-7395-44e1-9112-9e8bb4198b93" (UID: "b8cd44d0-7395-44e1-9112-9e8bb4198b93"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.904923 4810 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8cd44d0-7395-44e1-9112-9e8bb4198b93-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.904941 4810 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8cd44d0-7395-44e1-9112-9e8bb4198b93-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.904955 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58c845f2-0069-4ee5-9d4b-b5871e078926-logs\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.913686 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58c845f2-0069-4ee5-9d4b-b5871e078926-kube-api-access-zjb4j" (OuterVolumeSpecName: "kube-api-access-zjb4j") pod "58c845f2-0069-4ee5-9d4b-b5871e078926" (UID: "58c845f2-0069-4ee5-9d4b-b5871e078926"). InnerVolumeSpecName "kube-api-access-zjb4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.913887 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8cd44d0-7395-44e1-9112-9e8bb4198b93-kube-api-access-jd77w" (OuterVolumeSpecName: "kube-api-access-jd77w") pod "b8cd44d0-7395-44e1-9112-9e8bb4198b93" (UID: "b8cd44d0-7395-44e1-9112-9e8bb4198b93"). InnerVolumeSpecName "kube-api-access-jd77w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.915990 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8cd44d0-7395-44e1-9112-9e8bb4198b93-scripts" (OuterVolumeSpecName: "scripts") pod "b8cd44d0-7395-44e1-9112-9e8bb4198b93" (UID: "b8cd44d0-7395-44e1-9112-9e8bb4198b93"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.916534 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58c845f2-0069-4ee5-9d4b-b5871e078926-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "58c845f2-0069-4ee5-9d4b-b5871e078926" (UID: "58c845f2-0069-4ee5-9d4b-b5871e078926"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.940102 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58c845f2-0069-4ee5-9d4b-b5871e078926-scripts" (OuterVolumeSpecName: "scripts") pod "58c845f2-0069-4ee5-9d4b-b5871e078926" (UID: "58c845f2-0069-4ee5-9d4b-b5871e078926"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.941701 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8cd44d0-7395-44e1-9112-9e8bb4198b93-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b8cd44d0-7395-44e1-9112-9e8bb4198b93" (UID: "b8cd44d0-7395-44e1-9112-9e8bb4198b93"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.948621 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58c845f2-0069-4ee5-9d4b-b5871e078926-config-data" (OuterVolumeSpecName: "config-data") pod "58c845f2-0069-4ee5-9d4b-b5871e078926" (UID: "58c845f2-0069-4ee5-9d4b-b5871e078926"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.951050 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58c845f2-0069-4ee5-9d4b-b5871e078926-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58c845f2-0069-4ee5-9d4b-b5871e078926" (UID: "58c845f2-0069-4ee5-9d4b-b5871e078926"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.970297 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58c845f2-0069-4ee5-9d4b-b5871e078926-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "58c845f2-0069-4ee5-9d4b-b5871e078926" (UID: "58c845f2-0069-4ee5-9d4b-b5871e078926"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:25 crc kubenswrapper[4810]: I0219 15:30:25.999688 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8cd44d0-7395-44e1-9112-9e8bb4198b93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8cd44d0-7395-44e1-9112-9e8bb4198b93" (UID: "b8cd44d0-7395-44e1-9112-9e8bb4198b93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.007379 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/58c845f2-0069-4ee5-9d4b-b5871e078926-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.007418 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjb4j\" (UniqueName: \"kubernetes.io/projected/58c845f2-0069-4ee5-9d4b-b5871e078926-kube-api-access-zjb4j\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.007435 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58c845f2-0069-4ee5-9d4b-b5871e078926-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.007448 4810 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/58c845f2-0069-4ee5-9d4b-b5871e078926-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.007464 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jd77w\" (UniqueName: \"kubernetes.io/projected/b8cd44d0-7395-44e1-9112-9e8bb4198b93-kube-api-access-jd77w\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.007474 4810 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8cd44d0-7395-44e1-9112-9e8bb4198b93-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.007486 4810 reconciler_common.go:293] 
"Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58c845f2-0069-4ee5-9d4b-b5871e078926-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.007497 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8cd44d0-7395-44e1-9112-9e8bb4198b93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.007509 4810 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/58c845f2-0069-4ee5-9d4b-b5871e078926-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.007520 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8cd44d0-7395-44e1-9112-9e8bb4198b93-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.010753 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8cd44d0-7395-44e1-9112-9e8bb4198b93-config-data" (OuterVolumeSpecName: "config-data") pod "b8cd44d0-7395-44e1-9112-9e8bb4198b93" (UID: "b8cd44d0-7395-44e1-9112-9e8bb4198b93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.064859 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-869f57798-ngdtl" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.064853 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-869f57798-ngdtl" event={"ID":"58c845f2-0069-4ee5-9d4b-b5871e078926","Type":"ContainerDied","Data":"dd8cab14dea5221ed6bc57de9b6e6053cd08e7d2f18677d44feb73bc0f3396df"} Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.065229 4810 scope.go:117] "RemoveContainer" containerID="5ab1bae28a55f588686fefd9b6e6ee98c22d6657796a662570ae5cd62319bd13" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.067032 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ca8eb29b-bb26-446f-8a22-5da13ff9d5fa","Type":"ContainerStarted","Data":"4d94f7f4346decf2db2e822e04fe90605e7ee0d72e5a98985a765bf82968ebc4"} Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.069432 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5","Type":"ContainerStarted","Data":"4e69a7ec2d6ffef7e8de5181b7cb6f418ce564ddea149f05d15804b56bd3283e"} Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.082952 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8cd44d0-7395-44e1-9112-9e8bb4198b93","Type":"ContainerDied","Data":"89e242fd2781af0141617d648aab74ae173b61b87fc7bb8cdef6002ffaed43fc"} Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.083202 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.101805 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.912537385 podStartE2EDuration="15.101785245s" podCreationTimestamp="2026-02-19 15:30:11 +0000 UTC" firstStartedPulling="2026-02-19 15:30:12.574213475 +0000 UTC m=+1242.056243599" lastFinishedPulling="2026-02-19 15:30:25.763461335 +0000 UTC m=+1255.245491459" observedRunningTime="2026-02-19 15:30:26.09139356 +0000 UTC m=+1255.573423704" watchObservedRunningTime="2026-02-19 15:30:26.101785245 +0000 UTC m=+1255.583815389" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.109820 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8cd44d0-7395-44e1-9112-9e8bb4198b93-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.200000 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-78bc5d479f-k79xx" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.215118 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.241471 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.254469 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-869f57798-ngdtl"] Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.259957 4810 scope.go:117] "RemoveContainer" containerID="b7473c6c07a1c77d67ffe62af3e5c262ab61dca816caf8aab0acb14dc5b23ebd" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.297660 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-869f57798-ngdtl"] Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.307990 4810 scope.go:117] "RemoveContainer" containerID="f669c7fcc5250fb7338c38806127e385c8e63f008e071b7e746e08ad63a54a09" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.346456 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:30:26 crc kubenswrapper[4810]: E0219 15:30:26.347751 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58c845f2-0069-4ee5-9d4b-b5871e078926" containerName="horizon-log" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.347775 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="58c845f2-0069-4ee5-9d4b-b5871e078926" containerName="horizon-log" Feb 19 15:30:26 crc kubenswrapper[4810]: E0219 15:30:26.347817 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8cd44d0-7395-44e1-9112-9e8bb4198b93" containerName="ceilometer-notification-agent" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.347829 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8cd44d0-7395-44e1-9112-9e8bb4198b93" containerName="ceilometer-notification-agent" Feb 19 15:30:26 crc kubenswrapper[4810]: E0219 15:30:26.347844 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8cd44d0-7395-44e1-9112-9e8bb4198b93" containerName="sg-core" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.347851 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8cd44d0-7395-44e1-9112-9e8bb4198b93" containerName="sg-core" Feb 19 15:30:26 crc kubenswrapper[4810]: E0219 15:30:26.347862 4810 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b8cd44d0-7395-44e1-9112-9e8bb4198b93" containerName="proxy-httpd" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.347903 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8cd44d0-7395-44e1-9112-9e8bb4198b93" containerName="proxy-httpd" Feb 19 15:30:26 crc kubenswrapper[4810]: E0219 15:30:26.347922 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb06ec19-dfd7-459f-9bfc-6c3e1619abb0" containerName="neutron-api" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.347929 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb06ec19-dfd7-459f-9bfc-6c3e1619abb0" containerName="neutron-api" Feb 19 15:30:26 crc kubenswrapper[4810]: E0219 15:30:26.347944 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb06ec19-dfd7-459f-9bfc-6c3e1619abb0" containerName="neutron-httpd" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.358228 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb06ec19-dfd7-459f-9bfc-6c3e1619abb0" containerName="neutron-httpd" Feb 19 15:30:26 crc kubenswrapper[4810]: E0219 15:30:26.358296 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58c845f2-0069-4ee5-9d4b-b5871e078926" containerName="horizon" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.358306 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="58c845f2-0069-4ee5-9d4b-b5871e078926" containerName="horizon" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.358685 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8cd44d0-7395-44e1-9112-9e8bb4198b93" containerName="proxy-httpd" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.358713 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8cd44d0-7395-44e1-9112-9e8bb4198b93" containerName="ceilometer-notification-agent" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.358735 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8cd44d0-7395-44e1-9112-9e8bb4198b93" containerName="sg-core" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.358747 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb06ec19-dfd7-459f-9bfc-6c3e1619abb0" containerName="neutron-httpd" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.358767 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb06ec19-dfd7-459f-9bfc-6c3e1619abb0" containerName="neutron-api" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.358780 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="58c845f2-0069-4ee5-9d4b-b5871e078926" containerName="horizon" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.358797 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="58c845f2-0069-4ee5-9d4b-b5871e078926" containerName="horizon-log" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.354417 4810 scope.go:117] "RemoveContainer" containerID="1b6e83705ca6e6c238d2cc26ae7440d08d4c7c41779dea1e88d4da0c7c6c4ca7" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.370586 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.370690 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.378010 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.378250 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.392261 4810 scope.go:117] "RemoveContainer" containerID="48bd8312dc5f2e91c1a4d6b015bb83960b232d7ff3a764add13cbac66bd0441f" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.520509 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e03c99b3-d5d6-479a-9b45-045bda62be1e-log-httpd\") pod \"ceilometer-0\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " pod="openstack/ceilometer-0" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.520567 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e03c99b3-d5d6-479a-9b45-045bda62be1e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " pod="openstack/ceilometer-0" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.520593 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e03c99b3-d5d6-479a-9b45-045bda62be1e-config-data\") pod \"ceilometer-0\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " pod="openstack/ceilometer-0" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.520679 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e03c99b3-d5d6-479a-9b45-045bda62be1e-scripts\") pod \"ceilometer-0\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " pod="openstack/ceilometer-0" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.520744 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e03c99b3-d5d6-479a-9b45-045bda62be1e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " pod="openstack/ceilometer-0" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.520794 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e03c99b3-d5d6-479a-9b45-045bda62be1e-run-httpd\") pod \"ceilometer-0\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " pod="openstack/ceilometer-0" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.520847 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vpjk\" (UniqueName: \"kubernetes.io/projected/e03c99b3-d5d6-479a-9b45-045bda62be1e-kube-api-access-8vpjk\") pod \"ceilometer-0\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " pod="openstack/ceilometer-0" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.622282 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e03c99b3-d5d6-479a-9b45-045bda62be1e-scripts\") pod \"ceilometer-0\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " pod="openstack/ceilometer-0" Feb 19 15:30:26 crc kubenswrapper[4810]: 
I0219 15:30:26.622378 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e03c99b3-d5d6-479a-9b45-045bda62be1e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " pod="openstack/ceilometer-0" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.622412 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e03c99b3-d5d6-479a-9b45-045bda62be1e-run-httpd\") pod \"ceilometer-0\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " pod="openstack/ceilometer-0" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.622460 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vpjk\" (UniqueName: \"kubernetes.io/projected/e03c99b3-d5d6-479a-9b45-045bda62be1e-kube-api-access-8vpjk\") pod \"ceilometer-0\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " pod="openstack/ceilometer-0" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.622541 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e03c99b3-d5d6-479a-9b45-045bda62be1e-log-httpd\") pod \"ceilometer-0\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " pod="openstack/ceilometer-0" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.622571 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e03c99b3-d5d6-479a-9b45-045bda62be1e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " pod="openstack/ceilometer-0" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.622586 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e03c99b3-d5d6-479a-9b45-045bda62be1e-config-data\") pod \"ceilometer-0\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " pod="openstack/ceilometer-0" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.622892 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e03c99b3-d5d6-479a-9b45-045bda62be1e-run-httpd\") pod \"ceilometer-0\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " pod="openstack/ceilometer-0" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.623116 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e03c99b3-d5d6-479a-9b45-045bda62be1e-log-httpd\") pod \"ceilometer-0\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " pod="openstack/ceilometer-0" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.627187 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e03c99b3-d5d6-479a-9b45-045bda62be1e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " pod="openstack/ceilometer-0" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.627251 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e03c99b3-d5d6-479a-9b45-045bda62be1e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " pod="openstack/ceilometer-0" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.628183 4810 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e03c99b3-d5d6-479a-9b45-045bda62be1e-config-data\") pod \"ceilometer-0\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " pod="openstack/ceilometer-0" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.634442 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e03c99b3-d5d6-479a-9b45-045bda62be1e-scripts\") pod \"ceilometer-0\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " pod="openstack/ceilometer-0" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.640427 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vpjk\" (UniqueName: \"kubernetes.io/projected/e03c99b3-d5d6-479a-9b45-045bda62be1e-kube-api-access-8vpjk\") pod \"ceilometer-0\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " pod="openstack/ceilometer-0" Feb 19 15:30:26 crc kubenswrapper[4810]: I0219 15:30:26.714700 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 15:30:27 crc kubenswrapper[4810]: I0219 15:30:27.165573 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:30:27 crc kubenswrapper[4810]: W0219 15:30:27.170761 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode03c99b3_d5d6_479a_9b45_045bda62be1e.slice/crio-28190f8c14a28d468a745076cbf90c346864c9199f363697f039ea4c27bb2c7f WatchSource:0}: Error finding container 28190f8c14a28d468a745076cbf90c346864c9199f363697f039ea4c27bb2c7f: Status 404 returned error can't find the container with id 28190f8c14a28d468a745076cbf90c346864c9199f363697f039ea4c27bb2c7f Feb 19 15:30:27 crc kubenswrapper[4810]: I0219 15:30:27.481547 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58c845f2-0069-4ee5-9d4b-b5871e078926" path="/var/lib/kubelet/pods/58c845f2-0069-4ee5-9d4b-b5871e078926/volumes" Feb 19 15:30:27 crc kubenswrapper[4810]: I0219 15:30:27.484471 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8cd44d0-7395-44e1-9112-9e8bb4198b93" path="/var/lib/kubelet/pods/b8cd44d0-7395-44e1-9112-9e8bb4198b93/volumes" Feb 19 15:30:28 crc kubenswrapper[4810]: I0219 15:30:28.104956 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e03c99b3-d5d6-479a-9b45-045bda62be1e","Type":"ContainerStarted","Data":"2d5f4181bcf65b08449001da5a939c392bf7453a34f071c7f2e2b29a55dbc3c9"} Feb 19 15:30:28 crc kubenswrapper[4810]: I0219 15:30:28.105258 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e03c99b3-d5d6-479a-9b45-045bda62be1e","Type":"ContainerStarted","Data":"922169bd474755c022d1869b56b5508937a05f190fe4b150d2505095771f9d93"} Feb 19 15:30:28 crc kubenswrapper[4810]: I0219 15:30:28.105268 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e03c99b3-d5d6-479a-9b45-045bda62be1e","Type":"ContainerStarted","Data":"28190f8c14a28d468a745076cbf90c346864c9199f363697f039ea4c27bb2c7f"} Feb 19 15:30:29 crc kubenswrapper[4810]: I0219 15:30:29.001679 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:30:29 crc kubenswrapper[4810]: I0219 15:30:29.115198 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e03c99b3-d5d6-479a-9b45-045bda62be1e","Type":"ContainerStarted","Data":"75f4bc2020206ce4f5c18d99d5247f762fe2e1e5470498d43115d0a1b1bc5184"} Feb 19 15:30:29 crc kubenswrapper[4810]: I0219 15:30:29.124718 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 19 15:30:29 crc kubenswrapper[4810]: I0219 15:30:29.160748 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Feb 19 15:30:29 crc kubenswrapper[4810]: I0219 15:30:29.743286 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-8ftxl"] Feb 19 15:30:29 crc kubenswrapper[4810]: I0219 15:30:29.744420 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-8ftxl" Feb 19 15:30:29 crc kubenswrapper[4810]: I0219 15:30:29.764779 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-8ftxl"] Feb 19 15:30:29 crc kubenswrapper[4810]: I0219 15:30:29.844301 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-nxd5j"] Feb 19 15:30:29 crc kubenswrapper[4810]: I0219 15:30:29.845539 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-nxd5j" Feb 19 15:30:29 crc kubenswrapper[4810]: I0219 15:30:29.859389 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-nxd5j"] Feb 19 15:30:29 crc kubenswrapper[4810]: I0219 15:30:29.881723 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da5e0166-d811-4dcd-9230-976dd1893c11-operator-scripts\") pod \"nova-api-db-create-8ftxl\" (UID: \"da5e0166-d811-4dcd-9230-976dd1893c11\") " pod="openstack/nova-api-db-create-8ftxl" Feb 19 15:30:29 crc kubenswrapper[4810]: I0219 15:30:29.881794 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxxfl\" (UniqueName: \"kubernetes.io/projected/da5e0166-d811-4dcd-9230-976dd1893c11-kube-api-access-sxxfl\") pod \"nova-api-db-create-8ftxl\" (UID: \"da5e0166-d811-4dcd-9230-976dd1893c11\") " pod="openstack/nova-api-db-create-8ftxl" Feb 19 15:30:29 crc kubenswrapper[4810]: I0219 15:30:29.973753 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-6e67-account-create-update-lk6cv"] Feb 19 15:30:29 crc kubenswrapper[4810]: I0219 15:30:29.975271 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-6e67-account-create-update-lk6cv" Feb 19 15:30:29 crc kubenswrapper[4810]: I0219 15:30:29.980632 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 19 15:30:29 crc kubenswrapper[4810]: I0219 15:30:29.983662 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da5e0166-d811-4dcd-9230-976dd1893c11-operator-scripts\") pod \"nova-api-db-create-8ftxl\" (UID: \"da5e0166-d811-4dcd-9230-976dd1893c11\") " pod="openstack/nova-api-db-create-8ftxl" Feb 19 15:30:29 crc kubenswrapper[4810]: I0219 15:30:29.983719 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f04e1699-2be0-4dca-8e4a-73035fde359f-operator-scripts\") pod \"nova-cell0-db-create-nxd5j\" (UID: \"f04e1699-2be0-4dca-8e4a-73035fde359f\") " pod="openstack/nova-cell0-db-create-nxd5j" Feb 19 15:30:29 crc kubenswrapper[4810]: I0219 15:30:29.983776 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxxfl\" (UniqueName: \"kubernetes.io/projected/da5e0166-d811-4dcd-9230-976dd1893c11-kube-api-access-sxxfl\") pod \"nova-api-db-create-8ftxl\" (UID: \"da5e0166-d811-4dcd-9230-976dd1893c11\") " pod="openstack/nova-api-db-create-8ftxl" Feb 19 15:30:29 crc kubenswrapper[4810]: I0219 15:30:29.983838 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-254j4\" (UniqueName: \"kubernetes.io/projected/f04e1699-2be0-4dca-8e4a-73035fde359f-kube-api-access-254j4\") pod \"nova-cell0-db-create-nxd5j\" (UID: \"f04e1699-2be0-4dca-8e4a-73035fde359f\") " pod="openstack/nova-cell0-db-create-nxd5j" Feb 19 15:30:29 crc kubenswrapper[4810]: I0219 15:30:29.984680 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da5e0166-d811-4dcd-9230-976dd1893c11-operator-scripts\") pod \"nova-api-db-create-8ftxl\" (UID: \"da5e0166-d811-4dcd-9230-976dd1893c11\") " pod="openstack/nova-api-db-create-8ftxl" Feb 19 15:30:29 crc kubenswrapper[4810]: I0219 15:30:29.985702 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6e67-account-create-update-lk6cv"] Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.047247 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxxfl\" (UniqueName: \"kubernetes.io/projected/da5e0166-d811-4dcd-9230-976dd1893c11-kube-api-access-sxxfl\") pod \"nova-api-db-create-8ftxl\" (UID: \"da5e0166-d811-4dcd-9230-976dd1893c11\") " pod="openstack/nova-api-db-create-8ftxl" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.057650 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-w2s7h"] Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.058933 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-w2s7h" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.066728 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-8ftxl" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.067548 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-w2s7h"] Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.099245 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6d2b\" (UniqueName: \"kubernetes.io/projected/48e0d5d9-1d58-41a5-b740-8c8286edec31-kube-api-access-n6d2b\") pod \"nova-api-6e67-account-create-update-lk6cv\" (UID: \"48e0d5d9-1d58-41a5-b740-8c8286edec31\") " pod="openstack/nova-api-6e67-account-create-update-lk6cv" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.099316 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48e0d5d9-1d58-41a5-b740-8c8286edec31-operator-scripts\") pod \"nova-api-6e67-account-create-update-lk6cv\" (UID: \"48e0d5d9-1d58-41a5-b740-8c8286edec31\") " pod="openstack/nova-api-6e67-account-create-update-lk6cv" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.099543 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f04e1699-2be0-4dca-8e4a-73035fde359f-operator-scripts\") pod \"nova-cell0-db-create-nxd5j\" (UID: \"f04e1699-2be0-4dca-8e4a-73035fde359f\") " pod="openstack/nova-cell0-db-create-nxd5j" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.099659 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-254j4\" (UniqueName: \"kubernetes.io/projected/f04e1699-2be0-4dca-8e4a-73035fde359f-kube-api-access-254j4\") pod \"nova-cell0-db-create-nxd5j\" (UID: \"f04e1699-2be0-4dca-8e4a-73035fde359f\") " pod="openstack/nova-cell0-db-create-nxd5j" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.100844 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f04e1699-2be0-4dca-8e4a-73035fde359f-operator-scripts\") pod \"nova-cell0-db-create-nxd5j\" (UID: \"f04e1699-2be0-4dca-8e4a-73035fde359f\") " pod="openstack/nova-cell0-db-create-nxd5j" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.151153 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-254j4\" (UniqueName: \"kubernetes.io/projected/f04e1699-2be0-4dca-8e4a-73035fde359f-kube-api-access-254j4\") pod \"nova-cell0-db-create-nxd5j\" (UID: \"f04e1699-2be0-4dca-8e4a-73035fde359f\") " pod="openstack/nova-cell0-db-create-nxd5j" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.159710 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.164529 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-nxd5j" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.166976 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-841f-account-create-update-swd7q"] Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.168281 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-841f-account-create-update-swd7q" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.181549 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-841f-account-create-update-swd7q"] Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.184745 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.201505 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/987d17ad-1427-4709-b5db-19fbb00e8a7c-operator-scripts\") pod \"nova-cell1-db-create-w2s7h\" (UID: \"987d17ad-1427-4709-b5db-19fbb00e8a7c\") " pod="openstack/nova-cell1-db-create-w2s7h" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.201565 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svb2k\" (UniqueName: \"kubernetes.io/projected/987d17ad-1427-4709-b5db-19fbb00e8a7c-kube-api-access-svb2k\") pod \"nova-cell1-db-create-w2s7h\" (UID: \"987d17ad-1427-4709-b5db-19fbb00e8a7c\") " pod="openstack/nova-cell1-db-create-w2s7h" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.201638 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6d2b\" (UniqueName: \"kubernetes.io/projected/48e0d5d9-1d58-41a5-b740-8c8286edec31-kube-api-access-n6d2b\") pod \"nova-api-6e67-account-create-update-lk6cv\" (UID: \"48e0d5d9-1d58-41a5-b740-8c8286edec31\") " pod="openstack/nova-api-6e67-account-create-update-lk6cv" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.201661 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48e0d5d9-1d58-41a5-b740-8c8286edec31-operator-scripts\") pod \"nova-api-6e67-account-create-update-lk6cv\" (UID: \"48e0d5d9-1d58-41a5-b740-8c8286edec31\") " pod="openstack/nova-api-6e67-account-create-update-lk6cv" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.202470 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48e0d5d9-1d58-41a5-b740-8c8286edec31-operator-scripts\") pod \"nova-api-6e67-account-create-update-lk6cv\" (UID: \"48e0d5d9-1d58-41a5-b740-8c8286edec31\") " pod="openstack/nova-api-6e67-account-create-update-lk6cv" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.218686 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.233114 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6d2b\" (UniqueName: \"kubernetes.io/projected/48e0d5d9-1d58-41a5-b740-8c8286edec31-kube-api-access-n6d2b\") pod \"nova-api-6e67-account-create-update-lk6cv\" (UID: \"48e0d5d9-1d58-41a5-b740-8c8286edec31\") " pod="openstack/nova-api-6e67-account-create-update-lk6cv" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.296806 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.302795 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-6e67-account-create-update-lk6cv" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.304206 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1703853-2754-4348-8c45-dcd98ff5d429-operator-scripts\") pod \"nova-cell0-841f-account-create-update-swd7q\" (UID: \"d1703853-2754-4348-8c45-dcd98ff5d429\") " pod="openstack/nova-cell0-841f-account-create-update-swd7q" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.304257 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/987d17ad-1427-4709-b5db-19fbb00e8a7c-operator-scripts\") pod \"nova-cell1-db-create-w2s7h\" (UID: \"987d17ad-1427-4709-b5db-19fbb00e8a7c\") " pod="openstack/nova-cell1-db-create-w2s7h" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.304290 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svb2k\" (UniqueName: \"kubernetes.io/projected/987d17ad-1427-4709-b5db-19fbb00e8a7c-kube-api-access-svb2k\") pod \"nova-cell1-db-create-w2s7h\" (UID: \"987d17ad-1427-4709-b5db-19fbb00e8a7c\") " pod="openstack/nova-cell1-db-create-w2s7h" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.304306 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w8q5\" (UniqueName: \"kubernetes.io/projected/d1703853-2754-4348-8c45-dcd98ff5d429-kube-api-access-4w8q5\") pod \"nova-cell0-841f-account-create-update-swd7q\" (UID: \"d1703853-2754-4348-8c45-dcd98ff5d429\") " pod="openstack/nova-cell0-841f-account-create-update-swd7q" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.308701 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/987d17ad-1427-4709-b5db-19fbb00e8a7c-operator-scripts\") pod \"nova-cell1-db-create-w2s7h\" (UID: \"987d17ad-1427-4709-b5db-19fbb00e8a7c\") " pod="openstack/nova-cell1-db-create-w2s7h" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.359036 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-da99-account-create-update-4j7hb"] Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.362370 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-da99-account-create-update-4j7hb" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.368028 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svb2k\" (UniqueName: \"kubernetes.io/projected/987d17ad-1427-4709-b5db-19fbb00e8a7c-kube-api-access-svb2k\") pod \"nova-cell1-db-create-w2s7h\" (UID: \"987d17ad-1427-4709-b5db-19fbb00e8a7c\") " pod="openstack/nova-cell1-db-create-w2s7h" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.372787 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.374784 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-da99-account-create-update-4j7hb"] Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.408464 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1703853-2754-4348-8c45-dcd98ff5d429-operator-scripts\") pod \"nova-cell0-841f-account-create-update-swd7q\" (UID: \"d1703853-2754-4348-8c45-dcd98ff5d429\") " pod="openstack/nova-cell0-841f-account-create-update-swd7q" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.408804 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w8q5\" (UniqueName: \"kubernetes.io/projected/d1703853-2754-4348-8c45-dcd98ff5d429-kube-api-access-4w8q5\") pod \"nova-cell0-841f-account-create-update-swd7q\" (UID: \"d1703853-2754-4348-8c45-dcd98ff5d429\") " pod="openstack/nova-cell0-841f-account-create-update-swd7q" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.409713 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1703853-2754-4348-8c45-dcd98ff5d429-operator-scripts\") pod \"nova-cell0-841f-account-create-update-swd7q\" (UID: \"d1703853-2754-4348-8c45-dcd98ff5d429\") " pod="openstack/nova-cell0-841f-account-create-update-swd7q" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.427485 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w8q5\" (UniqueName: \"kubernetes.io/projected/d1703853-2754-4348-8c45-dcd98ff5d429-kube-api-access-4w8q5\") pod \"nova-cell0-841f-account-create-update-swd7q\" (UID: \"d1703853-2754-4348-8c45-dcd98ff5d429\") " pod="openstack/nova-cell0-841f-account-create-update-swd7q" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.510431 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t982\" (UniqueName: \"kubernetes.io/projected/1f19eb06-d11c-409b-8b7e-516c9a5db815-kube-api-access-9t982\") pod \"nova-cell1-da99-account-create-update-4j7hb\" (UID: \"1f19eb06-d11c-409b-8b7e-516c9a5db815\") " pod="openstack/nova-cell1-da99-account-create-update-4j7hb" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.510586 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f19eb06-d11c-409b-8b7e-516c9a5db815-operator-scripts\") pod \"nova-cell1-da99-account-create-update-4j7hb\" (UID: \"1f19eb06-d11c-409b-8b7e-516c9a5db815\") " pod="openstack/nova-cell1-da99-account-create-update-4j7hb" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.612723 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9t982\" (UniqueName: \"kubernetes.io/projected/1f19eb06-d11c-409b-8b7e-516c9a5db815-kube-api-access-9t982\") pod \"nova-cell1-da99-account-create-update-4j7hb\" (UID: \"1f19eb06-d11c-409b-8b7e-516c9a5db815\") " pod="openstack/nova-cell1-da99-account-create-update-4j7hb" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.612836 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f19eb06-d11c-409b-8b7e-516c9a5db815-operator-scripts\") pod \"nova-cell1-da99-account-create-update-4j7hb\" (UID: \"1f19eb06-d11c-409b-8b7e-516c9a5db815\") " pod="openstack/nova-cell1-da99-account-create-update-4j7hb" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.613848 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f19eb06-d11c-409b-8b7e-516c9a5db815-operator-scripts\") pod \"nova-cell1-da99-account-create-update-4j7hb\" (UID: \"1f19eb06-d11c-409b-8b7e-516c9a5db815\") " pod="openstack/nova-cell1-da99-account-create-update-4j7hb" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.631250 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t982\" (UniqueName: \"kubernetes.io/projected/1f19eb06-d11c-409b-8b7e-516c9a5db815-kube-api-access-9t982\") pod \"nova-cell1-da99-account-create-update-4j7hb\" (UID: \"1f19eb06-d11c-409b-8b7e-516c9a5db815\") " pod="openstack/nova-cell1-da99-account-create-update-4j7hb" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.660596 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-w2s7h" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.695865 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-841f-account-create-update-swd7q" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.732391 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-da99-account-create-update-4j7hb" Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.808326 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-8ftxl"] Feb 19 15:30:30 crc kubenswrapper[4810]: I0219 15:30:30.927863 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-nxd5j"] Feb 19 15:30:31 crc kubenswrapper[4810]: I0219 15:30:31.028733 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6e67-account-create-update-lk6cv"] Feb 19 15:30:31 crc kubenswrapper[4810]: W0219 15:30:31.038588 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48e0d5d9_1d58_41a5_b740_8c8286edec31.slice/crio-9a0a850973879b8c16a53cdf917d4a3ec51f5d1f0bea3d2e4973de66d44880c7 WatchSource:0}: Error finding container 9a0a850973879b8c16a53cdf917d4a3ec51f5d1f0bea3d2e4973de66d44880c7: Status 404 returned error can't find the container with id 9a0a850973879b8c16a53cdf917d4a3ec51f5d1f0bea3d2e4973de66d44880c7 Feb 19 15:30:31 crc kubenswrapper[4810]: I0219 15:30:31.175660 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-8ftxl" event={"ID":"da5e0166-d811-4dcd-9230-976dd1893c11","Type":"ContainerStarted","Data":"ea790df5d5f36cd5c53b9c8735a8d922caa18ea278fcafd7ab38eca09a4e4d29"} Feb 19 15:30:31 crc kubenswrapper[4810]: I0219 15:30:31.181739 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e03c99b3-d5d6-479a-9b45-045bda62be1e","Type":"ContainerStarted","Data":"d0173ca27d32542d93975f960f3c985b101c2043974bb22943e94cfa2c3990e5"} Feb 19 15:30:31 crc kubenswrapper[4810]: I0219 15:30:31.181973 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e03c99b3-d5d6-479a-9b45-045bda62be1e" containerName="ceilometer-central-agent" containerID="cri-o://922169bd474755c022d1869b56b5508937a05f190fe4b150d2505095771f9d93" gracePeriod=30 Feb 19 15:30:31 crc kubenswrapper[4810]: I0219 15:30:31.182061 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 15:30:31 crc kubenswrapper[4810]: I0219 15:30:31.182259 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e03c99b3-d5d6-479a-9b45-045bda62be1e" containerName="proxy-httpd" containerID="cri-o://d0173ca27d32542d93975f960f3c985b101c2043974bb22943e94cfa2c3990e5" gracePeriod=30 Feb 19 15:30:31 crc kubenswrapper[4810]: I0219 15:30:31.182533 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e03c99b3-d5d6-479a-9b45-045bda62be1e" containerName="sg-core" containerID="cri-o://75f4bc2020206ce4f5c18d99d5247f762fe2e1e5470498d43115d0a1b1bc5184" gracePeriod=30 Feb 19 15:30:31 crc kubenswrapper[4810]: I0219 15:30:31.182520 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e03c99b3-d5d6-479a-9b45-045bda62be1e" containerName="ceilometer-notification-agent" containerID="cri-o://2d5f4181bcf65b08449001da5a939c392bf7453a34f071c7f2e2b29a55dbc3c9" gracePeriod=30 Feb 19 15:30:31 crc kubenswrapper[4810]: I0219 15:30:31.187775 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6e67-account-create-update-lk6cv" 
event={"ID":"48e0d5d9-1d58-41a5-b740-8c8286edec31","Type":"ContainerStarted","Data":"9a0a850973879b8c16a53cdf917d4a3ec51f5d1f0bea3d2e4973de66d44880c7"} Feb 19 15:30:31 crc kubenswrapper[4810]: I0219 15:30:31.195741 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nxd5j" event={"ID":"f04e1699-2be0-4dca-8e4a-73035fde359f","Type":"ContainerStarted","Data":"19ce66453832709a411387f21a96253dcdb5acebfcdfa40ee91e2d5e02a077dc"} Feb 19 15:30:31 crc kubenswrapper[4810]: I0219 15:30:31.215172 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.88857707 podStartE2EDuration="5.215151138s" podCreationTimestamp="2026-02-19 15:30:26 +0000 UTC" firstStartedPulling="2026-02-19 15:30:27.173386294 +0000 UTC m=+1256.655416418" lastFinishedPulling="2026-02-19 15:30:30.499960362 +0000 UTC m=+1259.981990486" observedRunningTime="2026-02-19 15:30:31.206837614 +0000 UTC m=+1260.688867738" watchObservedRunningTime="2026-02-19 15:30:31.215151138 +0000 UTC m=+1260.697181262" Feb 19 15:30:31 crc kubenswrapper[4810]: I0219 15:30:31.279682 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-w2s7h"] Feb 19 15:30:31 crc kubenswrapper[4810]: I0219 15:30:31.300717 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-841f-account-create-update-swd7q"] Feb 19 15:30:31 crc kubenswrapper[4810]: W0219 15:30:31.316591 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1703853_2754_4348_8c45_dcd98ff5d429.slice/crio-8b87b6cef7dd1cc9076c484a82249ed624df4e78c547c11a7768a92577f632cf WatchSource:0}: Error finding container 8b87b6cef7dd1cc9076c484a82249ed624df4e78c547c11a7768a92577f632cf: Status 404 returned error can't find the container with id 8b87b6cef7dd1cc9076c484a82249ed624df4e78c547c11a7768a92577f632cf Feb 19 15:30:31 crc kubenswrapper[4810]: I0219 15:30:31.494817 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-da99-account-create-update-4j7hb"] Feb 19 15:30:31 crc kubenswrapper[4810]: I0219 15:30:31.570915 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 19 15:30:32 crc kubenswrapper[4810]: I0219 15:30:32.217557 4810 generic.go:334] "Generic (PLEG): container finished" podID="da5e0166-d811-4dcd-9230-976dd1893c11" containerID="99cf896833f13eecd3fedefc31f58e2b88d17d37a7cb7ae1aea233b7d9a39af1" exitCode=0 Feb 19 15:30:32 crc kubenswrapper[4810]: I0219 15:30:32.217839 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-8ftxl" event={"ID":"da5e0166-d811-4dcd-9230-976dd1893c11","Type":"ContainerDied","Data":"99cf896833f13eecd3fedefc31f58e2b88d17d37a7cb7ae1aea233b7d9a39af1"} Feb 19 15:30:32 crc kubenswrapper[4810]: I0219 15:30:32.223776 4810 generic.go:334] "Generic (PLEG): container finished" podID="e03c99b3-d5d6-479a-9b45-045bda62be1e" containerID="d0173ca27d32542d93975f960f3c985b101c2043974bb22943e94cfa2c3990e5" exitCode=0 Feb 19 15:30:32 crc kubenswrapper[4810]: I0219 15:30:32.223815 4810 generic.go:334] "Generic (PLEG): container finished" podID="e03c99b3-d5d6-479a-9b45-045bda62be1e" containerID="75f4bc2020206ce4f5c18d99d5247f762fe2e1e5470498d43115d0a1b1bc5184" exitCode=2 Feb 19 15:30:32 crc kubenswrapper[4810]: I0219 15:30:32.223824 4810 generic.go:334] "Generic (PLEG): container finished" podID="e03c99b3-d5d6-479a-9b45-045bda62be1e" 
containerID="2d5f4181bcf65b08449001da5a939c392bf7453a34f071c7f2e2b29a55dbc3c9" exitCode=0 Feb 19 15:30:32 crc kubenswrapper[4810]: I0219 15:30:32.223859 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e03c99b3-d5d6-479a-9b45-045bda62be1e","Type":"ContainerDied","Data":"d0173ca27d32542d93975f960f3c985b101c2043974bb22943e94cfa2c3990e5"} Feb 19 15:30:32 crc kubenswrapper[4810]: I0219 15:30:32.223882 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e03c99b3-d5d6-479a-9b45-045bda62be1e","Type":"ContainerDied","Data":"75f4bc2020206ce4f5c18d99d5247f762fe2e1e5470498d43115d0a1b1bc5184"} Feb 19 15:30:32 crc kubenswrapper[4810]: I0219 15:30:32.223891 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e03c99b3-d5d6-479a-9b45-045bda62be1e","Type":"ContainerDied","Data":"2d5f4181bcf65b08449001da5a939c392bf7453a34f071c7f2e2b29a55dbc3c9"} Feb 19 15:30:32 crc kubenswrapper[4810]: I0219 15:30:32.228777 4810 generic.go:334] "Generic (PLEG): container finished" podID="d1703853-2754-4348-8c45-dcd98ff5d429" containerID="ead759deef71357ae0d9ddba72b509ea84ac0664aab15baecfde700a4dc84f66" exitCode=0 Feb 19 15:30:32 crc kubenswrapper[4810]: I0219 15:30:32.228832 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-841f-account-create-update-swd7q" event={"ID":"d1703853-2754-4348-8c45-dcd98ff5d429","Type":"ContainerDied","Data":"ead759deef71357ae0d9ddba72b509ea84ac0664aab15baecfde700a4dc84f66"} Feb 19 15:30:32 crc kubenswrapper[4810]: I0219 15:30:32.228855 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-841f-account-create-update-swd7q" event={"ID":"d1703853-2754-4348-8c45-dcd98ff5d429","Type":"ContainerStarted","Data":"8b87b6cef7dd1cc9076c484a82249ed624df4e78c547c11a7768a92577f632cf"} Feb 19 15:30:32 crc kubenswrapper[4810]: I0219 15:30:32.230398 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-da99-account-create-update-4j7hb" event={"ID":"1f19eb06-d11c-409b-8b7e-516c9a5db815","Type":"ContainerStarted","Data":"7cb43c21f053a8d03036f06cd4952d1e70925f82267f48d6f2c4959f93a370e5"} Feb 19 15:30:32 crc kubenswrapper[4810]: I0219 15:30:32.230424 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-da99-account-create-update-4j7hb" event={"ID":"1f19eb06-d11c-409b-8b7e-516c9a5db815","Type":"ContainerStarted","Data":"5b4c6c0a8687bad9f0c8df95b6b32fefcb3920c41eb3bf6e882a708592401faf"} Feb 19 15:30:32 crc kubenswrapper[4810]: I0219 15:30:32.234686 4810 generic.go:334] "Generic (PLEG): container finished" podID="48e0d5d9-1d58-41a5-b740-8c8286edec31" containerID="a519fabbf15898bb4c345dee03c392f33a7ca3106e889528c9a61a815ff5b000" exitCode=0 Feb 19 15:30:32 crc kubenswrapper[4810]: I0219 15:30:32.234731 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6e67-account-create-update-lk6cv" event={"ID":"48e0d5d9-1d58-41a5-b740-8c8286edec31","Type":"ContainerDied","Data":"a519fabbf15898bb4c345dee03c392f33a7ca3106e889528c9a61a815ff5b000"} Feb 19 15:30:32 crc kubenswrapper[4810]: I0219 15:30:32.255526 4810 generic.go:334] "Generic (PLEG): container finished" podID="f04e1699-2be0-4dca-8e4a-73035fde359f" containerID="2fa804cbc29144cbaa9d2e4c3f648166e91009da2ed6d113042e7022e9308b2c" exitCode=0 Feb 19 15:30:32 crc kubenswrapper[4810]: I0219 15:30:32.255595 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nxd5j" 
event={"ID":"f04e1699-2be0-4dca-8e4a-73035fde359f","Type":"ContainerDied","Data":"2fa804cbc29144cbaa9d2e4c3f648166e91009da2ed6d113042e7022e9308b2c"} Feb 19 15:30:32 crc kubenswrapper[4810]: I0219 15:30:32.267317 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-da99-account-create-update-4j7hb" podStartSLOduration=2.267295249 podStartE2EDuration="2.267295249s" podCreationTimestamp="2026-02-19 15:30:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:30:32.25390546 +0000 UTC m=+1261.735935584" watchObservedRunningTime="2026-02-19 15:30:32.267295249 +0000 UTC m=+1261.749325373" Feb 19 15:30:32 crc kubenswrapper[4810]: I0219 15:30:32.267761 4810 generic.go:334] "Generic (PLEG): container finished" podID="987d17ad-1427-4709-b5db-19fbb00e8a7c" containerID="d738b05f8038fce0f6f7dca977b306ea2c9695f4bc8b38cb001bb799b15410d3" exitCode=0 Feb 19 15:30:32 crc kubenswrapper[4810]: I0219 15:30:32.267926 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" containerName="watcher-decision-engine" containerID="cri-o://4e69a7ec2d6ffef7e8de5181b7cb6f418ce564ddea149f05d15804b56bd3283e" gracePeriod=30 Feb 19 15:30:32 crc kubenswrapper[4810]: I0219 15:30:32.268099 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w2s7h" event={"ID":"987d17ad-1427-4709-b5db-19fbb00e8a7c","Type":"ContainerDied","Data":"d738b05f8038fce0f6f7dca977b306ea2c9695f4bc8b38cb001bb799b15410d3"} Feb 19 15:30:32 crc kubenswrapper[4810]: I0219 15:30:32.268138 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w2s7h" event={"ID":"987d17ad-1427-4709-b5db-19fbb00e8a7c","Type":"ContainerStarted","Data":"8653c17c2feefd4ce9a0d05b4110170b6e33cc6884775621ff1b3f7d64b78a17"} Feb 19 15:30:32 crc kubenswrapper[4810]: I0219 15:30:32.312617 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Feb 19 15:30:32 crc kubenswrapper[4810]: I0219 15:30:32.312942 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" podUID="2ea855ba-523c-4143-8fe8-b0b1150299d0" containerName="watcher-applier" containerID="cri-o://9386dd3b59b24a748e770d6384d92f3e8aff8a701badb29067310dec0fb2fbb8" gracePeriod=30 Feb 19 15:30:32 crc kubenswrapper[4810]: I0219 15:30:32.399387 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 19 15:30:32 crc kubenswrapper[4810]: I0219 15:30:32.399824 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="c772672c-c983-42e8-ae77-bfc8484ad555" containerName="watcher-api-log" containerID="cri-o://a3586f026120011072c0978ee9277dd85cce4dc7aaed51f25a7bea5aff2ae758" gracePeriod=30 Feb 19 15:30:32 crc kubenswrapper[4810]: I0219 15:30:32.400254 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="c772672c-c983-42e8-ae77-bfc8484ad555" containerName="watcher-api" containerID="cri-o://d9c5f503f212d6aa9b0ddd5ddc6ff440b6c542c6960be28ea86ea203ab7ca92d" gracePeriod=30 Feb 19 15:30:32 crc kubenswrapper[4810]: E0219 15:30:32.893259 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, 
stdout: , stderr: , exit code -1" containerID="9386dd3b59b24a748e770d6384d92f3e8aff8a701badb29067310dec0fb2fbb8" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 19 15:30:32 crc kubenswrapper[4810]: E0219 15:30:32.897944 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9386dd3b59b24a748e770d6384d92f3e8aff8a701badb29067310dec0fb2fbb8" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 19 15:30:32 crc kubenswrapper[4810]: E0219 15:30:32.901313 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9386dd3b59b24a748e770d6384d92f3e8aff8a701badb29067310dec0fb2fbb8" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 19 15:30:32 crc kubenswrapper[4810]: E0219 15:30:32.901400 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="2ea855ba-523c-4143-8fe8-b0b1150299d0" containerName="watcher-applier" Feb 19 15:30:33 crc kubenswrapper[4810]: I0219 15:30:33.299189 4810 generic.go:334] "Generic (PLEG): container finished" podID="1f19eb06-d11c-409b-8b7e-516c9a5db815" containerID="7cb43c21f053a8d03036f06cd4952d1e70925f82267f48d6f2c4959f93a370e5" exitCode=0 Feb 19 15:30:33 crc kubenswrapper[4810]: I0219 15:30:33.299418 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-da99-account-create-update-4j7hb" event={"ID":"1f19eb06-d11c-409b-8b7e-516c9a5db815","Type":"ContainerDied","Data":"7cb43c21f053a8d03036f06cd4952d1e70925f82267f48d6f2c4959f93a370e5"} Feb 19 15:30:33 crc kubenswrapper[4810]: I0219 15:30:33.313561 4810 generic.go:334] "Generic (PLEG): container finished" podID="2ea855ba-523c-4143-8fe8-b0b1150299d0" containerID="9386dd3b59b24a748e770d6384d92f3e8aff8a701badb29067310dec0fb2fbb8" exitCode=0 Feb 19 15:30:33 crc kubenswrapper[4810]: I0219 15:30:33.313648 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"2ea855ba-523c-4143-8fe8-b0b1150299d0","Type":"ContainerDied","Data":"9386dd3b59b24a748e770d6384d92f3e8aff8a701badb29067310dec0fb2fbb8"} Feb 19 15:30:33 crc kubenswrapper[4810]: I0219 15:30:33.327823 4810 generic.go:334] "Generic (PLEG): container finished" podID="c772672c-c983-42e8-ae77-bfc8484ad555" containerID="a3586f026120011072c0978ee9277dd85cce4dc7aaed51f25a7bea5aff2ae758" exitCode=143 Feb 19 15:30:33 crc kubenswrapper[4810]: I0219 15:30:33.328040 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c772672c-c983-42e8-ae77-bfc8484ad555","Type":"ContainerDied","Data":"a3586f026120011072c0978ee9277dd85cce4dc7aaed51f25a7bea5aff2ae758"} Feb 19 15:30:33 crc kubenswrapper[4810]: I0219 15:30:33.497603 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Feb 19 15:30:33 crc kubenswrapper[4810]: I0219 15:30:33.580456 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea855ba-523c-4143-8fe8-b0b1150299d0-combined-ca-bundle\") pod \"2ea855ba-523c-4143-8fe8-b0b1150299d0\" (UID: \"2ea855ba-523c-4143-8fe8-b0b1150299d0\") " Feb 19 15:30:33 crc kubenswrapper[4810]: I0219 15:30:33.580510 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ea855ba-523c-4143-8fe8-b0b1150299d0-logs\") pod \"2ea855ba-523c-4143-8fe8-b0b1150299d0\" (UID: \"2ea855ba-523c-4143-8fe8-b0b1150299d0\") " Feb 19 15:30:33 crc kubenswrapper[4810]: I0219 15:30:33.580636 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ea855ba-523c-4143-8fe8-b0b1150299d0-config-data\") pod \"2ea855ba-523c-4143-8fe8-b0b1150299d0\" (UID: \"2ea855ba-523c-4143-8fe8-b0b1150299d0\") " Feb 19 15:30:33 crc kubenswrapper[4810]: I0219 15:30:33.580774 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f72k9\" (UniqueName: \"kubernetes.io/projected/2ea855ba-523c-4143-8fe8-b0b1150299d0-kube-api-access-f72k9\") pod \"2ea855ba-523c-4143-8fe8-b0b1150299d0\" (UID: \"2ea855ba-523c-4143-8fe8-b0b1150299d0\") " Feb 19 15:30:33 crc kubenswrapper[4810]: I0219 15:30:33.582682 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ea855ba-523c-4143-8fe8-b0b1150299d0-logs" (OuterVolumeSpecName: "logs") pod "2ea855ba-523c-4143-8fe8-b0b1150299d0" (UID: "2ea855ba-523c-4143-8fe8-b0b1150299d0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:30:33 crc kubenswrapper[4810]: I0219 15:30:33.615224 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ea855ba-523c-4143-8fe8-b0b1150299d0-kube-api-access-f72k9" (OuterVolumeSpecName: "kube-api-access-f72k9") pod "2ea855ba-523c-4143-8fe8-b0b1150299d0" (UID: "2ea855ba-523c-4143-8fe8-b0b1150299d0"). InnerVolumeSpecName "kube-api-access-f72k9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:30:33 crc kubenswrapper[4810]: I0219 15:30:33.624006 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea855ba-523c-4143-8fe8-b0b1150299d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ea855ba-523c-4143-8fe8-b0b1150299d0" (UID: "2ea855ba-523c-4143-8fe8-b0b1150299d0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:33 crc kubenswrapper[4810]: I0219 15:30:33.692742 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f72k9\" (UniqueName: \"kubernetes.io/projected/2ea855ba-523c-4143-8fe8-b0b1150299d0-kube-api-access-f72k9\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:33 crc kubenswrapper[4810]: I0219 15:30:33.693194 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea855ba-523c-4143-8fe8-b0b1150299d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:33 crc kubenswrapper[4810]: I0219 15:30:33.693211 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ea855ba-523c-4143-8fe8-b0b1150299d0-logs\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:33 crc kubenswrapper[4810]: I0219 15:30:33.728780 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea855ba-523c-4143-8fe8-b0b1150299d0-config-data" (OuterVolumeSpecName: "config-data") pod "2ea855ba-523c-4143-8fe8-b0b1150299d0" (UID: "2ea855ba-523c-4143-8fe8-b0b1150299d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:33 crc kubenswrapper[4810]: I0219 15:30:33.762701 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-8ftxl" Feb 19 15:30:33 crc kubenswrapper[4810]: I0219 15:30:33.795075 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ea855ba-523c-4143-8fe8-b0b1150299d0-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:33 crc kubenswrapper[4810]: I0219 15:30:33.897379 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da5e0166-d811-4dcd-9230-976dd1893c11-operator-scripts\") pod \"da5e0166-d811-4dcd-9230-976dd1893c11\" (UID: \"da5e0166-d811-4dcd-9230-976dd1893c11\") " Feb 19 15:30:33 crc kubenswrapper[4810]: I0219 15:30:33.897756 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxxfl\" (UniqueName: \"kubernetes.io/projected/da5e0166-d811-4dcd-9230-976dd1893c11-kube-api-access-sxxfl\") pod \"da5e0166-d811-4dcd-9230-976dd1893c11\" (UID: \"da5e0166-d811-4dcd-9230-976dd1893c11\") " Feb 19 15:30:33 crc kubenswrapper[4810]: I0219 15:30:33.899157 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da5e0166-d811-4dcd-9230-976dd1893c11-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "da5e0166-d811-4dcd-9230-976dd1893c11" (UID: "da5e0166-d811-4dcd-9230-976dd1893c11"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:30:33 crc kubenswrapper[4810]: I0219 15:30:33.904502 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da5e0166-d811-4dcd-9230-976dd1893c11-kube-api-access-sxxfl" (OuterVolumeSpecName: "kube-api-access-sxxfl") pod "da5e0166-d811-4dcd-9230-976dd1893c11" (UID: "da5e0166-d811-4dcd-9230-976dd1893c11"). InnerVolumeSpecName "kube-api-access-sxxfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:30:33 crc kubenswrapper[4810]: I0219 15:30:33.929140 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-nxd5j" Feb 19 15:30:33 crc kubenswrapper[4810]: I0219 15:30:33.947820 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-w2s7h" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.000865 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f04e1699-2be0-4dca-8e4a-73035fde359f-operator-scripts\") pod \"f04e1699-2be0-4dca-8e4a-73035fde359f\" (UID: \"f04e1699-2be0-4dca-8e4a-73035fde359f\") " Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.001188 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-254j4\" (UniqueName: \"kubernetes.io/projected/f04e1699-2be0-4dca-8e4a-73035fde359f-kube-api-access-254j4\") pod \"f04e1699-2be0-4dca-8e4a-73035fde359f\" (UID: \"f04e1699-2be0-4dca-8e4a-73035fde359f\") " Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.001891 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxxfl\" (UniqueName: \"kubernetes.io/projected/da5e0166-d811-4dcd-9230-976dd1893c11-kube-api-access-sxxfl\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.001968 4810 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da5e0166-d811-4dcd-9230-976dd1893c11-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.005491 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f04e1699-2be0-4dca-8e4a-73035fde359f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f04e1699-2be0-4dca-8e4a-73035fde359f" (UID: "f04e1699-2be0-4dca-8e4a-73035fde359f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.005661 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f04e1699-2be0-4dca-8e4a-73035fde359f-kube-api-access-254j4" (OuterVolumeSpecName: "kube-api-access-254j4") pod "f04e1699-2be0-4dca-8e4a-73035fde359f" (UID: "f04e1699-2be0-4dca-8e4a-73035fde359f"). InnerVolumeSpecName "kube-api-access-254j4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.007653 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-841f-account-create-update-swd7q" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.031913 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6e67-account-create-update-lk6cv" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.041236 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.103270 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w8q5\" (UniqueName: \"kubernetes.io/projected/d1703853-2754-4348-8c45-dcd98ff5d429-kube-api-access-4w8q5\") pod \"d1703853-2754-4348-8c45-dcd98ff5d429\" (UID: \"d1703853-2754-4348-8c45-dcd98ff5d429\") " Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.103400 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1703853-2754-4348-8c45-dcd98ff5d429-operator-scripts\") pod \"d1703853-2754-4348-8c45-dcd98ff5d429\" (UID: \"d1703853-2754-4348-8c45-dcd98ff5d429\") " Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.103458 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svb2k\" (UniqueName: \"kubernetes.io/projected/987d17ad-1427-4709-b5db-19fbb00e8a7c-kube-api-access-svb2k\") pod \"987d17ad-1427-4709-b5db-19fbb00e8a7c\" (UID: \"987d17ad-1427-4709-b5db-19fbb00e8a7c\") " Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.103523 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/987d17ad-1427-4709-b5db-19fbb00e8a7c-operator-scripts\") pod \"987d17ad-1427-4709-b5db-19fbb00e8a7c\" (UID: \"987d17ad-1427-4709-b5db-19fbb00e8a7c\") " Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.103912 4810 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f04e1699-2be0-4dca-8e4a-73035fde359f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.103928 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-254j4\" (UniqueName: \"kubernetes.io/projected/f04e1699-2be0-4dca-8e4a-73035fde359f-kube-api-access-254j4\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.104161 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1703853-2754-4348-8c45-dcd98ff5d429-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d1703853-2754-4348-8c45-dcd98ff5d429" (UID: "d1703853-2754-4348-8c45-dcd98ff5d429"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.104348 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/987d17ad-1427-4709-b5db-19fbb00e8a7c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "987d17ad-1427-4709-b5db-19fbb00e8a7c" (UID: "987d17ad-1427-4709-b5db-19fbb00e8a7c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.109487 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/987d17ad-1427-4709-b5db-19fbb00e8a7c-kube-api-access-svb2k" (OuterVolumeSpecName: "kube-api-access-svb2k") pod "987d17ad-1427-4709-b5db-19fbb00e8a7c" (UID: "987d17ad-1427-4709-b5db-19fbb00e8a7c"). InnerVolumeSpecName "kube-api-access-svb2k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.111151 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1703853-2754-4348-8c45-dcd98ff5d429-kube-api-access-4w8q5" (OuterVolumeSpecName: "kube-api-access-4w8q5") pod "d1703853-2754-4348-8c45-dcd98ff5d429" (UID: "d1703853-2754-4348-8c45-dcd98ff5d429"). InnerVolumeSpecName "kube-api-access-4w8q5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.205172 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-public-tls-certs\") pod \"c772672c-c983-42e8-ae77-bfc8484ad555\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.205394 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-config-data\") pod \"c772672c-c983-42e8-ae77-bfc8484ad555\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.205455 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c772672c-c983-42e8-ae77-bfc8484ad555-logs\") pod \"c772672c-c983-42e8-ae77-bfc8484ad555\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.205480 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-combined-ca-bundle\") pod \"c772672c-c983-42e8-ae77-bfc8484ad555\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.205507 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx6gm\" (UniqueName: \"kubernetes.io/projected/c772672c-c983-42e8-ae77-bfc8484ad555-kube-api-access-vx6gm\") pod \"c772672c-c983-42e8-ae77-bfc8484ad555\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.205558 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6d2b\" (UniqueName: \"kubernetes.io/projected/48e0d5d9-1d58-41a5-b740-8c8286edec31-kube-api-access-n6d2b\") pod \"48e0d5d9-1d58-41a5-b740-8c8286edec31\" (UID: \"48e0d5d9-1d58-41a5-b740-8c8286edec31\") " Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.205604 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48e0d5d9-1d58-41a5-b740-8c8286edec31-operator-scripts\") pod \"48e0d5d9-1d58-41a5-b740-8c8286edec31\" (UID: \"48e0d5d9-1d58-41a5-b740-8c8286edec31\") " Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.205636 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-custom-prometheus-ca\") pod \"c772672c-c983-42e8-ae77-bfc8484ad555\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.205681 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-internal-tls-certs\") pod \"c772672c-c983-42e8-ae77-bfc8484ad555\" (UID: \"c772672c-c983-42e8-ae77-bfc8484ad555\") " Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.206167 4810 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1703853-2754-4348-8c45-dcd98ff5d429-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.206194 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svb2k\" (UniqueName: \"kubernetes.io/projected/987d17ad-1427-4709-b5db-19fbb00e8a7c-kube-api-access-svb2k\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.206208 4810 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/987d17ad-1427-4709-b5db-19fbb00e8a7c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.206219 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w8q5\" (UniqueName: \"kubernetes.io/projected/d1703853-2754-4348-8c45-dcd98ff5d429-kube-api-access-4w8q5\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.207184 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48e0d5d9-1d58-41a5-b740-8c8286edec31-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "48e0d5d9-1d58-41a5-b740-8c8286edec31" (UID: "48e0d5d9-1d58-41a5-b740-8c8286edec31"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.208378 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c772672c-c983-42e8-ae77-bfc8484ad555-logs" (OuterVolumeSpecName: "logs") pod "c772672c-c983-42e8-ae77-bfc8484ad555" (UID: "c772672c-c983-42e8-ae77-bfc8484ad555"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.224519 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48e0d5d9-1d58-41a5-b740-8c8286edec31-kube-api-access-n6d2b" (OuterVolumeSpecName: "kube-api-access-n6d2b") pod "48e0d5d9-1d58-41a5-b740-8c8286edec31" (UID: "48e0d5d9-1d58-41a5-b740-8c8286edec31"). InnerVolumeSpecName "kube-api-access-n6d2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.224854 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c772672c-c983-42e8-ae77-bfc8484ad555-kube-api-access-vx6gm" (OuterVolumeSpecName: "kube-api-access-vx6gm") pod "c772672c-c983-42e8-ae77-bfc8484ad555" (UID: "c772672c-c983-42e8-ae77-bfc8484ad555"). InnerVolumeSpecName "kube-api-access-vx6gm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.265113 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c772672c-c983-42e8-ae77-bfc8484ad555" (UID: "c772672c-c983-42e8-ae77-bfc8484ad555"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.270462 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "c772672c-c983-42e8-ae77-bfc8484ad555" (UID: "c772672c-c983-42e8-ae77-bfc8484ad555"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.279480 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c772672c-c983-42e8-ae77-bfc8484ad555" (UID: "c772672c-c983-42e8-ae77-bfc8484ad555"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.279830 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-config-data" (OuterVolumeSpecName: "config-data") pod "c772672c-c983-42e8-ae77-bfc8484ad555" (UID: "c772672c-c983-42e8-ae77-bfc8484ad555"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.302861 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c772672c-c983-42e8-ae77-bfc8484ad555" (UID: "c772672c-c983-42e8-ae77-bfc8484ad555"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.308251 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.308277 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c772672c-c983-42e8-ae77-bfc8484ad555-logs\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.308287 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.308299 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx6gm\" (UniqueName: \"kubernetes.io/projected/c772672c-c983-42e8-ae77-bfc8484ad555-kube-api-access-vx6gm\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.308308 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6d2b\" (UniqueName: \"kubernetes.io/projected/48e0d5d9-1d58-41a5-b740-8c8286edec31-kube-api-access-n6d2b\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.308317 4810 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48e0d5d9-1d58-41a5-b740-8c8286edec31-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.308346 4810 reconciler_common.go:293] "Volume detached for 
volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.308356 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.308363 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c772672c-c983-42e8-ae77-bfc8484ad555-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.338877 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"2ea855ba-523c-4143-8fe8-b0b1150299d0","Type":"ContainerDied","Data":"1d01a4c95612c0c4f1b7f9b7042052db9fc19c8db413469f648bf9735bce00e6"} Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.338917 4810 scope.go:117] "RemoveContainer" containerID="9386dd3b59b24a748e770d6384d92f3e8aff8a701badb29067310dec0fb2fbb8" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.339023 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.358531 4810 generic.go:334] "Generic (PLEG): container finished" podID="c772672c-c983-42e8-ae77-bfc8484ad555" containerID="d9c5f503f212d6aa9b0ddd5ddc6ff440b6c542c6960be28ea86ea203ab7ca92d" exitCode=0 Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.358668 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.360540 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c772672c-c983-42e8-ae77-bfc8484ad555","Type":"ContainerDied","Data":"d9c5f503f212d6aa9b0ddd5ddc6ff440b6c542c6960be28ea86ea203ab7ca92d"} Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.360581 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c772672c-c983-42e8-ae77-bfc8484ad555","Type":"ContainerDied","Data":"e3006b677518772d18ede2c0df9a671f7e5d00f39c12f380e32799bbd51a8cab"} Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.367509 4810 generic.go:334] "Generic (PLEG): container finished" podID="3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" containerID="4e69a7ec2d6ffef7e8de5181b7cb6f418ce564ddea149f05d15804b56bd3283e" exitCode=0 Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.367563 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5","Type":"ContainerDied","Data":"4e69a7ec2d6ffef7e8de5181b7cb6f418ce564ddea149f05d15804b56bd3283e"} Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.370035 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-6e67-account-create-update-lk6cv" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.370063 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6e67-account-create-update-lk6cv" event={"ID":"48e0d5d9-1d58-41a5-b740-8c8286edec31","Type":"ContainerDied","Data":"9a0a850973879b8c16a53cdf917d4a3ec51f5d1f0bea3d2e4973de66d44880c7"} Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.370095 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a0a850973879b8c16a53cdf917d4a3ec51f5d1f0bea3d2e4973de66d44880c7" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.371485 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nxd5j" event={"ID":"f04e1699-2be0-4dca-8e4a-73035fde359f","Type":"ContainerDied","Data":"19ce66453832709a411387f21a96253dcdb5acebfcdfa40ee91e2d5e02a077dc"} Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.371510 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19ce66453832709a411387f21a96253dcdb5acebfcdfa40ee91e2d5e02a077dc" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.371546 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-nxd5j" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.375737 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w2s7h" event={"ID":"987d17ad-1427-4709-b5db-19fbb00e8a7c","Type":"ContainerDied","Data":"8653c17c2feefd4ce9a0d05b4110170b6e33cc6884775621ff1b3f7d64b78a17"} Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.375814 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8653c17c2feefd4ce9a0d05b4110170b6e33cc6884775621ff1b3f7d64b78a17" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.375844 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-w2s7h" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.376789 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-8ftxl" event={"ID":"da5e0166-d811-4dcd-9230-976dd1893c11","Type":"ContainerDied","Data":"ea790df5d5f36cd5c53b9c8735a8d922caa18ea278fcafd7ab38eca09a4e4d29"} Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.376809 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea790df5d5f36cd5c53b9c8735a8d922caa18ea278fcafd7ab38eca09a4e4d29" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.376860 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-8ftxl" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.380323 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-841f-account-create-update-swd7q" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.380371 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-841f-account-create-update-swd7q" event={"ID":"d1703853-2754-4348-8c45-dcd98ff5d429","Type":"ContainerDied","Data":"8b87b6cef7dd1cc9076c484a82249ed624df4e78c547c11a7768a92577f632cf"} Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.380392 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b87b6cef7dd1cc9076c484a82249ed624df4e78c547c11a7768a92577f632cf" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.520481 4810 scope.go:117] "RemoveContainer" containerID="d9c5f503f212d6aa9b0ddd5ddc6ff440b6c542c6960be28ea86ea203ab7ca92d" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.522968 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.553401 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.563877 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.585058 4810 scope.go:117] "RemoveContainer" containerID="a3586f026120011072c0978ee9277dd85cce4dc7aaed51f25a7bea5aff2ae758" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.585195 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"] Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.628690 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Feb 19 15:30:34 crc kubenswrapper[4810]: E0219 15:30:34.629114 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f04e1699-2be0-4dca-8e4a-73035fde359f" containerName="mariadb-database-create" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.629126 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04e1699-2be0-4dca-8e4a-73035fde359f" containerName="mariadb-database-create" Feb 19 15:30:34 crc kubenswrapper[4810]: E0219 15:30:34.629139 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da5e0166-d811-4dcd-9230-976dd1893c11" containerName="mariadb-database-create" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.629145 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="da5e0166-d811-4dcd-9230-976dd1893c11" containerName="mariadb-database-create" Feb 19 15:30:34 crc kubenswrapper[4810]: E0219 15:30:34.629152 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c772672c-c983-42e8-ae77-bfc8484ad555" containerName="watcher-api-log" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.629158 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c772672c-c983-42e8-ae77-bfc8484ad555" containerName="watcher-api-log" Feb 19 15:30:34 crc kubenswrapper[4810]: E0219 15:30:34.629167 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c772672c-c983-42e8-ae77-bfc8484ad555" containerName="watcher-api" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.629174 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c772672c-c983-42e8-ae77-bfc8484ad555" containerName="watcher-api" Feb 19 15:30:34 crc kubenswrapper[4810]: E0219 15:30:34.629190 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea855ba-523c-4143-8fe8-b0b1150299d0" containerName="watcher-applier" 
Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.629196 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea855ba-523c-4143-8fe8-b0b1150299d0" containerName="watcher-applier" Feb 19 15:30:34 crc kubenswrapper[4810]: E0219 15:30:34.629212 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="987d17ad-1427-4709-b5db-19fbb00e8a7c" containerName="mariadb-database-create" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.629217 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="987d17ad-1427-4709-b5db-19fbb00e8a7c" containerName="mariadb-database-create" Feb 19 15:30:34 crc kubenswrapper[4810]: E0219 15:30:34.629229 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48e0d5d9-1d58-41a5-b740-8c8286edec31" containerName="mariadb-account-create-update" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.629235 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="48e0d5d9-1d58-41a5-b740-8c8286edec31" containerName="mariadb-account-create-update" Feb 19 15:30:34 crc kubenswrapper[4810]: E0219 15:30:34.629246 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1703853-2754-4348-8c45-dcd98ff5d429" containerName="mariadb-account-create-update" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.629252 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1703853-2754-4348-8c45-dcd98ff5d429" containerName="mariadb-account-create-update" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.629537 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c772672c-c983-42e8-ae77-bfc8484ad555" containerName="watcher-api" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.629561 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="987d17ad-1427-4709-b5db-19fbb00e8a7c" containerName="mariadb-database-create" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.629572 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="48e0d5d9-1d58-41a5-b740-8c8286edec31" containerName="mariadb-account-create-update" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.629580 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c772672c-c983-42e8-ae77-bfc8484ad555" containerName="watcher-api-log" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.629587 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1703853-2754-4348-8c45-dcd98ff5d429" containerName="mariadb-account-create-update" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.629597 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="da5e0166-d811-4dcd-9230-976dd1893c11" containerName="mariadb-database-create" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.629604 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ea855ba-523c-4143-8fe8-b0b1150299d0" containerName="watcher-applier" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.629615 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f04e1699-2be0-4dca-8e4a-73035fde359f" containerName="mariadb-database-create" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.630631 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.638116 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.638302 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.646061 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.663550 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.688495 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.695876 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.700237 4810 scope.go:117] "RemoveContainer" containerID="d9c5f503f212d6aa9b0ddd5ddc6ff440b6c542c6960be28ea86ea203ab7ca92d" Feb 19 15:30:34 crc kubenswrapper[4810]: E0219 15:30:34.703188 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9c5f503f212d6aa9b0ddd5ddc6ff440b6c542c6960be28ea86ea203ab7ca92d\": container with ID starting with d9c5f503f212d6aa9b0ddd5ddc6ff440b6c542c6960be28ea86ea203ab7ca92d not found: ID does not exist" containerID="d9c5f503f212d6aa9b0ddd5ddc6ff440b6c542c6960be28ea86ea203ab7ca92d" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.703236 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9c5f503f212d6aa9b0ddd5ddc6ff440b6c542c6960be28ea86ea203ab7ca92d"} err="failed to get container status \"d9c5f503f212d6aa9b0ddd5ddc6ff440b6c542c6960be28ea86ea203ab7ca92d\": rpc error: code = NotFound desc = could not find container \"d9c5f503f212d6aa9b0ddd5ddc6ff440b6c542c6960be28ea86ea203ab7ca92d\": container with ID starting with d9c5f503f212d6aa9b0ddd5ddc6ff440b6c542c6960be28ea86ea203ab7ca92d not found: ID does not exist" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.703264 4810 scope.go:117] "RemoveContainer" containerID="a3586f026120011072c0978ee9277dd85cce4dc7aaed51f25a7bea5aff2ae758" Feb 19 15:30:34 crc kubenswrapper[4810]: E0219 15:30:34.704360 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3586f026120011072c0978ee9277dd85cce4dc7aaed51f25a7bea5aff2ae758\": container with ID starting with a3586f026120011072c0978ee9277dd85cce4dc7aaed51f25a7bea5aff2ae758 not found: ID does not exist" containerID="a3586f026120011072c0978ee9277dd85cce4dc7aaed51f25a7bea5aff2ae758" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.704383 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3586f026120011072c0978ee9277dd85cce4dc7aaed51f25a7bea5aff2ae758"} err="failed to get container status \"a3586f026120011072c0978ee9277dd85cce4dc7aaed51f25a7bea5aff2ae758\": rpc error: code = NotFound desc = could not find container \"a3586f026120011072c0978ee9277dd85cce4dc7aaed51f25a7bea5aff2ae758\": container with ID starting with a3586f026120011072c0978ee9277dd85cce4dc7aaed51f25a7bea5aff2ae758 not found: ID does not exist" Feb 19 15:30:34 crc 
kubenswrapper[4810]: I0219 15:30:34.704395 4810 scope.go:117] "RemoveContainer" containerID="a8232fee767eaee3fa026223e20673e25fa6d954ddb913e148bf6e0e435de416" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.722147 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c11a7f60-4839-44aa-8615-98de657221f4-logs\") pod \"watcher-api-0\" (UID: \"c11a7f60-4839-44aa-8615-98de657221f4\") " pod="openstack/watcher-api-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.722191 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f3ef20-3f3d-4fa2-8888-36d421303dfd-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"79f3ef20-3f3d-4fa2-8888-36d421303dfd\") " pod="openstack/watcher-applier-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.722237 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c11a7f60-4839-44aa-8615-98de657221f4-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"c11a7f60-4839-44aa-8615-98de657221f4\") " pod="openstack/watcher-api-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.722266 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c11a7f60-4839-44aa-8615-98de657221f4-public-tls-certs\") pod \"watcher-api-0\" (UID: \"c11a7f60-4839-44aa-8615-98de657221f4\") " pod="openstack/watcher-api-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.722287 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79f3ef20-3f3d-4fa2-8888-36d421303dfd-logs\") pod \"watcher-applier-0\" (UID: \"79f3ef20-3f3d-4fa2-8888-36d421303dfd\") " pod="openstack/watcher-applier-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.722308 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11a7f60-4839-44aa-8615-98de657221f4-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"c11a7f60-4839-44aa-8615-98de657221f4\") " pod="openstack/watcher-api-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.722355 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jq8p\" (UniqueName: \"kubernetes.io/projected/79f3ef20-3f3d-4fa2-8888-36d421303dfd-kube-api-access-4jq8p\") pod \"watcher-applier-0\" (UID: \"79f3ef20-3f3d-4fa2-8888-36d421303dfd\") " pod="openstack/watcher-applier-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.722401 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5kzq\" (UniqueName: \"kubernetes.io/projected/c11a7f60-4839-44aa-8615-98de657221f4-kube-api-access-h5kzq\") pod \"watcher-api-0\" (UID: \"c11a7f60-4839-44aa-8615-98de657221f4\") " pod="openstack/watcher-api-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.722428 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c11a7f60-4839-44aa-8615-98de657221f4-internal-tls-certs\") pod \"watcher-api-0\" (UID: 
\"c11a7f60-4839-44aa-8615-98de657221f4\") " pod="openstack/watcher-api-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.722466 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11a7f60-4839-44aa-8615-98de657221f4-config-data\") pod \"watcher-api-0\" (UID: \"c11a7f60-4839-44aa-8615-98de657221f4\") " pod="openstack/watcher-api-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.722489 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f3ef20-3f3d-4fa2-8888-36d421303dfd-config-data\") pod \"watcher-applier-0\" (UID: \"79f3ef20-3f3d-4fa2-8888-36d421303dfd\") " pod="openstack/watcher-applier-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.733792 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.749371 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.780559 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.824418 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11a7f60-4839-44aa-8615-98de657221f4-config-data\") pod \"watcher-api-0\" (UID: \"c11a7f60-4839-44aa-8615-98de657221f4\") " pod="openstack/watcher-api-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.824486 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f3ef20-3f3d-4fa2-8888-36d421303dfd-config-data\") pod \"watcher-applier-0\" (UID: \"79f3ef20-3f3d-4fa2-8888-36d421303dfd\") " pod="openstack/watcher-applier-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.824533 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c11a7f60-4839-44aa-8615-98de657221f4-logs\") pod \"watcher-api-0\" (UID: \"c11a7f60-4839-44aa-8615-98de657221f4\") " pod="openstack/watcher-api-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.824557 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f3ef20-3f3d-4fa2-8888-36d421303dfd-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"79f3ef20-3f3d-4fa2-8888-36d421303dfd\") " pod="openstack/watcher-applier-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.824594 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c11a7f60-4839-44aa-8615-98de657221f4-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"c11a7f60-4839-44aa-8615-98de657221f4\") " pod="openstack/watcher-api-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.824619 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c11a7f60-4839-44aa-8615-98de657221f4-public-tls-certs\") pod \"watcher-api-0\" (UID: \"c11a7f60-4839-44aa-8615-98de657221f4\") " pod="openstack/watcher-api-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.824635 4810 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79f3ef20-3f3d-4fa2-8888-36d421303dfd-logs\") pod \"watcher-applier-0\" (UID: \"79f3ef20-3f3d-4fa2-8888-36d421303dfd\") " pod="openstack/watcher-applier-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.824655 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11a7f60-4839-44aa-8615-98de657221f4-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"c11a7f60-4839-44aa-8615-98de657221f4\") " pod="openstack/watcher-api-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.824672 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jq8p\" (UniqueName: \"kubernetes.io/projected/79f3ef20-3f3d-4fa2-8888-36d421303dfd-kube-api-access-4jq8p\") pod \"watcher-applier-0\" (UID: \"79f3ef20-3f3d-4fa2-8888-36d421303dfd\") " pod="openstack/watcher-applier-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.824713 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5kzq\" (UniqueName: \"kubernetes.io/projected/c11a7f60-4839-44aa-8615-98de657221f4-kube-api-access-h5kzq\") pod \"watcher-api-0\" (UID: \"c11a7f60-4839-44aa-8615-98de657221f4\") " pod="openstack/watcher-api-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.824736 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c11a7f60-4839-44aa-8615-98de657221f4-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"c11a7f60-4839-44aa-8615-98de657221f4\") " pod="openstack/watcher-api-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.830842 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f3ef20-3f3d-4fa2-8888-36d421303dfd-config-data\") pod \"watcher-applier-0\" (UID: \"79f3ef20-3f3d-4fa2-8888-36d421303dfd\") " pod="openstack/watcher-applier-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.831154 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c11a7f60-4839-44aa-8615-98de657221f4-logs\") pod \"watcher-api-0\" (UID: \"c11a7f60-4839-44aa-8615-98de657221f4\") " pod="openstack/watcher-api-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.838288 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f3ef20-3f3d-4fa2-8888-36d421303dfd-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"79f3ef20-3f3d-4fa2-8888-36d421303dfd\") " pod="openstack/watcher-applier-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.844761 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c11a7f60-4839-44aa-8615-98de657221f4-public-tls-certs\") pod \"watcher-api-0\" (UID: \"c11a7f60-4839-44aa-8615-98de657221f4\") " pod="openstack/watcher-api-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.845501 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c11a7f60-4839-44aa-8615-98de657221f4-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"c11a7f60-4839-44aa-8615-98de657221f4\") " pod="openstack/watcher-api-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.845566 
4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79f3ef20-3f3d-4fa2-8888-36d421303dfd-logs\") pod \"watcher-applier-0\" (UID: \"79f3ef20-3f3d-4fa2-8888-36d421303dfd\") " pod="openstack/watcher-applier-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.849018 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11a7f60-4839-44aa-8615-98de657221f4-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"c11a7f60-4839-44aa-8615-98de657221f4\") " pod="openstack/watcher-api-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.852928 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c11a7f60-4839-44aa-8615-98de657221f4-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"c11a7f60-4839-44aa-8615-98de657221f4\") " pod="openstack/watcher-api-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.857731 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11a7f60-4839-44aa-8615-98de657221f4-config-data\") pod \"watcher-api-0\" (UID: \"c11a7f60-4839-44aa-8615-98de657221f4\") " pod="openstack/watcher-api-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.861953 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5kzq\" (UniqueName: \"kubernetes.io/projected/c11a7f60-4839-44aa-8615-98de657221f4-kube-api-access-h5kzq\") pod \"watcher-api-0\" (UID: \"c11a7f60-4839-44aa-8615-98de657221f4\") " pod="openstack/watcher-api-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.870194 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jq8p\" (UniqueName: \"kubernetes.io/projected/79f3ef20-3f3d-4fa2-8888-36d421303dfd-kube-api-access-4jq8p\") pod \"watcher-applier-0\" (UID: \"79f3ef20-3f3d-4fa2-8888-36d421303dfd\") " pod="openstack/watcher-applier-0" Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.926047 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-config-data\") pod \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\" (UID: \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\") " Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.926108 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bwsx\" (UniqueName: \"kubernetes.io/projected/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-kube-api-access-2bwsx\") pod \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\" (UID: \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\") " Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.926232 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-combined-ca-bundle\") pod \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\" (UID: \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\") " Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.926260 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-logs\") pod \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\" (UID: \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\") " Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.926278 4810 
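The run above is the kubelet volume reconciler walking each declared volume of the new watcher-api-0 and watcher-applier-0 pods through three stages: VerifyControllerAttachedVolume (reconciler_common.go:245), MountVolume started (reconciler_common.go:218), and MountVolume.SetUp succeeded (operation_generator.go:637), after which the pod sandboxes can start. A minimal sketch for grouping the successful mounts by pod; the script, the kubelet.log filename, and the output format are illustrative assumptions, not part of this log:

import re
from collections import defaultdict

# "MountVolume.SetUp succeeded" entries carry the volume name (klog-escaped
# as \"...\") and end with the owning pod; collect volume names per pod.
SETUP = re.compile(r'MountVolume\.SetUp succeeded for volume \\"([^"\\]+)\\".*pod="([^"]+)"')

mounts = defaultdict(list)
with open("kubelet.log") as fh:  # assumed local copy of this log
    for line in fh:
        m = SETUP.search(line)
        if m:
            mounts[m.group(2)].append(m.group(1))

for pod, volumes in sorted(mounts.items()):
    print(f"{pod}: {len(volumes)} volumes: {', '.join(volumes)}")

Run over the entries above, this would report seven volumes for openstack/watcher-api-0 and four for openstack/watcher-applier-0.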
Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.926047 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-config-data\") pod \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\" (UID: \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\") "
Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.926108 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bwsx\" (UniqueName: \"kubernetes.io/projected/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-kube-api-access-2bwsx\") pod \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\" (UID: \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\") "
Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.926232 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-combined-ca-bundle\") pod \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\" (UID: \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\") "
Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.926260 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-logs\") pod \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\" (UID: \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\") "
Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.926278 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-custom-prometheus-ca\") pod \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\" (UID: \"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5\") "
Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.927287 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-logs" (OuterVolumeSpecName: "logs") pod "3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" (UID: "3eb2dccd-c5dc-436f-b7a6-954af7bc51c5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.929645 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-kube-api-access-2bwsx" (OuterVolumeSpecName: "kube-api-access-2bwsx") pod "3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" (UID: "3eb2dccd-c5dc-436f-b7a6-954af7bc51c5"). InnerVolumeSpecName "kube-api-access-2bwsx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.937508 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-da99-account-create-update-4j7hb"
Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.964875 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" (UID: "3eb2dccd-c5dc-436f-b7a6-954af7bc51c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:30:34 crc kubenswrapper[4810]: I0219 15:30:34.983527 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" (UID: "3eb2dccd-c5dc-436f-b7a6-954af7bc51c5"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.027358 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.028566 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.028588 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-logs\") on node \"crc\" DevicePath \"\""
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.028598 4810 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.028607 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bwsx\" (UniqueName: \"kubernetes.io/projected/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-kube-api-access-2bwsx\") on node \"crc\" DevicePath \"\""
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.041475 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-config-data" (OuterVolumeSpecName: "config-data") pod "3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" (UID: "3eb2dccd-c5dc-436f-b7a6-954af7bc51c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.041917 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.129508 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t982\" (UniqueName: \"kubernetes.io/projected/1f19eb06-d11c-409b-8b7e-516c9a5db815-kube-api-access-9t982\") pod \"1f19eb06-d11c-409b-8b7e-516c9a5db815\" (UID: \"1f19eb06-d11c-409b-8b7e-516c9a5db815\") "
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.129680 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f19eb06-d11c-409b-8b7e-516c9a5db815-operator-scripts\") pod \"1f19eb06-d11c-409b-8b7e-516c9a5db815\" (UID: \"1f19eb06-d11c-409b-8b7e-516c9a5db815\") "
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.130143 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.130431 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f19eb06-d11c-409b-8b7e-516c9a5db815-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1f19eb06-d11c-409b-8b7e-516c9a5db815" (UID: "1f19eb06-d11c-409b-8b7e-516c9a5db815"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.133537 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f19eb06-d11c-409b-8b7e-516c9a5db815-kube-api-access-9t982" (OuterVolumeSpecName: "kube-api-access-9t982") pod "1f19eb06-d11c-409b-8b7e-516c9a5db815" (UID: "1f19eb06-d11c-409b-8b7e-516c9a5db815"). InnerVolumeSpecName "kube-api-access-9t982". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.231691 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t982\" (UniqueName: \"kubernetes.io/projected/1f19eb06-d11c-409b-8b7e-516c9a5db815-kube-api-access-9t982\") on node \"crc\" DevicePath \"\""
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.231738 4810 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f19eb06-d11c-409b-8b7e-516c9a5db815-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.404139 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.404169 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3eb2dccd-c5dc-436f-b7a6-954af7bc51c5","Type":"ContainerDied","Data":"3750609201e07e960ce122b5fe6baad963df212daffe611a1c8ba29e4bf01f7a"}
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.404554 4810 scope.go:117] "RemoveContainer" containerID="4e69a7ec2d6ffef7e8de5181b7cb6f418ce564ddea149f05d15804b56bd3283e"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.408014 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-da99-account-create-update-4j7hb" event={"ID":"1f19eb06-d11c-409b-8b7e-516c9a5db815","Type":"ContainerDied","Data":"5b4c6c0a8687bad9f0c8df95b6b32fefcb3920c41eb3bf6e882a708592401faf"}
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.408053 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b4c6c0a8687bad9f0c8df95b6b32fefcb3920c41eb3bf6e882a708592401faf"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.408090 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-da99-account-create-update-4j7hb"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.456462 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ea855ba-523c-4143-8fe8-b0b1150299d0" path="/var/lib/kubelet/pods/2ea855ba-523c-4143-8fe8-b0b1150299d0/volumes"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.457063 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c772672c-c983-42e8-ae77-bfc8484ad555" path="/var/lib/kubelet/pods/c772672c-c983-42e8-ae77-bfc8484ad555/volumes"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.457599 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"]
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.490414 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"]
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.504961 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"]
Feb 19 15:30:35 crc kubenswrapper[4810]: E0219 15:30:35.505475 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" containerName="watcher-decision-engine"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.505511 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" containerName="watcher-decision-engine"
Feb 19 15:30:35 crc kubenswrapper[4810]: E0219 15:30:35.505528 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f19eb06-d11c-409b-8b7e-516c9a5db815" containerName="mariadb-account-create-update"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.505535 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f19eb06-d11c-409b-8b7e-516c9a5db815" containerName="mariadb-account-create-update"
Feb 19 15:30:35 crc kubenswrapper[4810]: E0219 15:30:35.505557 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" containerName="watcher-decision-engine"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.505565 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" containerName="watcher-decision-engine"
Feb 19 15:30:35 crc kubenswrapper[4810]: E0219 15:30:35.505597 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" containerName="watcher-decision-engine"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.505605 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" containerName="watcher-decision-engine"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.505833 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" containerName="watcher-decision-engine"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.505861 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" containerName="watcher-decision-engine"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.505874 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f19eb06-d11c-409b-8b7e-516c9a5db815" containerName="mariadb-account-create-update"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.506846 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
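Here watcher-decision-engine-0 is replaced wholesale: SyncLoop DELETE and REMOVE retire the old pod object, SyncLoop ADD brings in its successor under a new UID, and the CPU and memory managers purge the per-container state left behind by UID 3eb2dccd-c5dc-436f-b7a6-954af7bc51c5 (cpu_manager.go:410, state_mem.go:107, memory_manager.go:354). A sketch for spotting such recreations by pairing each pod name with every UID its PLEG events carry; the script and filename are illustrative assumptions:

import re
from collections import defaultdict

# PLEG event entries look like: pod="ns/name" event={"ID":"<uid>",...};
# a pod name that accumulates more than one UID was deleted and recreated.
EVENT = re.compile(r'pod="([^"]+)" event=\{"ID":"([0-9a-f-]+)"')

uids = defaultdict(set)
with open("kubelet.log") as fh:  # assumed local copy of this log
    for line in fh:
        for pod, uid in EVENT.findall(line):
            uids[pod].add(uid)

for pod, ids in sorted(uids.items()):
    if len(ids) > 1:
        print(f"{pod} recreated: {sorted(ids)}")

Within this window it would flag openstack/watcher-decision-engine-0, whose events carry both 3eb2dccd-c5dc-436f-b7a6-954af7bc51c5 and, a few entries below, 31448de0-cbd5-4d71-8107-881c0327fb55.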
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.510245 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.514995 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"]
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.541489 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"]
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.589926 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.639110 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31448de0-cbd5-4d71-8107-881c0327fb55-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"31448de0-cbd5-4d71-8107-881c0327fb55\") " pod="openstack/watcher-decision-engine-0"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.639156 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/31448de0-cbd5-4d71-8107-881c0327fb55-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"31448de0-cbd5-4d71-8107-881c0327fb55\") " pod="openstack/watcher-decision-engine-0"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.639200 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31448de0-cbd5-4d71-8107-881c0327fb55-logs\") pod \"watcher-decision-engine-0\" (UID: \"31448de0-cbd5-4d71-8107-881c0327fb55\") " pod="openstack/watcher-decision-engine-0"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.639237 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnrtj\" (UniqueName: \"kubernetes.io/projected/31448de0-cbd5-4d71-8107-881c0327fb55-kube-api-access-qnrtj\") pod \"watcher-decision-engine-0\" (UID: \"31448de0-cbd5-4d71-8107-881c0327fb55\") " pod="openstack/watcher-decision-engine-0"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.639489 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31448de0-cbd5-4d71-8107-881c0327fb55-config-data\") pod \"watcher-decision-engine-0\" (UID: \"31448de0-cbd5-4d71-8107-881c0327fb55\") " pod="openstack/watcher-decision-engine-0"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.741614 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31448de0-cbd5-4d71-8107-881c0327fb55-config-data\") pod \"watcher-decision-engine-0\" (UID: \"31448de0-cbd5-4d71-8107-881c0327fb55\") " pod="openstack/watcher-decision-engine-0"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.742516 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31448de0-cbd5-4d71-8107-881c0327fb55-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"31448de0-cbd5-4d71-8107-881c0327fb55\") " pod="openstack/watcher-decision-engine-0"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.742679 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/31448de0-cbd5-4d71-8107-881c0327fb55-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"31448de0-cbd5-4d71-8107-881c0327fb55\") " pod="openstack/watcher-decision-engine-0"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.742806 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31448de0-cbd5-4d71-8107-881c0327fb55-logs\") pod \"watcher-decision-engine-0\" (UID: \"31448de0-cbd5-4d71-8107-881c0327fb55\") " pod="openstack/watcher-decision-engine-0"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.742919 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnrtj\" (UniqueName: \"kubernetes.io/projected/31448de0-cbd5-4d71-8107-881c0327fb55-kube-api-access-qnrtj\") pod \"watcher-decision-engine-0\" (UID: \"31448de0-cbd5-4d71-8107-881c0327fb55\") " pod="openstack/watcher-decision-engine-0"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.743318 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31448de0-cbd5-4d71-8107-881c0327fb55-logs\") pod \"watcher-decision-engine-0\" (UID: \"31448de0-cbd5-4d71-8107-881c0327fb55\") " pod="openstack/watcher-decision-engine-0"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.745667 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/31448de0-cbd5-4d71-8107-881c0327fb55-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"31448de0-cbd5-4d71-8107-881c0327fb55\") " pod="openstack/watcher-decision-engine-0"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.746605 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31448de0-cbd5-4d71-8107-881c0327fb55-config-data\") pod \"watcher-decision-engine-0\" (UID: \"31448de0-cbd5-4d71-8107-881c0327fb55\") " pod="openstack/watcher-decision-engine-0"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.748866 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31448de0-cbd5-4d71-8107-881c0327fb55-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"31448de0-cbd5-4d71-8107-881c0327fb55\") " pod="openstack/watcher-decision-engine-0"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.761398 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnrtj\" (UniqueName: \"kubernetes.io/projected/31448de0-cbd5-4d71-8107-881c0327fb55-kube-api-access-qnrtj\") pod \"watcher-decision-engine-0\" (UID: \"31448de0-cbd5-4d71-8107-881c0327fb55\") " pod="openstack/watcher-decision-engine-0"
Feb 19 15:30:35 crc kubenswrapper[4810]: I0219 15:30:35.827000 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Feb 19 15:30:36 crc kubenswrapper[4810]: I0219 15:30:36.308133 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"]
Feb 19 15:30:36 crc kubenswrapper[4810]: I0219 15:30:36.430007 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"79f3ef20-3f3d-4fa2-8888-36d421303dfd","Type":"ContainerStarted","Data":"89b2ac85fe989b6715854d867a0b9e4dbc7254cda52ee4efe559c2eceddac7c9"}
Feb 19 15:30:36 crc kubenswrapper[4810]: I0219 15:30:36.430063 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"79f3ef20-3f3d-4fa2-8888-36d421303dfd","Type":"ContainerStarted","Data":"fbdf32c335aa6dc04fc51b2cdbf0bfd8d8d90e8499fd480593d6ebb30114ca65"}
Feb 19 15:30:36 crc kubenswrapper[4810]: I0219 15:30:36.433296 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"31448de0-cbd5-4d71-8107-881c0327fb55","Type":"ContainerStarted","Data":"8b0e5903dccd9d2e5e23e46710ab47f5e2c1b9dea5ccf431992056feeec7f78e"}
Feb 19 15:30:36 crc kubenswrapper[4810]: I0219 15:30:36.439142 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c11a7f60-4839-44aa-8615-98de657221f4","Type":"ContainerStarted","Data":"a74fdb1f295bf98e0b6a7c00cef5e765a9ec2b3b3bb83e3346c4f8f459e1c631"}
Feb 19 15:30:36 crc kubenswrapper[4810]: I0219 15:30:36.439182 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c11a7f60-4839-44aa-8615-98de657221f4","Type":"ContainerStarted","Data":"be01de1818de444acfe4af8cd81669a085babf9fa9bb7f4d2487532346267f24"}
Feb 19 15:30:36 crc kubenswrapper[4810]: I0219 15:30:36.439191 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c11a7f60-4839-44aa-8615-98de657221f4","Type":"ContainerStarted","Data":"92ad3e8c6d58735d2e5a96f379c49b8251db2cb796138e8fb3d05573e9bc6628"}
Feb 19 15:30:36 crc kubenswrapper[4810]: I0219 15:30:36.439442 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Feb 19 15:30:36 crc kubenswrapper[4810]: I0219 15:30:36.454246 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=2.454231083 podStartE2EDuration="2.454231083s" podCreationTimestamp="2026-02-19 15:30:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:30:36.449464336 +0000 UTC m=+1265.931494460" watchObservedRunningTime="2026-02-19 15:30:36.454231083 +0000 UTC m=+1265.936261207"
Feb 19 15:30:36 crc kubenswrapper[4810]: I0219 15:30:36.470557 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=2.470538213 podStartE2EDuration="2.470538213s" podCreationTimestamp="2026-02-19 15:30:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:30:36.469754944 +0000 UTC m=+1265.951785078" watchObservedRunningTime="2026-02-19 15:30:36.470538213 +0000 UTC m=+1265.952568337"
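The pod_startup_latency_tracker entries quantify the turnaround: for both pods the reported podStartSLOduration is exactly watchObservedRunningTime minus podCreationTimestamp, and the zero-valued pulling timestamps indicate no image pull was needed. A quick check of the watcher-applier-0 figure, truncating the log's nanoseconds to the microseconds Python's datetime carries:

from datetime import datetime, timezone

created = datetime(2026, 2, 19, 15, 30, 34, tzinfo=timezone.utc)           # podCreationTimestamp
watched = datetime(2026, 2, 19, 15, 30, 36, 454231, tzinfo=timezone.utc)   # watchObservedRunningTime, 15:30:36.454231083
print((watched - created).total_seconds())  # 2.454231, matching podStartSLOduration=2.454231083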
path="/var/lib/kubelet/pods/3eb2dccd-c5dc-436f-b7a6-954af7bc51c5/volumes" Feb 19 15:30:37 crc kubenswrapper[4810]: I0219 15:30:37.455764 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"31448de0-cbd5-4d71-8107-881c0327fb55","Type":"ContainerStarted","Data":"64d0d64a5da470f2a5ddc1de4e8c04b14ee710ccd412090965ee7e074e161cc8"} Feb 19 15:30:37 crc kubenswrapper[4810]: I0219 15:30:37.467923 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.467905401 podStartE2EDuration="2.467905401s" podCreationTimestamp="2026-02-19 15:30:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:30:37.467795758 +0000 UTC m=+1266.949825872" watchObservedRunningTime="2026-02-19 15:30:37.467905401 +0000 UTC m=+1266.949935535" Feb 19 15:30:38 crc kubenswrapper[4810]: I0219 15:30:38.869124 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Feb 19 15:30:39 crc kubenswrapper[4810]: I0219 15:30:39.491892 4810 generic.go:334] "Generic (PLEG): container finished" podID="e03c99b3-d5d6-479a-9b45-045bda62be1e" containerID="922169bd474755c022d1869b56b5508937a05f190fe4b150d2505095771f9d93" exitCode=0 Feb 19 15:30:39 crc kubenswrapper[4810]: I0219 15:30:39.491937 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e03c99b3-d5d6-479a-9b45-045bda62be1e","Type":"ContainerDied","Data":"922169bd474755c022d1869b56b5508937a05f190fe4b150d2505095771f9d93"} Feb 19 15:30:39 crc kubenswrapper[4810]: I0219 15:30:39.617100 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 15:30:39 crc kubenswrapper[4810]: I0219 15:30:39.740581 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vpjk\" (UniqueName: \"kubernetes.io/projected/e03c99b3-d5d6-479a-9b45-045bda62be1e-kube-api-access-8vpjk\") pod \"e03c99b3-d5d6-479a-9b45-045bda62be1e\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " Feb 19 15:30:39 crc kubenswrapper[4810]: I0219 15:30:39.741063 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e03c99b3-d5d6-479a-9b45-045bda62be1e-combined-ca-bundle\") pod \"e03c99b3-d5d6-479a-9b45-045bda62be1e\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " Feb 19 15:30:39 crc kubenswrapper[4810]: I0219 15:30:39.741141 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e03c99b3-d5d6-479a-9b45-045bda62be1e-sg-core-conf-yaml\") pod \"e03c99b3-d5d6-479a-9b45-045bda62be1e\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " Feb 19 15:30:39 crc kubenswrapper[4810]: I0219 15:30:39.741217 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e03c99b3-d5d6-479a-9b45-045bda62be1e-run-httpd\") pod \"e03c99b3-d5d6-479a-9b45-045bda62be1e\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " Feb 19 15:30:39 crc kubenswrapper[4810]: I0219 15:30:39.741353 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e03c99b3-d5d6-479a-9b45-045bda62be1e-config-data\") pod \"e03c99b3-d5d6-479a-9b45-045bda62be1e\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " Feb 19 15:30:39 crc kubenswrapper[4810]: I0219 15:30:39.741417 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e03c99b3-d5d6-479a-9b45-045bda62be1e-scripts\") pod \"e03c99b3-d5d6-479a-9b45-045bda62be1e\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " Feb 19 15:30:39 crc kubenswrapper[4810]: I0219 15:30:39.741500 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e03c99b3-d5d6-479a-9b45-045bda62be1e-log-httpd\") pod \"e03c99b3-d5d6-479a-9b45-045bda62be1e\" (UID: \"e03c99b3-d5d6-479a-9b45-045bda62be1e\") " Feb 19 15:30:39 crc kubenswrapper[4810]: I0219 15:30:39.742380 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e03c99b3-d5d6-479a-9b45-045bda62be1e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e03c99b3-d5d6-479a-9b45-045bda62be1e" (UID: "e03c99b3-d5d6-479a-9b45-045bda62be1e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:30:39 crc kubenswrapper[4810]: I0219 15:30:39.742958 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e03c99b3-d5d6-479a-9b45-045bda62be1e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e03c99b3-d5d6-479a-9b45-045bda62be1e" (UID: "e03c99b3-d5d6-479a-9b45-045bda62be1e"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:30:39 crc kubenswrapper[4810]: I0219 15:30:39.760064 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e03c99b3-d5d6-479a-9b45-045bda62be1e-kube-api-access-8vpjk" (OuterVolumeSpecName: "kube-api-access-8vpjk") pod "e03c99b3-d5d6-479a-9b45-045bda62be1e" (UID: "e03c99b3-d5d6-479a-9b45-045bda62be1e"). InnerVolumeSpecName "kube-api-access-8vpjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:30:39 crc kubenswrapper[4810]: I0219 15:30:39.764683 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e03c99b3-d5d6-479a-9b45-045bda62be1e-scripts" (OuterVolumeSpecName: "scripts") pod "e03c99b3-d5d6-479a-9b45-045bda62be1e" (UID: "e03c99b3-d5d6-479a-9b45-045bda62be1e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:39 crc kubenswrapper[4810]: I0219 15:30:39.781573 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e03c99b3-d5d6-479a-9b45-045bda62be1e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e03c99b3-d5d6-479a-9b45-045bda62be1e" (UID: "e03c99b3-d5d6-479a-9b45-045bda62be1e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:39 crc kubenswrapper[4810]: I0219 15:30:39.838099 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e03c99b3-d5d6-479a-9b45-045bda62be1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e03c99b3-d5d6-479a-9b45-045bda62be1e" (UID: "e03c99b3-d5d6-479a-9b45-045bda62be1e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:39 crc kubenswrapper[4810]: I0219 15:30:39.843524 4810 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e03c99b3-d5d6-479a-9b45-045bda62be1e-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:39 crc kubenswrapper[4810]: I0219 15:30:39.843790 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vpjk\" (UniqueName: \"kubernetes.io/projected/e03c99b3-d5d6-479a-9b45-045bda62be1e-kube-api-access-8vpjk\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:39 crc kubenswrapper[4810]: I0219 15:30:39.843867 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e03c99b3-d5d6-479a-9b45-045bda62be1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:39 crc kubenswrapper[4810]: I0219 15:30:39.843924 4810 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e03c99b3-d5d6-479a-9b45-045bda62be1e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:39 crc kubenswrapper[4810]: I0219 15:30:39.843986 4810 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e03c99b3-d5d6-479a-9b45-045bda62be1e-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:39 crc kubenswrapper[4810]: I0219 15:30:39.844042 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e03c99b3-d5d6-479a-9b45-045bda62be1e-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:39 crc kubenswrapper[4810]: I0219 15:30:39.875622 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/e03c99b3-d5d6-479a-9b45-045bda62be1e-config-data" (OuterVolumeSpecName: "config-data") pod "e03c99b3-d5d6-479a-9b45-045bda62be1e" (UID: "e03c99b3-d5d6-479a-9b45-045bda62be1e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:39 crc kubenswrapper[4810]: I0219 15:30:39.946105 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e03c99b3-d5d6-479a-9b45-045bda62be1e-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.028582 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.042351 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.317449 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-58xq9"] Feb 19 15:30:40 crc kubenswrapper[4810]: E0219 15:30:40.318007 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" containerName="watcher-decision-engine" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.318084 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" containerName="watcher-decision-engine" Feb 19 15:30:40 crc kubenswrapper[4810]: E0219 15:30:40.318144 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03c99b3-d5d6-479a-9b45-045bda62be1e" containerName="proxy-httpd" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.318193 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03c99b3-d5d6-479a-9b45-045bda62be1e" containerName="proxy-httpd" Feb 19 15:30:40 crc kubenswrapper[4810]: E0219 15:30:40.318253 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03c99b3-d5d6-479a-9b45-045bda62be1e" containerName="ceilometer-central-agent" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.318311 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03c99b3-d5d6-479a-9b45-045bda62be1e" containerName="ceilometer-central-agent" Feb 19 15:30:40 crc kubenswrapper[4810]: E0219 15:30:40.318399 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03c99b3-d5d6-479a-9b45-045bda62be1e" containerName="ceilometer-notification-agent" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.318656 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03c99b3-d5d6-479a-9b45-045bda62be1e" containerName="ceilometer-notification-agent" Feb 19 15:30:40 crc kubenswrapper[4810]: E0219 15:30:40.318760 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03c99b3-d5d6-479a-9b45-045bda62be1e" containerName="sg-core" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.318812 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03c99b3-d5d6-479a-9b45-045bda62be1e" containerName="sg-core" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.319054 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" containerName="watcher-decision-engine" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.319123 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e03c99b3-d5d6-479a-9b45-045bda62be1e" containerName="ceilometer-central-agent" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 
15:30:40.319182 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e03c99b3-d5d6-479a-9b45-045bda62be1e" containerName="proxy-httpd" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.319233 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e03c99b3-d5d6-479a-9b45-045bda62be1e" containerName="sg-core" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.319285 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eb2dccd-c5dc-436f-b7a6-954af7bc51c5" containerName="watcher-decision-engine" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.319363 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e03c99b3-d5d6-479a-9b45-045bda62be1e" containerName="ceilometer-notification-agent" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.320006 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-58xq9" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.326272 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.326439 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.329140 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-nqhd6" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.333449 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-58xq9"] Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.454695 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972d6f5e-3edf-4b6e-bdde-39c580caea31-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-58xq9\" (UID: \"972d6f5e-3edf-4b6e-bdde-39c580caea31\") " pod="openstack/nova-cell0-conductor-db-sync-58xq9" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.454810 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/972d6f5e-3edf-4b6e-bdde-39c580caea31-config-data\") pod \"nova-cell0-conductor-db-sync-58xq9\" (UID: \"972d6f5e-3edf-4b6e-bdde-39c580caea31\") " pod="openstack/nova-cell0-conductor-db-sync-58xq9" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.454983 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/972d6f5e-3edf-4b6e-bdde-39c580caea31-scripts\") pod \"nova-cell0-conductor-db-sync-58xq9\" (UID: \"972d6f5e-3edf-4b6e-bdde-39c580caea31\") " pod="openstack/nova-cell0-conductor-db-sync-58xq9" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.455113 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75crr\" (UniqueName: \"kubernetes.io/projected/972d6f5e-3edf-4b6e-bdde-39c580caea31-kube-api-access-75crr\") pod \"nova-cell0-conductor-db-sync-58xq9\" (UID: \"972d6f5e-3edf-4b6e-bdde-39c580caea31\") " pod="openstack/nova-cell0-conductor-db-sync-58xq9" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.503948 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e03c99b3-d5d6-479a-9b45-045bda62be1e","Type":"ContainerDied","Data":"28190f8c14a28d468a745076cbf90c346864c9199f363697f039ea4c27bb2c7f"} Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.503996 4810 scope.go:117] "RemoveContainer" containerID="d0173ca27d32542d93975f960f3c985b101c2043974bb22943e94cfa2c3990e5" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.504041 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.532777 4810 scope.go:117] "RemoveContainer" containerID="75f4bc2020206ce4f5c18d99d5247f762fe2e1e5470498d43115d0a1b1bc5184" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.535090 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.544642 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.554765 4810 scope.go:117] "RemoveContainer" containerID="2d5f4181bcf65b08449001da5a939c392bf7453a34f071c7f2e2b29a55dbc3c9" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.556419 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75crr\" (UniqueName: \"kubernetes.io/projected/972d6f5e-3edf-4b6e-bdde-39c580caea31-kube-api-access-75crr\") pod \"nova-cell0-conductor-db-sync-58xq9\" (UID: \"972d6f5e-3edf-4b6e-bdde-39c580caea31\") " pod="openstack/nova-cell0-conductor-db-sync-58xq9" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.556533 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972d6f5e-3edf-4b6e-bdde-39c580caea31-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-58xq9\" (UID: \"972d6f5e-3edf-4b6e-bdde-39c580caea31\") " pod="openstack/nova-cell0-conductor-db-sync-58xq9" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.556600 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/972d6f5e-3edf-4b6e-bdde-39c580caea31-config-data\") pod \"nova-cell0-conductor-db-sync-58xq9\" (UID: \"972d6f5e-3edf-4b6e-bdde-39c580caea31\") " pod="openstack/nova-cell0-conductor-db-sync-58xq9" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.556654 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/972d6f5e-3edf-4b6e-bdde-39c580caea31-scripts\") pod \"nova-cell0-conductor-db-sync-58xq9\" (UID: \"972d6f5e-3edf-4b6e-bdde-39c580caea31\") " pod="openstack/nova-cell0-conductor-db-sync-58xq9" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.561868 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/972d6f5e-3edf-4b6e-bdde-39c580caea31-config-data\") pod \"nova-cell0-conductor-db-sync-58xq9\" (UID: \"972d6f5e-3edf-4b6e-bdde-39c580caea31\") " pod="openstack/nova-cell0-conductor-db-sync-58xq9" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.565037 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972d6f5e-3edf-4b6e-bdde-39c580caea31-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-58xq9\" (UID: \"972d6f5e-3edf-4b6e-bdde-39c580caea31\") " pod="openstack/nova-cell0-conductor-db-sync-58xq9" Feb 19 15:30:40 crc 
kubenswrapper[4810]: I0219 15:30:40.589281 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75crr\" (UniqueName: \"kubernetes.io/projected/972d6f5e-3edf-4b6e-bdde-39c580caea31-kube-api-access-75crr\") pod \"nova-cell0-conductor-db-sync-58xq9\" (UID: \"972d6f5e-3edf-4b6e-bdde-39c580caea31\") " pod="openstack/nova-cell0-conductor-db-sync-58xq9" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.590391 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.593312 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.597119 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/972d6f5e-3edf-4b6e-bdde-39c580caea31-scripts\") pod \"nova-cell0-conductor-db-sync-58xq9\" (UID: \"972d6f5e-3edf-4b6e-bdde-39c580caea31\") " pod="openstack/nova-cell0-conductor-db-sync-58xq9" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.599096 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.599736 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.610753 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.652506 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-58xq9" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.706117 4810 scope.go:117] "RemoveContainer" containerID="922169bd474755c022d1869b56b5508937a05f190fe4b150d2505095771f9d93" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.762838 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxthg\" (UniqueName: \"kubernetes.io/projected/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-kube-api-access-jxthg\") pod \"ceilometer-0\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " pod="openstack/ceilometer-0" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.763303 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-log-httpd\") pod \"ceilometer-0\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " pod="openstack/ceilometer-0" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.763409 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-run-httpd\") pod \"ceilometer-0\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " pod="openstack/ceilometer-0" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.763525 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-scripts\") pod \"ceilometer-0\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " pod="openstack/ceilometer-0" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.763596 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-config-data\") pod \"ceilometer-0\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " pod="openstack/ceilometer-0" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.763732 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " pod="openstack/ceilometer-0" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.763840 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " pod="openstack/ceilometer-0" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.874098 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " pod="openstack/ceilometer-0" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.874161 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " pod="openstack/ceilometer-0" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.874223 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxthg\" (UniqueName: \"kubernetes.io/projected/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-kube-api-access-jxthg\") pod \"ceilometer-0\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " pod="openstack/ceilometer-0" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.874245 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-log-httpd\") pod \"ceilometer-0\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " pod="openstack/ceilometer-0" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.874272 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-run-httpd\") pod \"ceilometer-0\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " pod="openstack/ceilometer-0" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.874301 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-scripts\") pod \"ceilometer-0\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " pod="openstack/ceilometer-0" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.874315 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-config-data\") pod \"ceilometer-0\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " pod="openstack/ceilometer-0" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.879811 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-log-httpd\") pod \"ceilometer-0\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " pod="openstack/ceilometer-0" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.880098 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-run-httpd\") pod \"ceilometer-0\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " pod="openstack/ceilometer-0" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.881107 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " pod="openstack/ceilometer-0" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.886815 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " pod="openstack/ceilometer-0" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.895161 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-scripts\") pod \"ceilometer-0\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " pod="openstack/ceilometer-0" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.900104 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxthg\" (UniqueName: \"kubernetes.io/projected/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-kube-api-access-jxthg\") pod \"ceilometer-0\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " pod="openstack/ceilometer-0" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.960943 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-config-data\") pod \"ceilometer-0\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " pod="openstack/ceilometer-0" Feb 19 15:30:40 crc kubenswrapper[4810]: I0219 15:30:40.993642 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 15:30:41 crc kubenswrapper[4810]: I0219 15:30:41.361616 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-58xq9"] Feb 19 15:30:41 crc kubenswrapper[4810]: I0219 15:30:41.449766 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e03c99b3-d5d6-479a-9b45-045bda62be1e" path="/var/lib/kubelet/pods/e03c99b3-d5d6-479a-9b45-045bda62be1e/volumes" Feb 19 15:30:41 crc kubenswrapper[4810]: I0219 15:30:41.518532 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-58xq9" event={"ID":"972d6f5e-3edf-4b6e-bdde-39c580caea31","Type":"ContainerStarted","Data":"0c01e86e8bf881c49ad9ca06e89f46142a0a38c81d0b0f4cbfef925f99badde4"} Feb 19 15:30:41 crc kubenswrapper[4810]: I0219 15:30:41.557264 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:30:41 crc kubenswrapper[4810]: W0219 15:30:41.564907 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbe979b4_aa62_4c1f_8329_6eb2ae2cfa38.slice/crio-4ff3102af0db331be9ee6d83e5b6fe8b5c513c45a0e210d2e6a0f658eeddde5d WatchSource:0}: Error finding container 4ff3102af0db331be9ee6d83e5b6fe8b5c513c45a0e210d2e6a0f658eeddde5d: Status 404 returned error can't find the container with id 4ff3102af0db331be9ee6d83e5b6fe8b5c513c45a0e210d2e6a0f658eeddde5d Feb 19 15:30:42 crc kubenswrapper[4810]: I0219 15:30:42.558397 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38","Type":"ContainerStarted","Data":"fe9552106d30d58d9eba25443c982dcd127f8bde0f0223d82b19cb349c04ea05"} Feb 19 15:30:42 crc kubenswrapper[4810]: I0219 15:30:42.558809 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38","Type":"ContainerStarted","Data":"2271728f7abc2e8188f15e24a952eadeafff700ee538da3ec50e4feeabb89491"} Feb 19 15:30:42 crc kubenswrapper[4810]: I0219 15:30:42.558825 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38","Type":"ContainerStarted","Data":"4ff3102af0db331be9ee6d83e5b6fe8b5c513c45a0e210d2e6a0f658eeddde5d"} Feb 19 15:30:43 crc kubenswrapper[4810]: I0219 15:30:43.573548 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38","Type":"ContainerStarted","Data":"b5e72355bab674a636cae3f88e4f871dcc819f970a52aa01b409a42a1d43b5a1"} Feb 19 15:30:45 crc kubenswrapper[4810]: I0219 15:30:45.029518 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Feb 19 15:30:45 crc kubenswrapper[4810]: I0219 15:30:45.039444 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Feb 19 15:30:45 crc kubenswrapper[4810]: I0219 15:30:45.043388 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Feb 19 15:30:45 crc kubenswrapper[4810]: I0219 15:30:45.085227 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Feb 19 15:30:45 crc kubenswrapper[4810]: I0219 15:30:45.601207 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Feb 19 15:30:45 crc 
Feb 19 15:30:45 crc kubenswrapper[4810]: I0219 15:30:45.641188 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0"
Feb 19 15:30:45 crc kubenswrapper[4810]: I0219 15:30:45.827747 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Feb 19 15:30:45 crc kubenswrapper[4810]: I0219 15:30:45.860944 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0"
Feb 19 15:30:46 crc kubenswrapper[4810]: I0219 15:30:46.601129 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Feb 19 15:30:46 crc kubenswrapper[4810]: I0219 15:30:46.722642 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0"
Feb 19 15:30:46 crc kubenswrapper[4810]: I0219 15:30:46.771480 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"]
Feb 19 15:30:48 crc kubenswrapper[4810]: I0219 15:30:48.621669 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="31448de0-cbd5-4d71-8107-881c0327fb55" containerName="watcher-decision-engine" containerID="cri-o://64d0d64a5da470f2a5ddc1de4e8c04b14ee710ccd412090965ee7e074e161cc8" gracePeriod=30
Feb 19 15:30:49 crc kubenswrapper[4810]: I0219 15:30:49.652936 4810 generic.go:334] "Generic (PLEG): container finished" podID="31448de0-cbd5-4d71-8107-881c0327fb55" containerID="64d0d64a5da470f2a5ddc1de4e8c04b14ee710ccd412090965ee7e074e161cc8" exitCode=0
Feb 19 15:30:49 crc kubenswrapper[4810]: I0219 15:30:49.653006 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"31448de0-cbd5-4d71-8107-881c0327fb55","Type":"ContainerDied","Data":"64d0d64a5da470f2a5ddc1de4e8c04b14ee710ccd412090965ee7e074e161cc8"}
Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.588907 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnrtj\" (UniqueName: \"kubernetes.io/projected/31448de0-cbd5-4d71-8107-881c0327fb55-kube-api-access-qnrtj\") pod \"31448de0-cbd5-4d71-8107-881c0327fb55\" (UID: \"31448de0-cbd5-4d71-8107-881c0327fb55\") " Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.589283 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31448de0-cbd5-4d71-8107-881c0327fb55-logs\") pod \"31448de0-cbd5-4d71-8107-881c0327fb55\" (UID: \"31448de0-cbd5-4d71-8107-881c0327fb55\") " Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.589413 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31448de0-cbd5-4d71-8107-881c0327fb55-config-data\") pod \"31448de0-cbd5-4d71-8107-881c0327fb55\" (UID: \"31448de0-cbd5-4d71-8107-881c0327fb55\") " Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.589454 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/31448de0-cbd5-4d71-8107-881c0327fb55-custom-prometheus-ca\") pod \"31448de0-cbd5-4d71-8107-881c0327fb55\" (UID: \"31448de0-cbd5-4d71-8107-881c0327fb55\") " Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.589564 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31448de0-cbd5-4d71-8107-881c0327fb55-combined-ca-bundle\") pod \"31448de0-cbd5-4d71-8107-881c0327fb55\" (UID: \"31448de0-cbd5-4d71-8107-881c0327fb55\") " Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.589809 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31448de0-cbd5-4d71-8107-881c0327fb55-logs" (OuterVolumeSpecName: "logs") pod "31448de0-cbd5-4d71-8107-881c0327fb55" (UID: "31448de0-cbd5-4d71-8107-881c0327fb55"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.590277 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31448de0-cbd5-4d71-8107-881c0327fb55-logs\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.601449 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31448de0-cbd5-4d71-8107-881c0327fb55-kube-api-access-qnrtj" (OuterVolumeSpecName: "kube-api-access-qnrtj") pod "31448de0-cbd5-4d71-8107-881c0327fb55" (UID: "31448de0-cbd5-4d71-8107-881c0327fb55"). InnerVolumeSpecName "kube-api-access-qnrtj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.618071 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31448de0-cbd5-4d71-8107-881c0327fb55-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "31448de0-cbd5-4d71-8107-881c0327fb55" (UID: "31448de0-cbd5-4d71-8107-881c0327fb55"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.630165 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31448de0-cbd5-4d71-8107-881c0327fb55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31448de0-cbd5-4d71-8107-881c0327fb55" (UID: "31448de0-cbd5-4d71-8107-881c0327fb55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.673802 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31448de0-cbd5-4d71-8107-881c0327fb55-config-data" (OuterVolumeSpecName: "config-data") pod "31448de0-cbd5-4d71-8107-881c0327fb55" (UID: "31448de0-cbd5-4d71-8107-881c0327fb55"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.683903 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"31448de0-cbd5-4d71-8107-881c0327fb55","Type":"ContainerDied","Data":"8b0e5903dccd9d2e5e23e46710ab47f5e2c1b9dea5ccf431992056feeec7f78e"} Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.683961 4810 scope.go:117] "RemoveContainer" containerID="64d0d64a5da470f2a5ddc1de4e8c04b14ee710ccd412090965ee7e074e161cc8" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.684098 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.691308 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnrtj\" (UniqueName: \"kubernetes.io/projected/31448de0-cbd5-4d71-8107-881c0327fb55-kube-api-access-qnrtj\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.691357 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31448de0-cbd5-4d71-8107-881c0327fb55-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.691369 4810 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/31448de0-cbd5-4d71-8107-881c0327fb55-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.691377 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31448de0-cbd5-4d71-8107-881c0327fb55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.745420 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.756696 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.769395 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 15:30:51 crc kubenswrapper[4810]: E0219 15:30:51.769828 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31448de0-cbd5-4d71-8107-881c0327fb55" containerName="watcher-decision-engine" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.769839 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="31448de0-cbd5-4d71-8107-881c0327fb55" 
containerName="watcher-decision-engine" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.770071 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="31448de0-cbd5-4d71-8107-881c0327fb55" containerName="watcher-decision-engine" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.770714 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.776386 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.779525 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.793085 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.793131 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.793164 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d-config-data\") pod \"watcher-decision-engine-0\" (UID: \"fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.793231 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d-logs\") pod \"watcher-decision-engine-0\" (UID: \"fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.793297 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jqm9\" (UniqueName: \"kubernetes.io/projected/fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d-kube-api-access-5jqm9\") pod \"watcher-decision-engine-0\" (UID: \"fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.894479 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jqm9\" (UniqueName: \"kubernetes.io/projected/fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d-kube-api-access-5jqm9\") pod \"watcher-decision-engine-0\" (UID: \"fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.894529 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d\") " 
pod="openstack/watcher-decision-engine-0" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.894555 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.894590 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d-config-data\") pod \"watcher-decision-engine-0\" (UID: \"fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.894657 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d-logs\") pod \"watcher-decision-engine-0\" (UID: \"fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.895156 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d-logs\") pod \"watcher-decision-engine-0\" (UID: \"fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.898965 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.899051 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.899456 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d-config-data\") pod \"watcher-decision-engine-0\" (UID: \"fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:30:51 crc kubenswrapper[4810]: I0219 15:30:51.916957 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jqm9\" (UniqueName: \"kubernetes.io/projected/fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d-kube-api-access-5jqm9\") pod \"watcher-decision-engine-0\" (UID: \"fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d\") " pod="openstack/watcher-decision-engine-0" Feb 19 15:30:52 crc kubenswrapper[4810]: I0219 15:30:52.090652 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 19 15:30:52 crc kubenswrapper[4810]: I0219 15:30:52.568857 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 15:30:52 crc kubenswrapper[4810]: W0219 15:30:52.577409 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbbd48c8_49fb_4e51_9ba7_f7b37f681b3d.slice/crio-d212b20209dbc57da6d250d4636ac4a952077766a0a1e7e5d41f8f5326424149 WatchSource:0}: Error finding container d212b20209dbc57da6d250d4636ac4a952077766a0a1e7e5d41f8f5326424149: Status 404 returned error can't find the container with id d212b20209dbc57da6d250d4636ac4a952077766a0a1e7e5d41f8f5326424149 Feb 19 15:30:52 crc kubenswrapper[4810]: I0219 15:30:52.694465 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38","Type":"ContainerStarted","Data":"7e16e7b6753a6364938105bbe0304a4ea0d0db7a6e7668e4cc2587c80cf11acf"} Feb 19 15:30:52 crc kubenswrapper[4810]: I0219 15:30:52.694642 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 15:30:52 crc kubenswrapper[4810]: I0219 15:30:52.697771 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-58xq9" event={"ID":"972d6f5e-3edf-4b6e-bdde-39c580caea31","Type":"ContainerStarted","Data":"19b609d4be47506e6c511dded32be9ffbc5fec785d73d8309ff072ff0f1cf61d"} Feb 19 15:30:52 crc kubenswrapper[4810]: I0219 15:30:52.700186 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d","Type":"ContainerStarted","Data":"d212b20209dbc57da6d250d4636ac4a952077766a0a1e7e5d41f8f5326424149"} Feb 19 15:30:52 crc kubenswrapper[4810]: I0219 15:30:52.759384 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.467905205 podStartE2EDuration="12.759367304s" podCreationTimestamp="2026-02-19 15:30:40 +0000 UTC" firstStartedPulling="2026-02-19 15:30:41.568020856 +0000 UTC m=+1271.050050980" lastFinishedPulling="2026-02-19 15:30:50.859482945 +0000 UTC m=+1280.341513079" observedRunningTime="2026-02-19 15:30:52.732401463 +0000 UTC m=+1282.214431587" watchObservedRunningTime="2026-02-19 15:30:52.759367304 +0000 UTC m=+1282.241397428" Feb 19 15:30:53 crc kubenswrapper[4810]: I0219 15:30:53.449364 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31448de0-cbd5-4d71-8107-881c0327fb55" path="/var/lib/kubelet/pods/31448de0-cbd5-4d71-8107-881c0327fb55/volumes" Feb 19 15:30:53 crc kubenswrapper[4810]: I0219 15:30:53.713676 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d","Type":"ContainerStarted","Data":"0de2c60fe202f0db8743c5bff46dd731a5b2adf31323c862bd094408a75fcc79"} Feb 19 15:30:53 crc kubenswrapper[4810]: I0219 15:30:53.740197 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-58xq9" podStartSLOduration=3.730302551 podStartE2EDuration="13.740177785s" podCreationTimestamp="2026-02-19 15:30:40 +0000 UTC" firstStartedPulling="2026-02-19 15:30:41.366358039 +0000 UTC m=+1270.848388163" lastFinishedPulling="2026-02-19 15:30:51.376233273 +0000 UTC m=+1280.858263397" observedRunningTime="2026-02-19 15:30:52.75673173 
+0000 UTC m=+1282.238761854" watchObservedRunningTime="2026-02-19 15:30:53.740177785 +0000 UTC m=+1283.222207929" Feb 19 15:30:53 crc kubenswrapper[4810]: I0219 15:30:53.742237 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.742229905 podStartE2EDuration="2.742229905s" podCreationTimestamp="2026-02-19 15:30:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:30:53.735291355 +0000 UTC m=+1283.217321479" watchObservedRunningTime="2026-02-19 15:30:53.742229905 +0000 UTC m=+1283.224260039" Feb 19 15:30:53 crc kubenswrapper[4810]: I0219 15:30:53.948683 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 15:30:53 crc kubenswrapper[4810]: I0219 15:30:53.949000 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="25d19ba7-dcfe-443b-9aec-afd1fc1bee1c" containerName="glance-log" containerID="cri-o://a5b582a70e74643764c5c634d13034b32fb543e1cf8d7471fc29e46f970a5375" gracePeriod=30 Feb 19 15:30:53 crc kubenswrapper[4810]: I0219 15:30:53.949084 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="25d19ba7-dcfe-443b-9aec-afd1fc1bee1c" containerName="glance-httpd" containerID="cri-o://d4f8797a868301d8a063b9ee6d65b2b3734521a53ee7809ebafe72723ce9d630" gracePeriod=30 Feb 19 15:30:54 crc kubenswrapper[4810]: I0219 15:30:54.725202 4810 generic.go:334] "Generic (PLEG): container finished" podID="25d19ba7-dcfe-443b-9aec-afd1fc1bee1c" containerID="a5b582a70e74643764c5c634d13034b32fb543e1cf8d7471fc29e46f970a5375" exitCode=143 Feb 19 15:30:54 crc kubenswrapper[4810]: I0219 15:30:54.725280 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c","Type":"ContainerDied","Data":"a5b582a70e74643764c5c634d13034b32fb543e1cf8d7471fc29e46f970a5375"} Feb 19 15:30:54 crc kubenswrapper[4810]: I0219 15:30:54.865352 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="25d19ba7-dcfe-443b-9aec-afd1fc1bee1c" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.170:9292/healthcheck\": read tcp 10.217.0.2:40794->10.217.0.170:9292: read: connection reset by peer" Feb 19 15:30:54 crc kubenswrapper[4810]: I0219 15:30:54.865429 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="25d19ba7-dcfe-443b-9aec-afd1fc1bee1c" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.170:9292/healthcheck\": read tcp 10.217.0.2:40804->10.217.0.170:9292: read: connection reset by peer" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.432610 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.475282 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-combined-ca-bundle\") pod \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.475398 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-scripts\") pod \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.475443 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-public-tls-certs\") pod \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.475563 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-httpd-run\") pod \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.475702 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h47wl\" (UniqueName: \"kubernetes.io/projected/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-kube-api-access-h47wl\") pod \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.475779 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-logs\") pod \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.475807 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-config-data\") pod \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.475890 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\" (UID: \"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c\") " Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.476298 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "25d19ba7-dcfe-443b-9aec-afd1fc1bee1c" (UID: "25d19ba7-dcfe-443b-9aec-afd1fc1bee1c"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.476613 4810 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.482373 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-kube-api-access-h47wl" (OuterVolumeSpecName: "kube-api-access-h47wl") pod "25d19ba7-dcfe-443b-9aec-afd1fc1bee1c" (UID: "25d19ba7-dcfe-443b-9aec-afd1fc1bee1c"). InnerVolumeSpecName "kube-api-access-h47wl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.482763 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-logs" (OuterVolumeSpecName: "logs") pod "25d19ba7-dcfe-443b-9aec-afd1fc1bee1c" (UID: "25d19ba7-dcfe-443b-9aec-afd1fc1bee1c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.487486 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-scripts" (OuterVolumeSpecName: "scripts") pod "25d19ba7-dcfe-443b-9aec-afd1fc1bee1c" (UID: "25d19ba7-dcfe-443b-9aec-afd1fc1bee1c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.489198 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "25d19ba7-dcfe-443b-9aec-afd1fc1bee1c" (UID: "25d19ba7-dcfe-443b-9aec-afd1fc1bee1c"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.518693 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25d19ba7-dcfe-443b-9aec-afd1fc1bee1c" (UID: "25d19ba7-dcfe-443b-9aec-afd1fc1bee1c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.545708 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-config-data" (OuterVolumeSpecName: "config-data") pod "25d19ba7-dcfe-443b-9aec-afd1fc1bee1c" (UID: "25d19ba7-dcfe-443b-9aec-afd1fc1bee1c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.569128 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "25d19ba7-dcfe-443b-9aec-afd1fc1bee1c" (UID: "25d19ba7-dcfe-443b-9aec-afd1fc1bee1c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.580645 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.584478 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.584843 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.584936 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.585016 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h47wl\" (UniqueName: \"kubernetes.io/projected/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-kube-api-access-h47wl\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.585097 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-logs\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.585185 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.614896 4810 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.686685 4810 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.738380 4810 generic.go:334] "Generic (PLEG): container finished" podID="25d19ba7-dcfe-443b-9aec-afd1fc1bee1c" containerID="d4f8797a868301d8a063b9ee6d65b2b3734521a53ee7809ebafe72723ce9d630" exitCode=0 Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.738449 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c","Type":"ContainerDied","Data":"d4f8797a868301d8a063b9ee6d65b2b3734521a53ee7809ebafe72723ce9d630"} Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.738483 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"25d19ba7-dcfe-443b-9aec-afd1fc1bee1c","Type":"ContainerDied","Data":"8e2ec136ee1702cfd45683995f2deb05321488c4c561ae75b5ecc3c327d09b7a"} Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.738504 4810 scope.go:117] "RemoveContainer" containerID="d4f8797a868301d8a063b9ee6d65b2b3734521a53ee7809ebafe72723ce9d630" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.738715 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.773710 4810 scope.go:117] "RemoveContainer" containerID="a5b582a70e74643764c5c634d13034b32fb543e1cf8d7471fc29e46f970a5375" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.794681 4810 scope.go:117] "RemoveContainer" containerID="d4f8797a868301d8a063b9ee6d65b2b3734521a53ee7809ebafe72723ce9d630" Feb 19 15:30:55 crc kubenswrapper[4810]: E0219 15:30:55.795744 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4f8797a868301d8a063b9ee6d65b2b3734521a53ee7809ebafe72723ce9d630\": container with ID starting with d4f8797a868301d8a063b9ee6d65b2b3734521a53ee7809ebafe72723ce9d630 not found: ID does not exist" containerID="d4f8797a868301d8a063b9ee6d65b2b3734521a53ee7809ebafe72723ce9d630" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.795797 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4f8797a868301d8a063b9ee6d65b2b3734521a53ee7809ebafe72723ce9d630"} err="failed to get container status \"d4f8797a868301d8a063b9ee6d65b2b3734521a53ee7809ebafe72723ce9d630\": rpc error: code = NotFound desc = could not find container \"d4f8797a868301d8a063b9ee6d65b2b3734521a53ee7809ebafe72723ce9d630\": container with ID starting with d4f8797a868301d8a063b9ee6d65b2b3734521a53ee7809ebafe72723ce9d630 not found: ID does not exist" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.795822 4810 scope.go:117] "RemoveContainer" containerID="a5b582a70e74643764c5c634d13034b32fb543e1cf8d7471fc29e46f970a5375" Feb 19 15:30:55 crc kubenswrapper[4810]: E0219 15:30:55.796175 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5b582a70e74643764c5c634d13034b32fb543e1cf8d7471fc29e46f970a5375\": container with ID starting with a5b582a70e74643764c5c634d13034b32fb543e1cf8d7471fc29e46f970a5375 not found: ID does not exist" containerID="a5b582a70e74643764c5c634d13034b32fb543e1cf8d7471fc29e46f970a5375" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.796207 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5b582a70e74643764c5c634d13034b32fb543e1cf8d7471fc29e46f970a5375"} err="failed to get container status \"a5b582a70e74643764c5c634d13034b32fb543e1cf8d7471fc29e46f970a5375\": rpc error: code = NotFound desc = could not find container \"a5b582a70e74643764c5c634d13034b32fb543e1cf8d7471fc29e46f970a5375\": container with ID starting with a5b582a70e74643764c5c634d13034b32fb543e1cf8d7471fc29e46f970a5375 not found: ID does not exist" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.848383 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.868070 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.878469 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 15:30:55 crc kubenswrapper[4810]: E0219 15:30:55.878998 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d19ba7-dcfe-443b-9aec-afd1fc1bee1c" containerName="glance-httpd" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.879026 4810 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="25d19ba7-dcfe-443b-9aec-afd1fc1bee1c" containerName="glance-httpd" Feb 19 15:30:55 crc kubenswrapper[4810]: E0219 15:30:55.879051 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d19ba7-dcfe-443b-9aec-afd1fc1bee1c" containerName="glance-log" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.879060 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d19ba7-dcfe-443b-9aec-afd1fc1bee1c" containerName="glance-log" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.879303 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d19ba7-dcfe-443b-9aec-afd1fc1bee1c" containerName="glance-log" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.879358 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d19ba7-dcfe-443b-9aec-afd1fc1bee1c" containerName="glance-httpd" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.880541 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.885899 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.885919 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.923626 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.923880 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="addf00fe-9b9b-41d4-bd81-4e5f2c339fff" containerName="glance-log" containerID="cri-o://1f3fdcf1ede0870a63fe45639c44983223bdd32c1b88c14ab338ab8b9a3f039c" gracePeriod=30 Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.924308 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="addf00fe-9b9b-41d4-bd81-4e5f2c339fff" containerName="glance-httpd" containerID="cri-o://598ccff405c1e1906ae06df4732ddadd78333232ea68a893ba2db6513c96c3dd" gracePeriod=30 Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.929548 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/glance-default-internal-api-0" podUID="addf00fe-9b9b-41d4-bd81-4e5f2c339fff" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.169:9292/healthcheck\": EOF" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.942541 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.991770 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41a4af93-6f80-4097-a964-2e3f3055fd3b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"41a4af93-6f80-4097-a964-2e3f3055fd3b\") " pod="openstack/glance-default-external-api-0" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.991825 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"41a4af93-6f80-4097-a964-2e3f3055fd3b\") " 
pod="openstack/glance-default-external-api-0" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.991874 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/41a4af93-6f80-4097-a964-2e3f3055fd3b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"41a4af93-6f80-4097-a964-2e3f3055fd3b\") " pod="openstack/glance-default-external-api-0" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.991896 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41a4af93-6f80-4097-a964-2e3f3055fd3b-scripts\") pod \"glance-default-external-api-0\" (UID: \"41a4af93-6f80-4097-a964-2e3f3055fd3b\") " pod="openstack/glance-default-external-api-0" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.991918 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thn9w\" (UniqueName: \"kubernetes.io/projected/41a4af93-6f80-4097-a964-2e3f3055fd3b-kube-api-access-thn9w\") pod \"glance-default-external-api-0\" (UID: \"41a4af93-6f80-4097-a964-2e3f3055fd3b\") " pod="openstack/glance-default-external-api-0" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.992276 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41a4af93-6f80-4097-a964-2e3f3055fd3b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"41a4af93-6f80-4097-a964-2e3f3055fd3b\") " pod="openstack/glance-default-external-api-0" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.992469 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41a4af93-6f80-4097-a964-2e3f3055fd3b-logs\") pod \"glance-default-external-api-0\" (UID: \"41a4af93-6f80-4097-a964-2e3f3055fd3b\") " pod="openstack/glance-default-external-api-0" Feb 19 15:30:55 crc kubenswrapper[4810]: I0219 15:30:55.992515 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41a4af93-6f80-4097-a964-2e3f3055fd3b-config-data\") pod \"glance-default-external-api-0\" (UID: \"41a4af93-6f80-4097-a964-2e3f3055fd3b\") " pod="openstack/glance-default-external-api-0" Feb 19 15:30:56 crc kubenswrapper[4810]: I0219 15:30:56.093596 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41a4af93-6f80-4097-a964-2e3f3055fd3b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"41a4af93-6f80-4097-a964-2e3f3055fd3b\") " pod="openstack/glance-default-external-api-0" Feb 19 15:30:56 crc kubenswrapper[4810]: I0219 15:30:56.093658 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41a4af93-6f80-4097-a964-2e3f3055fd3b-logs\") pod \"glance-default-external-api-0\" (UID: \"41a4af93-6f80-4097-a964-2e3f3055fd3b\") " pod="openstack/glance-default-external-api-0" Feb 19 15:30:56 crc kubenswrapper[4810]: I0219 15:30:56.093678 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41a4af93-6f80-4097-a964-2e3f3055fd3b-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"41a4af93-6f80-4097-a964-2e3f3055fd3b\") " pod="openstack/glance-default-external-api-0" Feb 19 15:30:56 crc kubenswrapper[4810]: I0219 15:30:56.093718 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41a4af93-6f80-4097-a964-2e3f3055fd3b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"41a4af93-6f80-4097-a964-2e3f3055fd3b\") " pod="openstack/glance-default-external-api-0" Feb 19 15:30:56 crc kubenswrapper[4810]: I0219 15:30:56.093737 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"41a4af93-6f80-4097-a964-2e3f3055fd3b\") " pod="openstack/glance-default-external-api-0" Feb 19 15:30:56 crc kubenswrapper[4810]: I0219 15:30:56.093771 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/41a4af93-6f80-4097-a964-2e3f3055fd3b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"41a4af93-6f80-4097-a964-2e3f3055fd3b\") " pod="openstack/glance-default-external-api-0" Feb 19 15:30:56 crc kubenswrapper[4810]: I0219 15:30:56.093790 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41a4af93-6f80-4097-a964-2e3f3055fd3b-scripts\") pod \"glance-default-external-api-0\" (UID: \"41a4af93-6f80-4097-a964-2e3f3055fd3b\") " pod="openstack/glance-default-external-api-0" Feb 19 15:30:56 crc kubenswrapper[4810]: I0219 15:30:56.093812 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thn9w\" (UniqueName: \"kubernetes.io/projected/41a4af93-6f80-4097-a964-2e3f3055fd3b-kube-api-access-thn9w\") pod \"glance-default-external-api-0\" (UID: \"41a4af93-6f80-4097-a964-2e3f3055fd3b\") " pod="openstack/glance-default-external-api-0" Feb 19 15:30:56 crc kubenswrapper[4810]: I0219 15:30:56.094983 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"41a4af93-6f80-4097-a964-2e3f3055fd3b\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Feb 19 15:30:56 crc kubenswrapper[4810]: I0219 15:30:56.095201 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/41a4af93-6f80-4097-a964-2e3f3055fd3b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"41a4af93-6f80-4097-a964-2e3f3055fd3b\") " pod="openstack/glance-default-external-api-0" Feb 19 15:30:56 crc kubenswrapper[4810]: I0219 15:30:56.095240 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41a4af93-6f80-4097-a964-2e3f3055fd3b-logs\") pod \"glance-default-external-api-0\" (UID: \"41a4af93-6f80-4097-a964-2e3f3055fd3b\") " pod="openstack/glance-default-external-api-0" Feb 19 15:30:56 crc kubenswrapper[4810]: I0219 15:30:56.099123 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41a4af93-6f80-4097-a964-2e3f3055fd3b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"41a4af93-6f80-4097-a964-2e3f3055fd3b\") " pod="openstack/glance-default-external-api-0" Feb 19 15:30:56 crc 
Feb 19 15:30:56 crc kubenswrapper[4810]: I0219 15:30:56.099162 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41a4af93-6f80-4097-a964-2e3f3055fd3b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"41a4af93-6f80-4097-a964-2e3f3055fd3b\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:30:56 crc kubenswrapper[4810]: I0219 15:30:56.103754 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41a4af93-6f80-4097-a964-2e3f3055fd3b-scripts\") pod \"glance-default-external-api-0\" (UID: \"41a4af93-6f80-4097-a964-2e3f3055fd3b\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:30:56 crc kubenswrapper[4810]: I0219 15:30:56.111781 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41a4af93-6f80-4097-a964-2e3f3055fd3b-config-data\") pod \"glance-default-external-api-0\" (UID: \"41a4af93-6f80-4097-a964-2e3f3055fd3b\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:30:56 crc kubenswrapper[4810]: I0219 15:30:56.115518 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thn9w\" (UniqueName: \"kubernetes.io/projected/41a4af93-6f80-4097-a964-2e3f3055fd3b-kube-api-access-thn9w\") pod \"glance-default-external-api-0\" (UID: \"41a4af93-6f80-4097-a964-2e3f3055fd3b\") " pod="openstack/glance-default-external-api-0"
Feb 19 15:30:56 crc kubenswrapper[4810]: I0219 15:30:56.131744 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"41a4af93-6f80-4097-a964-2e3f3055fd3b\") " pod="openstack/glance-default-external-api-0"
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 15:30:56 crc kubenswrapper[4810]: I0219 15:30:56.656258 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:30:56 crc kubenswrapper[4810]: I0219 15:30:56.656976 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" containerName="ceilometer-central-agent" containerID="cri-o://2271728f7abc2e8188f15e24a952eadeafff700ee538da3ec50e4feeabb89491" gracePeriod=30 Feb 19 15:30:56 crc kubenswrapper[4810]: I0219 15:30:56.657047 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" containerName="ceilometer-notification-agent" containerID="cri-o://fe9552106d30d58d9eba25443c982dcd127f8bde0f0223d82b19cb349c04ea05" gracePeriod=30 Feb 19 15:30:56 crc kubenswrapper[4810]: I0219 15:30:56.657063 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" containerName="proxy-httpd" containerID="cri-o://7e16e7b6753a6364938105bbe0304a4ea0d0db7a6e7668e4cc2587c80cf11acf" gracePeriod=30 Feb 19 15:30:56 crc kubenswrapper[4810]: I0219 15:30:56.657031 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" containerName="sg-core" containerID="cri-o://b5e72355bab674a636cae3f88e4f871dcc819f970a52aa01b409a42a1d43b5a1" gracePeriod=30 Feb 19 15:30:56 crc kubenswrapper[4810]: I0219 15:30:56.747819 4810 generic.go:334] "Generic (PLEG): container finished" podID="addf00fe-9b9b-41d4-bd81-4e5f2c339fff" containerID="1f3fdcf1ede0870a63fe45639c44983223bdd32c1b88c14ab338ab8b9a3f039c" exitCode=143 Feb 19 15:30:56 crc kubenswrapper[4810]: I0219 15:30:56.747883 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"addf00fe-9b9b-41d4-bd81-4e5f2c339fff","Type":"ContainerDied","Data":"1f3fdcf1ede0870a63fe45639c44983223bdd32c1b88c14ab338ab8b9a3f039c"} Feb 19 15:30:56 crc kubenswrapper[4810]: I0219 15:30:56.783508 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 15:30:56 crc kubenswrapper[4810]: W0219 15:30:56.790084 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41a4af93_6f80_4097_a964_2e3f3055fd3b.slice/crio-982653081c226b3e310d066fa8db4a3c8e86304fe9718438e4fe3b604b0ee8fe WatchSource:0}: Error finding container 982653081c226b3e310d066fa8db4a3c8e86304fe9718438e4fe3b604b0ee8fe: Status 404 returned error can't find the container with id 982653081c226b3e310d066fa8db4a3c8e86304fe9718438e4fe3b604b0ee8fe Feb 19 15:30:57 crc kubenswrapper[4810]: I0219 15:30:57.450264 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25d19ba7-dcfe-443b-9aec-afd1fc1bee1c" path="/var/lib/kubelet/pods/25d19ba7-dcfe-443b-9aec-afd1fc1bee1c/volumes" Feb 19 15:30:57 crc kubenswrapper[4810]: I0219 15:30:57.761054 4810 generic.go:334] "Generic (PLEG): container finished" podID="bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" containerID="7e16e7b6753a6364938105bbe0304a4ea0d0db7a6e7668e4cc2587c80cf11acf" exitCode=0 Feb 19 15:30:57 crc kubenswrapper[4810]: I0219 15:30:57.761293 4810 generic.go:334] "Generic (PLEG): container finished" 
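
The DELETE for openstack/ceilometer-0 arrives from the API, and the kubelet immediately starts killing all four containers with gracePeriod=30: each gets SIGTERM, and anything still running after the deadline is SIGKILLed. A sketch of issuing the same kind of deletion with client-go, again assuming default kubeconfig access:

// Delete a pod with an explicit grace period, the operation that produces
// the "SyncLoop DELETE" -> "Killing container with a grace period" entries.
package main

import (
	"context"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	grace := int64(30) // kubelet SIGTERMs each container, SIGKILLs after 30s
	if err := cs.CoreV1().Pods("openstack").Delete(context.TODO(), "ceilometer-0",
		metav1.DeleteOptions{GracePeriodSeconds: &grace}); err != nil {
		panic(err)
	}
}
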
podID="bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" containerID="b5e72355bab674a636cae3f88e4f871dcc819f970a52aa01b409a42a1d43b5a1" exitCode=2 Feb 19 15:30:57 crc kubenswrapper[4810]: I0219 15:30:57.761304 4810 generic.go:334] "Generic (PLEG): container finished" podID="bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" containerID="2271728f7abc2e8188f15e24a952eadeafff700ee538da3ec50e4feeabb89491" exitCode=0 Feb 19 15:30:57 crc kubenswrapper[4810]: I0219 15:30:57.761227 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38","Type":"ContainerDied","Data":"7e16e7b6753a6364938105bbe0304a4ea0d0db7a6e7668e4cc2587c80cf11acf"} Feb 19 15:30:57 crc kubenswrapper[4810]: I0219 15:30:57.761383 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38","Type":"ContainerDied","Data":"b5e72355bab674a636cae3f88e4f871dcc819f970a52aa01b409a42a1d43b5a1"} Feb 19 15:30:57 crc kubenswrapper[4810]: I0219 15:30:57.761397 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38","Type":"ContainerDied","Data":"2271728f7abc2e8188f15e24a952eadeafff700ee538da3ec50e4feeabb89491"} Feb 19 15:30:57 crc kubenswrapper[4810]: I0219 15:30:57.765062 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"41a4af93-6f80-4097-a964-2e3f3055fd3b","Type":"ContainerStarted","Data":"0e6adbe25af33318a0ec838bf869159bced4e70a9cd918d5db27ffb822e355cc"} Feb 19 15:30:57 crc kubenswrapper[4810]: I0219 15:30:57.765231 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"41a4af93-6f80-4097-a964-2e3f3055fd3b","Type":"ContainerStarted","Data":"982653081c226b3e310d066fa8db4a3c8e86304fe9718438e4fe3b604b0ee8fe"} Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.616608 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.643254 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-internal-tls-certs\") pod \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.643318 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.643356 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-httpd-run\") pod \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.643418 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgrng\" (UniqueName: \"kubernetes.io/projected/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-kube-api-access-tgrng\") pod \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.643487 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-scripts\") pod \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.643524 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-config-data\") pod \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.643555 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-logs\") pod \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.643569 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-combined-ca-bundle\") pod \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\" (UID: \"addf00fe-9b9b-41d4-bd81-4e5f2c339fff\") " Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.644504 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "addf00fe-9b9b-41d4-bd81-4e5f2c339fff" (UID: "addf00fe-9b9b-41d4-bd81-4e5f2c339fff"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.644598 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-logs" (OuterVolumeSpecName: "logs") pod "addf00fe-9b9b-41d4-bd81-4e5f2c339fff" (UID: "addf00fe-9b9b-41d4-bd81-4e5f2c339fff"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.660584 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "addf00fe-9b9b-41d4-bd81-4e5f2c339fff" (UID: "addf00fe-9b9b-41d4-bd81-4e5f2c339fff"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.667400 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-kube-api-access-tgrng" (OuterVolumeSpecName: "kube-api-access-tgrng") pod "addf00fe-9b9b-41d4-bd81-4e5f2c339fff" (UID: "addf00fe-9b9b-41d4-bd81-4e5f2c339fff"). InnerVolumeSpecName "kube-api-access-tgrng". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.698493 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-scripts" (OuterVolumeSpecName: "scripts") pod "addf00fe-9b9b-41d4-bd81-4e5f2c339fff" (UID: "addf00fe-9b9b-41d4-bd81-4e5f2c339fff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.740085 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "addf00fe-9b9b-41d4-bd81-4e5f2c339fff" (UID: "addf00fe-9b9b-41d4-bd81-4e5f2c339fff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.744351 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-config-data" (OuterVolumeSpecName: "config-data") pod "addf00fe-9b9b-41d4-bd81-4e5f2c339fff" (UID: "addf00fe-9b9b-41d4-bd81-4e5f2c339fff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.745784 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-logs\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.745815 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.748063 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.748194 4810 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.748213 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgrng\" (UniqueName: \"kubernetes.io/projected/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-kube-api-access-tgrng\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.748227 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.748267 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.783991 4810 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.786912 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"41a4af93-6f80-4097-a964-2e3f3055fd3b","Type":"ContainerStarted","Data":"06214b2a930e66c57e53806652bef4bf287da47703b5c4cea0142b8a75acc4b7"} Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.793036 4810 generic.go:334] "Generic (PLEG): container finished" podID="addf00fe-9b9b-41d4-bd81-4e5f2c339fff" containerID="598ccff405c1e1906ae06df4732ddadd78333232ea68a893ba2db6513c96c3dd" exitCode=0 Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.793076 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"addf00fe-9b9b-41d4-bd81-4e5f2c339fff","Type":"ContainerDied","Data":"598ccff405c1e1906ae06df4732ddadd78333232ea68a893ba2db6513c96c3dd"} Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.793103 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"addf00fe-9b9b-41d4-bd81-4e5f2c339fff","Type":"ContainerDied","Data":"58cd5599b992d54d64de99ed5546382a6f34cf94866c9af8e9254502abddbf03"} Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.793120 4810 scope.go:117] "RemoveContainer" containerID="598ccff405c1e1906ae06df4732ddadd78333232ea68a893ba2db6513c96c3dd" Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.793184 4810 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.801550 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "addf00fe-9b9b-41d4-bd81-4e5f2c339fff" (UID: "addf00fe-9b9b-41d4-bd81-4e5f2c339fff"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.826138 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.826120014 podStartE2EDuration="3.826120014s" podCreationTimestamp="2026-02-19 15:30:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:30:58.809777983 +0000 UTC m=+1288.291808097" watchObservedRunningTime="2026-02-19 15:30:58.826120014 +0000 UTC m=+1288.308150138" Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.850053 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/addf00fe-9b9b-41d4-bd81-4e5f2c339fff-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.850077 4810 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.898014 4810 scope.go:117] "RemoveContainer" containerID="1f3fdcf1ede0870a63fe45639c44983223bdd32c1b88c14ab338ab8b9a3f039c" Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.932412 4810 scope.go:117] "RemoveContainer" containerID="598ccff405c1e1906ae06df4732ddadd78333232ea68a893ba2db6513c96c3dd" Feb 19 15:30:58 crc kubenswrapper[4810]: E0219 15:30:58.932923 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"598ccff405c1e1906ae06df4732ddadd78333232ea68a893ba2db6513c96c3dd\": container with ID starting with 598ccff405c1e1906ae06df4732ddadd78333232ea68a893ba2db6513c96c3dd not found: ID does not exist" containerID="598ccff405c1e1906ae06df4732ddadd78333232ea68a893ba2db6513c96c3dd" Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.932965 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"598ccff405c1e1906ae06df4732ddadd78333232ea68a893ba2db6513c96c3dd"} err="failed to get container status \"598ccff405c1e1906ae06df4732ddadd78333232ea68a893ba2db6513c96c3dd\": rpc error: code = NotFound desc = could not find container \"598ccff405c1e1906ae06df4732ddadd78333232ea68a893ba2db6513c96c3dd\": container with ID starting with 598ccff405c1e1906ae06df4732ddadd78333232ea68a893ba2db6513c96c3dd not found: ID does not exist" Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.932991 4810 scope.go:117] "RemoveContainer" containerID="1f3fdcf1ede0870a63fe45639c44983223bdd32c1b88c14ab338ab8b9a3f039c" Feb 19 15:30:58 crc kubenswrapper[4810]: E0219 15:30:58.933345 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f3fdcf1ede0870a63fe45639c44983223bdd32c1b88c14ab338ab8b9a3f039c\": container with ID starting with 
1f3fdcf1ede0870a63fe45639c44983223bdd32c1b88c14ab338ab8b9a3f039c not found: ID does not exist" containerID="1f3fdcf1ede0870a63fe45639c44983223bdd32c1b88c14ab338ab8b9a3f039c" Feb 19 15:30:58 crc kubenswrapper[4810]: I0219 15:30:58.933375 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f3fdcf1ede0870a63fe45639c44983223bdd32c1b88c14ab338ab8b9a3f039c"} err="failed to get container status \"1f3fdcf1ede0870a63fe45639c44983223bdd32c1b88c14ab338ab8b9a3f039c\": rpc error: code = NotFound desc = could not find container \"1f3fdcf1ede0870a63fe45639c44983223bdd32c1b88c14ab338ab8b9a3f039c\": container with ID starting with 1f3fdcf1ede0870a63fe45639c44983223bdd32c1b88c14ab338ab8b9a3f039c not found: ID does not exist" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.141630 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.154229 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.169961 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 15:30:59 crc kubenswrapper[4810]: E0219 15:30:59.170444 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="addf00fe-9b9b-41d4-bd81-4e5f2c339fff" containerName="glance-log" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.170470 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="addf00fe-9b9b-41d4-bd81-4e5f2c339fff" containerName="glance-log" Feb 19 15:30:59 crc kubenswrapper[4810]: E0219 15:30:59.170507 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="addf00fe-9b9b-41d4-bd81-4e5f2c339fff" containerName="glance-httpd" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.170517 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="addf00fe-9b9b-41d4-bd81-4e5f2c339fff" containerName="glance-httpd" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.170716 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="addf00fe-9b9b-41d4-bd81-4e5f2c339fff" containerName="glance-log" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.170739 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="addf00fe-9b9b-41d4-bd81-4e5f2c339fff" containerName="glance-httpd" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.171923 4810 util.go:30] "No sandbox for pod can be found. 
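
The two ContainerStatus failures here are the benign race of deletion: the kubelet asks the runtime about a container it has just removed, CRI-O answers with gRPC NotFound, and the kubelet logs the error but treats the container (598ccff…, 1f3fdcf…) as already gone. A sketch of the idiomatic way to branch on that condition in Go, where err is any error returned from a gRPC call:

package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// alreadyGone treats a gRPC NotFound as "nothing left to delete",
// making container removal effectively idempotent.
func alreadyGone(err error) bool {
	return status.Code(err) == codes.NotFound
}

func main() {
	// Simulate the runtime's answer seen in the log.
	err := status.Error(codes.NotFound, "could not find container")
	if alreadyGone(err) {
		fmt.Println("container already removed; nothing to do")
	}
}
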
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.176489 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.177971 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.188510 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.257581 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.257632 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4-logs\") pod \"glance-default-internal-api-0\" (UID: \"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.257659 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.257691 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.257721 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.257757 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.257782 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.257811 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-sprcz\" (UniqueName: \"kubernetes.io/projected/6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4-kube-api-access-sprcz\") pod \"glance-default-internal-api-0\" (UID: \"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.384090 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.384178 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sprcz\" (UniqueName: \"kubernetes.io/projected/6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4-kube-api-access-sprcz\") pod \"glance-default-internal-api-0\" (UID: \"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.384292 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.384350 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4-logs\") pod \"glance-default-internal-api-0\" (UID: \"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.384389 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.384438 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.384490 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.384547 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.384781 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.385128 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4-logs\") pod \"glance-default-internal-api-0\" (UID: \"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.385176 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.389454 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.390056 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.390479 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.397636 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.418095 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sprcz\" (UniqueName: \"kubernetes.io/projected/6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4-kube-api-access-sprcz\") pod \"glance-default-internal-api-0\" (UID: \"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.422224 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4\") " pod="openstack/glance-default-internal-api-0" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.452716 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="addf00fe-9b9b-41d4-bd81-4e5f2c339fff" path="/var/lib/kubelet/pods/addf00fe-9b9b-41d4-bd81-4e5f2c339fff/volumes" Feb 19 15:30:59 crc kubenswrapper[4810]: I0219 15:30:59.490115 4810 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 15:31:00 crc kubenswrapper[4810]: I0219 15:31:00.020294 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 15:31:00 crc kubenswrapper[4810]: I0219 15:31:00.822932 4810 generic.go:334] "Generic (PLEG): container finished" podID="bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" containerID="fe9552106d30d58d9eba25443c982dcd127f8bde0f0223d82b19cb349c04ea05" exitCode=0 Feb 19 15:31:00 crc kubenswrapper[4810]: I0219 15:31:00.823413 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38","Type":"ContainerDied","Data":"fe9552106d30d58d9eba25443c982dcd127f8bde0f0223d82b19cb349c04ea05"} Feb 19 15:31:00 crc kubenswrapper[4810]: I0219 15:31:00.825733 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4","Type":"ContainerStarted","Data":"9a7a844afa569d17708a027299d20c3f8da45965cf7fc8489248263f5ff699a9"} Feb 19 15:31:00 crc kubenswrapper[4810]: I0219 15:31:00.825755 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4","Type":"ContainerStarted","Data":"030adf3e30e267b79173d4d483260554c095ee440f7945b2cccc696243dbe0bd"} Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.207760 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.323367 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-config-data\") pod \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.323645 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxthg\" (UniqueName: \"kubernetes.io/projected/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-kube-api-access-jxthg\") pod \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.323696 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-combined-ca-bundle\") pod \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.323801 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-run-httpd\") pod \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.323909 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-sg-core-conf-yaml\") pod \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.323930 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-scripts\") pod \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.323965 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-log-httpd\") pod \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\" (UID: \"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38\") " Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.324368 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" (UID: "bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.324767 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" (UID: "bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.334500 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-scripts" (OuterVolumeSpecName: "scripts") pod "bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" (UID: "bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.334653 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-kube-api-access-jxthg" (OuterVolumeSpecName: "kube-api-access-jxthg") pod "bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" (UID: "bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38"). InnerVolumeSpecName "kube-api-access-jxthg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.350823 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" (UID: "bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.411435 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" (UID: "bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.421061 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-config-data" (OuterVolumeSpecName: "config-data") pod "bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" (UID: "bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.426492 4810 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.426527 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.426540 4810 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.426552 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.426565 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxthg\" (UniqueName: \"kubernetes.io/projected/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-kube-api-access-jxthg\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.426579 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.426590 4810 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.837784 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4","Type":"ContainerStarted","Data":"8f482a6ed999680b2d59aa1c6bace5119495fcc6394430164430e6c470400848"} Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.841443 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38","Type":"ContainerDied","Data":"4ff3102af0db331be9ee6d83e5b6fe8b5c513c45a0e210d2e6a0f658eeddde5d"} Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.841500 4810 scope.go:117] "RemoveContainer" containerID="7e16e7b6753a6364938105bbe0304a4ea0d0db7a6e7668e4cc2587c80cf11acf" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.841683 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.861798 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.861781425 podStartE2EDuration="2.861781425s" podCreationTimestamp="2026-02-19 15:30:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:31:01.856138407 +0000 UTC m=+1291.338168531" watchObservedRunningTime="2026-02-19 15:31:01.861781425 +0000 UTC m=+1291.343811549" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.875701 4810 scope.go:117] "RemoveContainer" containerID="b5e72355bab674a636cae3f88e4f871dcc819f970a52aa01b409a42a1d43b5a1" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.877658 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.884905 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.899091 4810 scope.go:117] "RemoveContainer" containerID="fe9552106d30d58d9eba25443c982dcd127f8bde0f0223d82b19cb349c04ea05" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.906865 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:31:01 crc kubenswrapper[4810]: E0219 15:31:01.907410 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" containerName="ceilometer-central-agent" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.907432 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" containerName="ceilometer-central-agent" Feb 19 15:31:01 crc kubenswrapper[4810]: E0219 15:31:01.907459 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" containerName="ceilometer-notification-agent" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.907469 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" containerName="ceilometer-notification-agent" Feb 19 15:31:01 crc kubenswrapper[4810]: E0219 15:31:01.907490 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" containerName="proxy-httpd" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.907499 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" containerName="proxy-httpd" Feb 19 15:31:01 crc kubenswrapper[4810]: E0219 15:31:01.907513 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" containerName="sg-core" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.907521 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" containerName="sg-core" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.907765 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" containerName="proxy-httpd" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.907793 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" containerName="ceilometer-central-agent" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.907813 4810 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" containerName="ceilometer-notification-agent" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.907832 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" containerName="sg-core" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.911594 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.918758 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.918988 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.924180 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:31:01 crc kubenswrapper[4810]: I0219 15:31:01.931138 4810 scope.go:117] "RemoveContainer" containerID="2271728f7abc2e8188f15e24a952eadeafff700ee538da3ec50e4feeabb89491" Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.039043 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b358095c-30ba-4f90-b627-63650857fc49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " pod="openstack/ceilometer-0" Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.039377 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b358095c-30ba-4f90-b627-63650857fc49-log-httpd\") pod \"ceilometer-0\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " pod="openstack/ceilometer-0" Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.039426 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b358095c-30ba-4f90-b627-63650857fc49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " pod="openstack/ceilometer-0" Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.039486 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b358095c-30ba-4f90-b627-63650857fc49-run-httpd\") pod \"ceilometer-0\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " pod="openstack/ceilometer-0" Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.039573 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75hx8\" (UniqueName: \"kubernetes.io/projected/b358095c-30ba-4f90-b627-63650857fc49-kube-api-access-75hx8\") pod \"ceilometer-0\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " pod="openstack/ceilometer-0" Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.039649 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b358095c-30ba-4f90-b627-63650857fc49-config-data\") pod \"ceilometer-0\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " pod="openstack/ceilometer-0" Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.039666 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/b358095c-30ba-4f90-b627-63650857fc49-scripts\") pod \"ceilometer-0\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " pod="openstack/ceilometer-0" Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.091342 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.131969 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.143219 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b358095c-30ba-4f90-b627-63650857fc49-config-data\") pod \"ceilometer-0\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " pod="openstack/ceilometer-0" Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.143260 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b358095c-30ba-4f90-b627-63650857fc49-scripts\") pod \"ceilometer-0\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " pod="openstack/ceilometer-0" Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.143309 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b358095c-30ba-4f90-b627-63650857fc49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " pod="openstack/ceilometer-0" Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.143441 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b358095c-30ba-4f90-b627-63650857fc49-log-httpd\") pod \"ceilometer-0\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " pod="openstack/ceilometer-0" Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.143482 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b358095c-30ba-4f90-b627-63650857fc49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " pod="openstack/ceilometer-0" Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.143534 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b358095c-30ba-4f90-b627-63650857fc49-run-httpd\") pod \"ceilometer-0\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " pod="openstack/ceilometer-0" Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.143571 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75hx8\" (UniqueName: \"kubernetes.io/projected/b358095c-30ba-4f90-b627-63650857fc49-kube-api-access-75hx8\") pod \"ceilometer-0\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " pod="openstack/ceilometer-0" Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.144446 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b358095c-30ba-4f90-b627-63650857fc49-log-httpd\") pod \"ceilometer-0\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " pod="openstack/ceilometer-0" Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.144868 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b358095c-30ba-4f90-b627-63650857fc49-run-httpd\") pod \"ceilometer-0\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " pod="openstack/ceilometer-0" Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.151394 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b358095c-30ba-4f90-b627-63650857fc49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " pod="openstack/ceilometer-0" Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.152020 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b358095c-30ba-4f90-b627-63650857fc49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " pod="openstack/ceilometer-0" Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.163206 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b358095c-30ba-4f90-b627-63650857fc49-scripts\") pod \"ceilometer-0\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " pod="openstack/ceilometer-0" Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.165581 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75hx8\" (UniqueName: \"kubernetes.io/projected/b358095c-30ba-4f90-b627-63650857fc49-kube-api-access-75hx8\") pod \"ceilometer-0\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " pod="openstack/ceilometer-0" Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.168714 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b358095c-30ba-4f90-b627-63650857fc49-config-data\") pod \"ceilometer-0\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " pod="openstack/ceilometer-0" Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.233251 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.732090 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.741618 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.857639 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b358095c-30ba-4f90-b627-63650857fc49","Type":"ContainerStarted","Data":"6c4b9f3d928f0cccc0fb2b10a58ce849fff32f26291f366f0504f6cc2013df4e"} Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.858296 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Feb 19 15:31:02 crc kubenswrapper[4810]: I0219 15:31:02.914367 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Feb 19 15:31:03 crc kubenswrapper[4810]: I0219 15:31:03.438375 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:31:03 crc kubenswrapper[4810]: I0219 15:31:03.451205 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38" path="/var/lib/kubelet/pods/bbe979b4-aa62-4c1f-8329-6eb2ae2cfa38/volumes" Feb 19 15:31:03 crc kubenswrapper[4810]: I0219 15:31:03.874821 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b358095c-30ba-4f90-b627-63650857fc49","Type":"ContainerStarted","Data":"c9c0491a003bacacbdcb1eaf1f7857933abd385ecfb0908efea3150739289951"} Feb 19 15:31:03 crc kubenswrapper[4810]: I0219 15:31:03.875278 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b358095c-30ba-4f90-b627-63650857fc49","Type":"ContainerStarted","Data":"c213283e307bb78ffccee82c18c4205b43e9c53c95b737f8fce428b65bf54094"} Feb 19 15:31:04 crc kubenswrapper[4810]: I0219 15:31:04.901300 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b358095c-30ba-4f90-b627-63650857fc49","Type":"ContainerStarted","Data":"961cd98ef99173e92df54e7b424605d95b61a834adb6a8bf1e7c9b237c0afdc7"} Feb 19 15:31:06 crc kubenswrapper[4810]: I0219 15:31:06.208381 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 19 15:31:06 crc kubenswrapper[4810]: I0219 15:31:06.208716 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 19 15:31:06 crc kubenswrapper[4810]: I0219 15:31:06.243120 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 15:31:06 crc kubenswrapper[4810]: I0219 15:31:06.248735 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 15:31:06 crc kubenswrapper[4810]: I0219 15:31:06.919241 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b358095c-30ba-4f90-b627-63650857fc49","Type":"ContainerStarted","Data":"468167a7d908f6d23e0b2d81fbf04b7c71508eb08eaac8f05a5952b968bfd835"} Feb 19 15:31:06 crc kubenswrapper[4810]: I0219 15:31:06.919567 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 15:31:06 crc 
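
The probe transitions here (startup flapping "unhealthy" before "started", readiness moving from empty to "ready") are what a startup probe produces while an API service warms up: readiness and liveness checks are held back until the startup probe succeeds, and the kubelet logs each observed status change. A probe of that general shape built with the client-go types; the endpoint, port, and thresholds below are hypothetical, not read from the log:

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

func main() {
	startup := &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{
				Path:   "/healthcheck",       // hypothetical endpoint
				Port:   intstr.FromInt(9292), // hypothetical API port
				Scheme: corev1.URISchemeHTTPS,
			},
		},
		PeriodSeconds:    3,  // poll quickly while the service starts
		FailureThreshold: 10, // tolerate ~30s of "unhealthy" before restart
	}
	fmt.Printf("%+v\n", startup)
}
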
kubenswrapper[4810]: I0219 15:31:06.919581 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 15:31:06 crc kubenswrapper[4810]: I0219 15:31:06.919727 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b358095c-30ba-4f90-b627-63650857fc49" containerName="ceilometer-central-agent" containerID="cri-o://c213283e307bb78ffccee82c18c4205b43e9c53c95b737f8fce428b65bf54094" gracePeriod=30 Feb 19 15:31:06 crc kubenswrapper[4810]: I0219 15:31:06.919803 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b358095c-30ba-4f90-b627-63650857fc49" containerName="proxy-httpd" containerID="cri-o://468167a7d908f6d23e0b2d81fbf04b7c71508eb08eaac8f05a5952b968bfd835" gracePeriod=30 Feb 19 15:31:06 crc kubenswrapper[4810]: I0219 15:31:06.919839 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b358095c-30ba-4f90-b627-63650857fc49" containerName="sg-core" containerID="cri-o://961cd98ef99173e92df54e7b424605d95b61a834adb6a8bf1e7c9b237c0afdc7" gracePeriod=30 Feb 19 15:31:06 crc kubenswrapper[4810]: I0219 15:31:06.919868 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b358095c-30ba-4f90-b627-63650857fc49" containerName="ceilometer-notification-agent" containerID="cri-o://c9c0491a003bacacbdcb1eaf1f7857933abd385ecfb0908efea3150739289951" gracePeriod=30 Feb 19 15:31:06 crc kubenswrapper[4810]: I0219 15:31:06.952283 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.983119187 podStartE2EDuration="5.952261726s" podCreationTimestamp="2026-02-19 15:31:01 +0000 UTC" firstStartedPulling="2026-02-19 15:31:02.741306692 +0000 UTC m=+1292.223336826" lastFinishedPulling="2026-02-19 15:31:05.710449241 +0000 UTC m=+1295.192479365" observedRunningTime="2026-02-19 15:31:06.945245654 +0000 UTC m=+1296.427275778" watchObservedRunningTime="2026-02-19 15:31:06.952261726 +0000 UTC m=+1296.434291860" Feb 19 15:31:07 crc kubenswrapper[4810]: I0219 15:31:07.933058 4810 generic.go:334] "Generic (PLEG): container finished" podID="b358095c-30ba-4f90-b627-63650857fc49" containerID="468167a7d908f6d23e0b2d81fbf04b7c71508eb08eaac8f05a5952b968bfd835" exitCode=0 Feb 19 15:31:07 crc kubenswrapper[4810]: I0219 15:31:07.933346 4810 generic.go:334] "Generic (PLEG): container finished" podID="b358095c-30ba-4f90-b627-63650857fc49" containerID="961cd98ef99173e92df54e7b424605d95b61a834adb6a8bf1e7c9b237c0afdc7" exitCode=2 Feb 19 15:31:07 crc kubenswrapper[4810]: I0219 15:31:07.933362 4810 generic.go:334] "Generic (PLEG): container finished" podID="b358095c-30ba-4f90-b627-63650857fc49" containerID="c9c0491a003bacacbdcb1eaf1f7857933abd385ecfb0908efea3150739289951" exitCode=0 Feb 19 15:31:07 crc kubenswrapper[4810]: I0219 15:31:07.933420 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b358095c-30ba-4f90-b627-63650857fc49","Type":"ContainerDied","Data":"468167a7d908f6d23e0b2d81fbf04b7c71508eb08eaac8f05a5952b968bfd835"} Feb 19 15:31:07 crc kubenswrapper[4810]: I0219 15:31:07.933445 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b358095c-30ba-4f90-b627-63650857fc49","Type":"ContainerDied","Data":"961cd98ef99173e92df54e7b424605d95b61a834adb6a8bf1e7c9b237c0afdc7"} Feb 19 15:31:07 crc 
kubenswrapper[4810]: I0219 15:31:07.933454 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b358095c-30ba-4f90-b627-63650857fc49","Type":"ContainerDied","Data":"c9c0491a003bacacbdcb1eaf1f7857933abd385ecfb0908efea3150739289951"} Feb 19 15:31:07 crc kubenswrapper[4810]: I0219 15:31:07.936317 4810 generic.go:334] "Generic (PLEG): container finished" podID="972d6f5e-3edf-4b6e-bdde-39c580caea31" containerID="19b609d4be47506e6c511dded32be9ffbc5fec785d73d8309ff072ff0f1cf61d" exitCode=0 Feb 19 15:31:07 crc kubenswrapper[4810]: I0219 15:31:07.936430 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-58xq9" event={"ID":"972d6f5e-3edf-4b6e-bdde-39c580caea31","Type":"ContainerDied","Data":"19b609d4be47506e6c511dded32be9ffbc5fec785d73d8309ff072ff0f1cf61d"} Feb 19 15:31:08 crc kubenswrapper[4810]: I0219 15:31:08.725697 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 15:31:08 crc kubenswrapper[4810]: I0219 15:31:08.746341 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.303562 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-58xq9" Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.398663 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/972d6f5e-3edf-4b6e-bdde-39c580caea31-scripts\") pod \"972d6f5e-3edf-4b6e-bdde-39c580caea31\" (UID: \"972d6f5e-3edf-4b6e-bdde-39c580caea31\") " Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.398953 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75crr\" (UniqueName: \"kubernetes.io/projected/972d6f5e-3edf-4b6e-bdde-39c580caea31-kube-api-access-75crr\") pod \"972d6f5e-3edf-4b6e-bdde-39c580caea31\" (UID: \"972d6f5e-3edf-4b6e-bdde-39c580caea31\") " Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.399019 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972d6f5e-3edf-4b6e-bdde-39c580caea31-combined-ca-bundle\") pod \"972d6f5e-3edf-4b6e-bdde-39c580caea31\" (UID: \"972d6f5e-3edf-4b6e-bdde-39c580caea31\") " Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.399058 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/972d6f5e-3edf-4b6e-bdde-39c580caea31-config-data\") pod \"972d6f5e-3edf-4b6e-bdde-39c580caea31\" (UID: \"972d6f5e-3edf-4b6e-bdde-39c580caea31\") " Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.404996 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/972d6f5e-3edf-4b6e-bdde-39c580caea31-scripts" (OuterVolumeSpecName: "scripts") pod "972d6f5e-3edf-4b6e-bdde-39c580caea31" (UID: "972d6f5e-3edf-4b6e-bdde-39c580caea31"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.406921 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/972d6f5e-3edf-4b6e-bdde-39c580caea31-kube-api-access-75crr" (OuterVolumeSpecName: "kube-api-access-75crr") pod "972d6f5e-3edf-4b6e-bdde-39c580caea31" (UID: "972d6f5e-3edf-4b6e-bdde-39c580caea31"). InnerVolumeSpecName "kube-api-access-75crr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.436649 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/972d6f5e-3edf-4b6e-bdde-39c580caea31-config-data" (OuterVolumeSpecName: "config-data") pod "972d6f5e-3edf-4b6e-bdde-39c580caea31" (UID: "972d6f5e-3edf-4b6e-bdde-39c580caea31"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.437119 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/972d6f5e-3edf-4b6e-bdde-39c580caea31-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "972d6f5e-3edf-4b6e-bdde-39c580caea31" (UID: "972d6f5e-3edf-4b6e-bdde-39c580caea31"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.490757 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.490804 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.502487 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972d6f5e-3edf-4b6e-bdde-39c580caea31-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.502748 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/972d6f5e-3edf-4b6e-bdde-39c580caea31-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.502758 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/972d6f5e-3edf-4b6e-bdde-39c580caea31-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.502767 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75crr\" (UniqueName: \"kubernetes.io/projected/972d6f5e-3edf-4b6e-bdde-39c580caea31-kube-api-access-75crr\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.531395 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.532101 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.915456 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.961372 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-58xq9" event={"ID":"972d6f5e-3edf-4b6e-bdde-39c580caea31","Type":"ContainerDied","Data":"0c01e86e8bf881c49ad9ca06e89f46142a0a38c81d0b0f4cbfef925f99badde4"} Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.961415 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c01e86e8bf881c49ad9ca06e89f46142a0a38c81d0b0f4cbfef925f99badde4" Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.961752 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-58xq9" Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.964626 4810 generic.go:334] "Generic (PLEG): container finished" podID="b358095c-30ba-4f90-b627-63650857fc49" containerID="c213283e307bb78ffccee82c18c4205b43e9c53c95b737f8fce428b65bf54094" exitCode=0 Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.965737 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.966245 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b358095c-30ba-4f90-b627-63650857fc49","Type":"ContainerDied","Data":"c213283e307bb78ffccee82c18c4205b43e9c53c95b737f8fce428b65bf54094"} Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.966278 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b358095c-30ba-4f90-b627-63650857fc49","Type":"ContainerDied","Data":"6c4b9f3d928f0cccc0fb2b10a58ce849fff32f26291f366f0504f6cc2013df4e"} Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.966299 4810 scope.go:117] "RemoveContainer" containerID="468167a7d908f6d23e0b2d81fbf04b7c71508eb08eaac8f05a5952b968bfd835" Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.967498 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 15:31:09 crc kubenswrapper[4810]: I0219 15:31:09.967520 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.010465 4810 scope.go:117] "RemoveContainer" containerID="961cd98ef99173e92df54e7b424605d95b61a834adb6a8bf1e7c9b237c0afdc7" Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.011184 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b358095c-30ba-4f90-b627-63650857fc49-run-httpd\") pod \"b358095c-30ba-4f90-b627-63650857fc49\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.011254 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b358095c-30ba-4f90-b627-63650857fc49-scripts\") pod \"b358095c-30ba-4f90-b627-63650857fc49\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.011313 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b358095c-30ba-4f90-b627-63650857fc49-config-data\") pod \"b358095c-30ba-4f90-b627-63650857fc49\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " Feb 19 
15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.011401 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b358095c-30ba-4f90-b627-63650857fc49-sg-core-conf-yaml\") pod \"b358095c-30ba-4f90-b627-63650857fc49\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.011496 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75hx8\" (UniqueName: \"kubernetes.io/projected/b358095c-30ba-4f90-b627-63650857fc49-kube-api-access-75hx8\") pod \"b358095c-30ba-4f90-b627-63650857fc49\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.011533 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b358095c-30ba-4f90-b627-63650857fc49-combined-ca-bundle\") pod \"b358095c-30ba-4f90-b627-63650857fc49\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.011596 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b358095c-30ba-4f90-b627-63650857fc49-log-httpd\") pod \"b358095c-30ba-4f90-b627-63650857fc49\" (UID: \"b358095c-30ba-4f90-b627-63650857fc49\") " Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.011752 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b358095c-30ba-4f90-b627-63650857fc49-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b358095c-30ba-4f90-b627-63650857fc49" (UID: "b358095c-30ba-4f90-b627-63650857fc49"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.012205 4810 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b358095c-30ba-4f90-b627-63650857fc49-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.013491 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b358095c-30ba-4f90-b627-63650857fc49-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b358095c-30ba-4f90-b627-63650857fc49" (UID: "b358095c-30ba-4f90-b627-63650857fc49"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.016565 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b358095c-30ba-4f90-b627-63650857fc49-scripts" (OuterVolumeSpecName: "scripts") pod "b358095c-30ba-4f90-b627-63650857fc49" (UID: "b358095c-30ba-4f90-b627-63650857fc49"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.022799 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b358095c-30ba-4f90-b627-63650857fc49-kube-api-access-75hx8" (OuterVolumeSpecName: "kube-api-access-75hx8") pod "b358095c-30ba-4f90-b627-63650857fc49" (UID: "b358095c-30ba-4f90-b627-63650857fc49"). InnerVolumeSpecName "kube-api-access-75hx8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.042234 4810 scope.go:117] "RemoveContainer" containerID="c9c0491a003bacacbdcb1eaf1f7857933abd385ecfb0908efea3150739289951" Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.067840 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 15:31:10 crc kubenswrapper[4810]: E0219 15:31:10.068498 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b358095c-30ba-4f90-b627-63650857fc49" containerName="proxy-httpd" Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.068525 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b358095c-30ba-4f90-b627-63650857fc49" containerName="proxy-httpd" Feb 19 15:31:10 crc kubenswrapper[4810]: E0219 15:31:10.068554 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b358095c-30ba-4f90-b627-63650857fc49" containerName="sg-core" Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.068563 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b358095c-30ba-4f90-b627-63650857fc49" containerName="sg-core" Feb 19 15:31:10 crc kubenswrapper[4810]: E0219 15:31:10.068580 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b358095c-30ba-4f90-b627-63650857fc49" containerName="ceilometer-central-agent" Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.068588 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b358095c-30ba-4f90-b627-63650857fc49" containerName="ceilometer-central-agent" Feb 19 15:31:10 crc kubenswrapper[4810]: E0219 15:31:10.068613 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="972d6f5e-3edf-4b6e-bdde-39c580caea31" containerName="nova-cell0-conductor-db-sync" Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.068622 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="972d6f5e-3edf-4b6e-bdde-39c580caea31" containerName="nova-cell0-conductor-db-sync" Feb 19 15:31:10 crc kubenswrapper[4810]: E0219 15:31:10.068639 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b358095c-30ba-4f90-b627-63650857fc49" containerName="ceilometer-notification-agent" Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.068650 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b358095c-30ba-4f90-b627-63650857fc49" containerName="ceilometer-notification-agent" Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.068946 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b358095c-30ba-4f90-b627-63650857fc49" containerName="ceilometer-notification-agent" Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.068986 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b358095c-30ba-4f90-b627-63650857fc49" containerName="proxy-httpd" Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.069004 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b358095c-30ba-4f90-b627-63650857fc49" containerName="ceilometer-central-agent" Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.069023 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="972d6f5e-3edf-4b6e-bdde-39c580caea31" containerName="nova-cell0-conductor-db-sync" Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.069042 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b358095c-30ba-4f90-b627-63650857fc49" containerName="sg-core" Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.071921 4810 util.go:30] "No sandbox 
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.075603 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b358095c-30ba-4f90-b627-63650857fc49-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b358095c-30ba-4f90-b627-63650857fc49" (UID: "b358095c-30ba-4f90-b627-63650857fc49"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.076904 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-nqhd6"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.077242 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.077693 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.113890 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ppbf\" (UniqueName: \"kubernetes.io/projected/65e6588c-3b7f-4719-beb6-90229629820f-kube-api-access-8ppbf\") pod \"nova-cell0-conductor-0\" (UID: \"65e6588c-3b7f-4719-beb6-90229629820f\") " pod="openstack/nova-cell0-conductor-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.113969 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65e6588c-3b7f-4719-beb6-90229629820f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"65e6588c-3b7f-4719-beb6-90229629820f\") " pod="openstack/nova-cell0-conductor-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.114063 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65e6588c-3b7f-4719-beb6-90229629820f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"65e6588c-3b7f-4719-beb6-90229629820f\") " pod="openstack/nova-cell0-conductor-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.114150 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b358095c-30ba-4f90-b627-63650857fc49-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.114165 4810 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b358095c-30ba-4f90-b627-63650857fc49-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.114175 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75hx8\" (UniqueName: \"kubernetes.io/projected/b358095c-30ba-4f90-b627-63650857fc49-kube-api-access-75hx8\") on node \"crc\" DevicePath \"\""
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.114186 4810 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b358095c-30ba-4f90-b627-63650857fc49-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.116054 4810 scope.go:117] "RemoveContainer" containerID="c213283e307bb78ffccee82c18c4205b43e9c53c95b737f8fce428b65bf54094"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.141730 4810 scope.go:117] "RemoveContainer" containerID="468167a7d908f6d23e0b2d81fbf04b7c71508eb08eaac8f05a5952b968bfd835"
Feb 19 15:31:10 crc kubenswrapper[4810]: E0219 15:31:10.142280 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"468167a7d908f6d23e0b2d81fbf04b7c71508eb08eaac8f05a5952b968bfd835\": container with ID starting with 468167a7d908f6d23e0b2d81fbf04b7c71508eb08eaac8f05a5952b968bfd835 not found: ID does not exist" containerID="468167a7d908f6d23e0b2d81fbf04b7c71508eb08eaac8f05a5952b968bfd835"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.142339 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"468167a7d908f6d23e0b2d81fbf04b7c71508eb08eaac8f05a5952b968bfd835"} err="failed to get container status \"468167a7d908f6d23e0b2d81fbf04b7c71508eb08eaac8f05a5952b968bfd835\": rpc error: code = NotFound desc = could not find container \"468167a7d908f6d23e0b2d81fbf04b7c71508eb08eaac8f05a5952b968bfd835\": container with ID starting with 468167a7d908f6d23e0b2d81fbf04b7c71508eb08eaac8f05a5952b968bfd835 not found: ID does not exist"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.142366 4810 scope.go:117] "RemoveContainer" containerID="961cd98ef99173e92df54e7b424605d95b61a834adb6a8bf1e7c9b237c0afdc7"
Feb 19 15:31:10 crc kubenswrapper[4810]: E0219 15:31:10.143672 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"961cd98ef99173e92df54e7b424605d95b61a834adb6a8bf1e7c9b237c0afdc7\": container with ID starting with 961cd98ef99173e92df54e7b424605d95b61a834adb6a8bf1e7c9b237c0afdc7 not found: ID does not exist" containerID="961cd98ef99173e92df54e7b424605d95b61a834adb6a8bf1e7c9b237c0afdc7"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.143710 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"961cd98ef99173e92df54e7b424605d95b61a834adb6a8bf1e7c9b237c0afdc7"} err="failed to get container status \"961cd98ef99173e92df54e7b424605d95b61a834adb6a8bf1e7c9b237c0afdc7\": rpc error: code = NotFound desc = could not find container \"961cd98ef99173e92df54e7b424605d95b61a834adb6a8bf1e7c9b237c0afdc7\": container with ID starting with 961cd98ef99173e92df54e7b424605d95b61a834adb6a8bf1e7c9b237c0afdc7 not found: ID does not exist"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.143730 4810 scope.go:117] "RemoveContainer" containerID="c9c0491a003bacacbdcb1eaf1f7857933abd385ecfb0908efea3150739289951"
Feb 19 15:31:10 crc kubenswrapper[4810]: E0219 15:31:10.143997 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9c0491a003bacacbdcb1eaf1f7857933abd385ecfb0908efea3150739289951\": container with ID starting with c9c0491a003bacacbdcb1eaf1f7857933abd385ecfb0908efea3150739289951 not found: ID does not exist" containerID="c9c0491a003bacacbdcb1eaf1f7857933abd385ecfb0908efea3150739289951"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.144016 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9c0491a003bacacbdcb1eaf1f7857933abd385ecfb0908efea3150739289951"} err="failed to get container status \"c9c0491a003bacacbdcb1eaf1f7857933abd385ecfb0908efea3150739289951\": rpc error: code = NotFound desc = could not find container \"c9c0491a003bacacbdcb1eaf1f7857933abd385ecfb0908efea3150739289951\": container with ID starting with c9c0491a003bacacbdcb1eaf1f7857933abd385ecfb0908efea3150739289951 not found: ID does not exist"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.144029 4810 scope.go:117] "RemoveContainer" containerID="c213283e307bb78ffccee82c18c4205b43e9c53c95b737f8fce428b65bf54094"
Feb 19 15:31:10 crc kubenswrapper[4810]: E0219 15:31:10.144230 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c213283e307bb78ffccee82c18c4205b43e9c53c95b737f8fce428b65bf54094\": container with ID starting with c213283e307bb78ffccee82c18c4205b43e9c53c95b737f8fce428b65bf54094 not found: ID does not exist" containerID="c213283e307bb78ffccee82c18c4205b43e9c53c95b737f8fce428b65bf54094"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.144251 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c213283e307bb78ffccee82c18c4205b43e9c53c95b737f8fce428b65bf54094"} err="failed to get container status \"c213283e307bb78ffccee82c18c4205b43e9c53c95b737f8fce428b65bf54094\": rpc error: code = NotFound desc = could not find container \"c213283e307bb78ffccee82c18c4205b43e9c53c95b737f8fce428b65bf54094\": container with ID starting with c213283e307bb78ffccee82c18c4205b43e9c53c95b737f8fce428b65bf54094 not found: ID does not exist"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.162390 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b358095c-30ba-4f90-b627-63650857fc49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b358095c-30ba-4f90-b627-63650857fc49" (UID: "b358095c-30ba-4f90-b627-63650857fc49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.166061 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b358095c-30ba-4f90-b627-63650857fc49-config-data" (OuterVolumeSpecName: "config-data") pod "b358095c-30ba-4f90-b627-63650857fc49" (UID: "b358095c-30ba-4f90-b627-63650857fc49"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.215724 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65e6588c-3b7f-4719-beb6-90229629820f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"65e6588c-3b7f-4719-beb6-90229629820f\") " pod="openstack/nova-cell0-conductor-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.216097 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ppbf\" (UniqueName: \"kubernetes.io/projected/65e6588c-3b7f-4719-beb6-90229629820f-kube-api-access-8ppbf\") pod \"nova-cell0-conductor-0\" (UID: \"65e6588c-3b7f-4719-beb6-90229629820f\") " pod="openstack/nova-cell0-conductor-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.216334 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65e6588c-3b7f-4719-beb6-90229629820f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"65e6588c-3b7f-4719-beb6-90229629820f\") " pod="openstack/nova-cell0-conductor-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.216519 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b358095c-30ba-4f90-b627-63650857fc49-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.216601 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b358095c-30ba-4f90-b627-63650857fc49-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.220578 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65e6588c-3b7f-4719-beb6-90229629820f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"65e6588c-3b7f-4719-beb6-90229629820f\") " pod="openstack/nova-cell0-conductor-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.220702 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65e6588c-3b7f-4719-beb6-90229629820f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"65e6588c-3b7f-4719-beb6-90229629820f\") " pod="openstack/nova-cell0-conductor-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.230828 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ppbf\" (UniqueName: \"kubernetes.io/projected/65e6588c-3b7f-4719-beb6-90229629820f-kube-api-access-8ppbf\") pod \"nova-cell0-conductor-0\" (UID: \"65e6588c-3b7f-4719-beb6-90229629820f\") " pod="openstack/nova-cell0-conductor-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.351223 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.370828 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.379343 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.381705 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.384982 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.387789 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.392547 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.406805 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.419694 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af3ec395-6313-4094-9597-b52da27a0d7e-config-data\") pod \"ceilometer-0\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " pod="openstack/ceilometer-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.419736 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af3ec395-6313-4094-9597-b52da27a0d7e-run-httpd\") pod \"ceilometer-0\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " pod="openstack/ceilometer-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.419777 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af3ec395-6313-4094-9597-b52da27a0d7e-log-httpd\") pod \"ceilometer-0\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " pod="openstack/ceilometer-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.419886 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npk42\" (UniqueName: \"kubernetes.io/projected/af3ec395-6313-4094-9597-b52da27a0d7e-kube-api-access-npk42\") pod \"ceilometer-0\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " pod="openstack/ceilometer-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.419932 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af3ec395-6313-4094-9597-b52da27a0d7e-scripts\") pod \"ceilometer-0\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " pod="openstack/ceilometer-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.419956 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af3ec395-6313-4094-9597-b52da27a0d7e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " pod="openstack/ceilometer-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.420017 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af3ec395-6313-4094-9597-b52da27a0d7e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " pod="openstack/ceilometer-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.522416 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af3ec395-6313-4094-9597-b52da27a0d7e-config-data\") pod \"ceilometer-0\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " pod="openstack/ceilometer-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.522754 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af3ec395-6313-4094-9597-b52da27a0d7e-run-httpd\") pod \"ceilometer-0\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " pod="openstack/ceilometer-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.522801 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af3ec395-6313-4094-9597-b52da27a0d7e-log-httpd\") pod \"ceilometer-0\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " pod="openstack/ceilometer-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.522857 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npk42\" (UniqueName: \"kubernetes.io/projected/af3ec395-6313-4094-9597-b52da27a0d7e-kube-api-access-npk42\") pod \"ceilometer-0\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " pod="openstack/ceilometer-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.522894 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af3ec395-6313-4094-9597-b52da27a0d7e-scripts\") pod \"ceilometer-0\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " pod="openstack/ceilometer-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.522917 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af3ec395-6313-4094-9597-b52da27a0d7e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " pod="openstack/ceilometer-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.522971 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af3ec395-6313-4094-9597-b52da27a0d7e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " pod="openstack/ceilometer-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.523089 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af3ec395-6313-4094-9597-b52da27a0d7e-run-httpd\") pod \"ceilometer-0\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " pod="openstack/ceilometer-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.523312 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af3ec395-6313-4094-9597-b52da27a0d7e-log-httpd\") pod \"ceilometer-0\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " pod="openstack/ceilometer-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.528544 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af3ec395-6313-4094-9597-b52da27a0d7e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " pod="openstack/ceilometer-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.532995 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af3ec395-6313-4094-9597-b52da27a0d7e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " pod="openstack/ceilometer-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.534270 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af3ec395-6313-4094-9597-b52da27a0d7e-config-data\") pod \"ceilometer-0\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " pod="openstack/ceilometer-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.538124 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af3ec395-6313-4094-9597-b52da27a0d7e-scripts\") pod \"ceilometer-0\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " pod="openstack/ceilometer-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.545169 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npk42\" (UniqueName: \"kubernetes.io/projected/af3ec395-6313-4094-9597-b52da27a0d7e-kube-api-access-npk42\") pod \"ceilometer-0\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " pod="openstack/ceilometer-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.699716 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.870683 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 19 15:31:10 crc kubenswrapper[4810]: I0219 15:31:10.980056 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"65e6588c-3b7f-4719-beb6-90229629820f","Type":"ContainerStarted","Data":"c853cc96422c5081191bd4ef30ca5ee6475e60b6ea25cc44ff8b6bb0a8f8953a"}
Feb 19 15:31:11 crc kubenswrapper[4810]: I0219 15:31:11.184257 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 15:31:11 crc kubenswrapper[4810]: I0219 15:31:11.452982 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b358095c-30ba-4f90-b627-63650857fc49" path="/var/lib/kubelet/pods/b358095c-30ba-4f90-b627-63650857fc49/volumes"
Feb 19 15:31:11 crc kubenswrapper[4810]: I0219 15:31:11.832344 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 19 15:31:11 crc kubenswrapper[4810]: I0219 15:31:11.867937 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 19 15:31:11 crc kubenswrapper[4810]: I0219 15:31:11.994018 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"65e6588c-3b7f-4719-beb6-90229629820f","Type":"ContainerStarted","Data":"7c8b78bbba0adeaa15117148c7900fcb09121122590af963dd88c701dfc73adf"}
Feb 19 15:31:11 crc kubenswrapper[4810]: I0219 15:31:11.994821 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Feb 19 15:31:11 crc kubenswrapper[4810]: I0219 15:31:11.997061 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af3ec395-6313-4094-9597-b52da27a0d7e","Type":"ContainerStarted","Data":"308ad8f8b8e7c54aece6c4e1cc25e71fd1252432f00bfbdb0df4940372379f98"}
Feb 19 15:31:11 crc kubenswrapper[4810]: I0219 15:31:11.997155 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af3ec395-6313-4094-9597-b52da27a0d7e","Type":"ContainerStarted","Data":"6cec0822ad581a03d19d641977e9a7d547ef54d70d04ff9e2ccd6f292e9245c7"}
event={"ID":"af3ec395-6313-4094-9597-b52da27a0d7e","Type":"ContainerStarted","Data":"6cec0822ad581a03d19d641977e9a7d547ef54d70d04ff9e2ccd6f292e9245c7"} Feb 19 15:31:11 crc kubenswrapper[4810]: I0219 15:31:11.997202 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af3ec395-6313-4094-9597-b52da27a0d7e","Type":"ContainerStarted","Data":"62ba2b7fb62123b253e68adfb6db44d11bb6e2d13e45773cd9a27fa6ec28a020"} Feb 19 15:31:12 crc kubenswrapper[4810]: I0219 15:31:12.014506 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.014485114 podStartE2EDuration="2.014485114s" podCreationTimestamp="2026-02-19 15:31:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:31:12.014375461 +0000 UTC m=+1301.496405615" watchObservedRunningTime="2026-02-19 15:31:12.014485114 +0000 UTC m=+1301.496515248" Feb 19 15:31:13 crc kubenswrapper[4810]: I0219 15:31:13.007213 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af3ec395-6313-4094-9597-b52da27a0d7e","Type":"ContainerStarted","Data":"4925cc21f9c39a46d677b62b982c7df70c5e45eb8c42ab96787c52123262f91d"} Feb 19 15:31:15 crc kubenswrapper[4810]: I0219 15:31:15.036774 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af3ec395-6313-4094-9597-b52da27a0d7e","Type":"ContainerStarted","Data":"2adc7f5bee46c61926054ab4e65ab8e1825b109d167cc80842a20a1750f29ed6"} Feb 19 15:31:15 crc kubenswrapper[4810]: I0219 15:31:15.037622 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 15:31:15 crc kubenswrapper[4810]: I0219 15:31:15.081574 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.359282376 podStartE2EDuration="5.081552495s" podCreationTimestamp="2026-02-19 15:31:10 +0000 UTC" firstStartedPulling="2026-02-19 15:31:11.185309705 +0000 UTC m=+1300.667339829" lastFinishedPulling="2026-02-19 15:31:13.907579824 +0000 UTC m=+1303.389609948" observedRunningTime="2026-02-19 15:31:15.065367813 +0000 UTC m=+1304.547397947" watchObservedRunningTime="2026-02-19 15:31:15.081552495 +0000 UTC m=+1304.563582629" Feb 19 15:31:20 crc kubenswrapper[4810]: I0219 15:31:20.438600 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 19 15:31:20 crc kubenswrapper[4810]: I0219 15:31:20.925612 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-k7bkw"] Feb 19 15:31:20 crc kubenswrapper[4810]: I0219 15:31:20.927051 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-k7bkw" Feb 19 15:31:20 crc kubenswrapper[4810]: I0219 15:31:20.934714 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 19 15:31:20 crc kubenswrapper[4810]: I0219 15:31:20.935149 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 19 15:31:20 crc kubenswrapper[4810]: I0219 15:31:20.935769 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/299a53ac-e7e5-47a3-bf65-df5624b77717-scripts\") pod \"nova-cell0-cell-mapping-k7bkw\" (UID: \"299a53ac-e7e5-47a3-bf65-df5624b77717\") " pod="openstack/nova-cell0-cell-mapping-k7bkw" Feb 19 15:31:20 crc kubenswrapper[4810]: I0219 15:31:20.936011 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/299a53ac-e7e5-47a3-bf65-df5624b77717-config-data\") pod \"nova-cell0-cell-mapping-k7bkw\" (UID: \"299a53ac-e7e5-47a3-bf65-df5624b77717\") " pod="openstack/nova-cell0-cell-mapping-k7bkw" Feb 19 15:31:20 crc kubenswrapper[4810]: I0219 15:31:20.936133 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhsjr\" (UniqueName: \"kubernetes.io/projected/299a53ac-e7e5-47a3-bf65-df5624b77717-kube-api-access-zhsjr\") pod \"nova-cell0-cell-mapping-k7bkw\" (UID: \"299a53ac-e7e5-47a3-bf65-df5624b77717\") " pod="openstack/nova-cell0-cell-mapping-k7bkw" Feb 19 15:31:20 crc kubenswrapper[4810]: I0219 15:31:20.936252 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/299a53ac-e7e5-47a3-bf65-df5624b77717-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-k7bkw\" (UID: \"299a53ac-e7e5-47a3-bf65-df5624b77717\") " pod="openstack/nova-cell0-cell-mapping-k7bkw" Feb 19 15:31:20 crc kubenswrapper[4810]: I0219 15:31:20.943609 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-k7bkw"] Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.038179 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/299a53ac-e7e5-47a3-bf65-df5624b77717-config-data\") pod \"nova-cell0-cell-mapping-k7bkw\" (UID: \"299a53ac-e7e5-47a3-bf65-df5624b77717\") " pod="openstack/nova-cell0-cell-mapping-k7bkw" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.038272 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhsjr\" (UniqueName: \"kubernetes.io/projected/299a53ac-e7e5-47a3-bf65-df5624b77717-kube-api-access-zhsjr\") pod \"nova-cell0-cell-mapping-k7bkw\" (UID: \"299a53ac-e7e5-47a3-bf65-df5624b77717\") " pod="openstack/nova-cell0-cell-mapping-k7bkw" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.038357 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/299a53ac-e7e5-47a3-bf65-df5624b77717-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-k7bkw\" (UID: \"299a53ac-e7e5-47a3-bf65-df5624b77717\") " pod="openstack/nova-cell0-cell-mapping-k7bkw" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.038397 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/299a53ac-e7e5-47a3-bf65-df5624b77717-scripts\") pod \"nova-cell0-cell-mapping-k7bkw\" (UID: \"299a53ac-e7e5-47a3-bf65-df5624b77717\") " pod="openstack/nova-cell0-cell-mapping-k7bkw" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.046958 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/299a53ac-e7e5-47a3-bf65-df5624b77717-scripts\") pod \"nova-cell0-cell-mapping-k7bkw\" (UID: \"299a53ac-e7e5-47a3-bf65-df5624b77717\") " pod="openstack/nova-cell0-cell-mapping-k7bkw" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.047918 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/299a53ac-e7e5-47a3-bf65-df5624b77717-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-k7bkw\" (UID: \"299a53ac-e7e5-47a3-bf65-df5624b77717\") " pod="openstack/nova-cell0-cell-mapping-k7bkw" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.048448 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/299a53ac-e7e5-47a3-bf65-df5624b77717-config-data\") pod \"nova-cell0-cell-mapping-k7bkw\" (UID: \"299a53ac-e7e5-47a3-bf65-df5624b77717\") " pod="openstack/nova-cell0-cell-mapping-k7bkw" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.068825 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhsjr\" (UniqueName: \"kubernetes.io/projected/299a53ac-e7e5-47a3-bf65-df5624b77717-kube-api-access-zhsjr\") pod \"nova-cell0-cell-mapping-k7bkw\" (UID: \"299a53ac-e7e5-47a3-bf65-df5624b77717\") " pod="openstack/nova-cell0-cell-mapping-k7bkw" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.138737 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.147510 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.174676 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.214714 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.226560 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.228135 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.237018 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.257613 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.258937 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4c8d587-b429-415f-96f6-628924fed084-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4c8d587-b429-415f-96f6-628924fed084\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.259040 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g5cs\" (UniqueName: \"kubernetes.io/projected/d9e6d16b-a7c2-4a73-866e-6e068e910d82-kube-api-access-7g5cs\") pod \"nova-scheduler-0\" (UID: \"d9e6d16b-a7c2-4a73-866e-6e068e910d82\") " pod="openstack/nova-scheduler-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.259067 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkj9r\" (UniqueName: \"kubernetes.io/projected/b4c8d587-b429-415f-96f6-628924fed084-kube-api-access-qkj9r\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4c8d587-b429-415f-96f6-628924fed084\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.259086 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4c8d587-b429-415f-96f6-628924fed084-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4c8d587-b429-415f-96f6-628924fed084\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.259137 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e6d16b-a7c2-4a73-866e-6e068e910d82-config-data\") pod \"nova-scheduler-0\" (UID: \"d9e6d16b-a7c2-4a73-866e-6e068e910d82\") " pod="openstack/nova-scheduler-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.259174 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9e6d16b-a7c2-4a73-866e-6e068e910d82-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d9e6d16b-a7c2-4a73-866e-6e068e910d82\") " pod="openstack/nova-scheduler-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.265549 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-k7bkw" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.300506 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.302590 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.313555 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.316507 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.348495 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.350632 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.357651 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.361180 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e6d16b-a7c2-4a73-866e-6e068e910d82-config-data\") pod \"nova-scheduler-0\" (UID: \"d9e6d16b-a7c2-4a73-866e-6e068e910d82\") " pod="openstack/nova-scheduler-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.364018 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9e6d16b-a7c2-4a73-866e-6e068e910d82-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d9e6d16b-a7c2-4a73-866e-6e068e910d82\") " pod="openstack/nova-scheduler-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.364413 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.364438 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4c8d587-b429-415f-96f6-628924fed084-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4c8d587-b429-415f-96f6-628924fed084\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.364877 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g5cs\" (UniqueName: \"kubernetes.io/projected/d9e6d16b-a7c2-4a73-866e-6e068e910d82-kube-api-access-7g5cs\") pod \"nova-scheduler-0\" (UID: \"d9e6d16b-a7c2-4a73-866e-6e068e910d82\") " pod="openstack/nova-scheduler-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.364932 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkj9r\" (UniqueName: \"kubernetes.io/projected/b4c8d587-b429-415f-96f6-628924fed084-kube-api-access-qkj9r\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4c8d587-b429-415f-96f6-628924fed084\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.364976 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4c8d587-b429-415f-96f6-628924fed084-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4c8d587-b429-415f-96f6-628924fed084\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.368090 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e6d16b-a7c2-4a73-866e-6e068e910d82-config-data\") pod \"nova-scheduler-0\" (UID: 
\"d9e6d16b-a7c2-4a73-866e-6e068e910d82\") " pod="openstack/nova-scheduler-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.371671 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4c8d587-b429-415f-96f6-628924fed084-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4c8d587-b429-415f-96f6-628924fed084\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.375950 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9e6d16b-a7c2-4a73-866e-6e068e910d82-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d9e6d16b-a7c2-4a73-866e-6e068e910d82\") " pod="openstack/nova-scheduler-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.376358 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4c8d587-b429-415f-96f6-628924fed084-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4c8d587-b429-415f-96f6-628924fed084\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.391843 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g5cs\" (UniqueName: \"kubernetes.io/projected/d9e6d16b-a7c2-4a73-866e-6e068e910d82-kube-api-access-7g5cs\") pod \"nova-scheduler-0\" (UID: \"d9e6d16b-a7c2-4a73-866e-6e068e910d82\") " pod="openstack/nova-scheduler-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.410727 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkj9r\" (UniqueName: \"kubernetes.io/projected/b4c8d587-b429-415f-96f6-628924fed084-kube-api-access-qkj9r\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4c8d587-b429-415f-96f6-628924fed084\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.466812 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8841a6af-789a-4dd9-81ed-3afc45b255e4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8841a6af-789a-4dd9-81ed-3afc45b255e4\") " pod="openstack/nova-api-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.466883 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e8674be-9f7f-438c-afcb-529178b5fa9a-config-data\") pod \"nova-metadata-0\" (UID: \"1e8674be-9f7f-438c-afcb-529178b5fa9a\") " pod="openstack/nova-metadata-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.466929 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e8674be-9f7f-438c-afcb-529178b5fa9a-logs\") pod \"nova-metadata-0\" (UID: \"1e8674be-9f7f-438c-afcb-529178b5fa9a\") " pod="openstack/nova-metadata-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.466969 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-945nq\" (UniqueName: \"kubernetes.io/projected/8841a6af-789a-4dd9-81ed-3afc45b255e4-kube-api-access-945nq\") pod \"nova-api-0\" (UID: \"8841a6af-789a-4dd9-81ed-3afc45b255e4\") " pod="openstack/nova-api-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.467629 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2szh6\" (UniqueName: \"kubernetes.io/projected/1e8674be-9f7f-438c-afcb-529178b5fa9a-kube-api-access-2szh6\") pod \"nova-metadata-0\" (UID: \"1e8674be-9f7f-438c-afcb-529178b5fa9a\") " pod="openstack/nova-metadata-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.467849 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e8674be-9f7f-438c-afcb-529178b5fa9a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1e8674be-9f7f-438c-afcb-529178b5fa9a\") " pod="openstack/nova-metadata-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.468184 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8841a6af-789a-4dd9-81ed-3afc45b255e4-config-data\") pod \"nova-api-0\" (UID: \"8841a6af-789a-4dd9-81ed-3afc45b255e4\") " pod="openstack/nova-api-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.468209 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8841a6af-789a-4dd9-81ed-3afc45b255e4-logs\") pod \"nova-api-0\" (UID: \"8841a6af-789a-4dd9-81ed-3afc45b255e4\") " pod="openstack/nova-api-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.483654 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cc76f8d79-b9r9k"] Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.485279 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.497159 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cc76f8d79-b9r9k"] Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.497227 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.568910 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-config\") pod \"dnsmasq-dns-7cc76f8d79-b9r9k\" (UID: \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\") " pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.568973 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8841a6af-789a-4dd9-81ed-3afc45b255e4-config-data\") pod \"nova-api-0\" (UID: \"8841a6af-789a-4dd9-81ed-3afc45b255e4\") " pod="openstack/nova-api-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.568994 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8841a6af-789a-4dd9-81ed-3afc45b255e4-logs\") pod \"nova-api-0\" (UID: \"8841a6af-789a-4dd9-81ed-3afc45b255e4\") " pod="openstack/nova-api-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.569013 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-ovsdbserver-sb\") pod \"dnsmasq-dns-7cc76f8d79-b9r9k\" (UID: \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\") " pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.569038 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-dns-svc\") pod \"dnsmasq-dns-7cc76f8d79-b9r9k\" (UID: \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\") " pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.569071 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8841a6af-789a-4dd9-81ed-3afc45b255e4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8841a6af-789a-4dd9-81ed-3afc45b255e4\") " pod="openstack/nova-api-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.569098 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e8674be-9f7f-438c-afcb-529178b5fa9a-config-data\") pod \"nova-metadata-0\" (UID: \"1e8674be-9f7f-438c-afcb-529178b5fa9a\") " pod="openstack/nova-metadata-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.569127 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e8674be-9f7f-438c-afcb-529178b5fa9a-logs\") pod \"nova-metadata-0\" (UID: \"1e8674be-9f7f-438c-afcb-529178b5fa9a\") " pod="openstack/nova-metadata-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.569145 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-945nq\" (UniqueName: \"kubernetes.io/projected/8841a6af-789a-4dd9-81ed-3afc45b255e4-kube-api-access-945nq\") pod \"nova-api-0\" (UID: \"8841a6af-789a-4dd9-81ed-3afc45b255e4\") " pod="openstack/nova-api-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.569169 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2szh6\" (UniqueName: 
\"kubernetes.io/projected/1e8674be-9f7f-438c-afcb-529178b5fa9a-kube-api-access-2szh6\") pod \"nova-metadata-0\" (UID: \"1e8674be-9f7f-438c-afcb-529178b5fa9a\") " pod="openstack/nova-metadata-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.569185 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e8674be-9f7f-438c-afcb-529178b5fa9a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1e8674be-9f7f-438c-afcb-529178b5fa9a\") " pod="openstack/nova-metadata-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.569205 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-dns-swift-storage-0\") pod \"dnsmasq-dns-7cc76f8d79-b9r9k\" (UID: \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\") " pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.569233 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdf9n\" (UniqueName: \"kubernetes.io/projected/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-kube-api-access-fdf9n\") pod \"dnsmasq-dns-7cc76f8d79-b9r9k\" (UID: \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\") " pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.569256 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-ovsdbserver-nb\") pod \"dnsmasq-dns-7cc76f8d79-b9r9k\" (UID: \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\") " pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.570120 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e8674be-9f7f-438c-afcb-529178b5fa9a-logs\") pod \"nova-metadata-0\" (UID: \"1e8674be-9f7f-438c-afcb-529178b5fa9a\") " pod="openstack/nova-metadata-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.570693 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8841a6af-789a-4dd9-81ed-3afc45b255e4-logs\") pod \"nova-api-0\" (UID: \"8841a6af-789a-4dd9-81ed-3afc45b255e4\") " pod="openstack/nova-api-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.574506 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8841a6af-789a-4dd9-81ed-3afc45b255e4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8841a6af-789a-4dd9-81ed-3afc45b255e4\") " pod="openstack/nova-api-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.574592 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e8674be-9f7f-438c-afcb-529178b5fa9a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1e8674be-9f7f-438c-afcb-529178b5fa9a\") " pod="openstack/nova-metadata-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.575115 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8841a6af-789a-4dd9-81ed-3afc45b255e4-config-data\") pod \"nova-api-0\" (UID: \"8841a6af-789a-4dd9-81ed-3afc45b255e4\") " pod="openstack/nova-api-0" Feb 19 15:31:21 crc 
kubenswrapper[4810]: I0219 15:31:21.581163 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e8674be-9f7f-438c-afcb-529178b5fa9a-config-data\") pod \"nova-metadata-0\" (UID: \"1e8674be-9f7f-438c-afcb-529178b5fa9a\") " pod="openstack/nova-metadata-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.585287 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.594588 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-945nq\" (UniqueName: \"kubernetes.io/projected/8841a6af-789a-4dd9-81ed-3afc45b255e4-kube-api-access-945nq\") pod \"nova-api-0\" (UID: \"8841a6af-789a-4dd9-81ed-3afc45b255e4\") " pod="openstack/nova-api-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.595350 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2szh6\" (UniqueName: \"kubernetes.io/projected/1e8674be-9f7f-438c-afcb-529178b5fa9a-kube-api-access-2szh6\") pod \"nova-metadata-0\" (UID: \"1e8674be-9f7f-438c-afcb-529178b5fa9a\") " pod="openstack/nova-metadata-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.670712 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-ovsdbserver-sb\") pod \"dnsmasq-dns-7cc76f8d79-b9r9k\" (UID: \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\") " pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.671001 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-dns-svc\") pod \"dnsmasq-dns-7cc76f8d79-b9r9k\" (UID: \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\") " pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.671084 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-dns-swift-storage-0\") pod \"dnsmasq-dns-7cc76f8d79-b9r9k\" (UID: \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\") " pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.671115 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdf9n\" (UniqueName: \"kubernetes.io/projected/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-kube-api-access-fdf9n\") pod \"dnsmasq-dns-7cc76f8d79-b9r9k\" (UID: \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\") " pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.671143 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-ovsdbserver-nb\") pod \"dnsmasq-dns-7cc76f8d79-b9r9k\" (UID: \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\") " pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.671192 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-config\") pod \"dnsmasq-dns-7cc76f8d79-b9r9k\" (UID: \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\") " 
pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.671961 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-dns-swift-storage-0\") pod \"dnsmasq-dns-7cc76f8d79-b9r9k\" (UID: \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\") " pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.672063 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-dns-svc\") pod \"dnsmasq-dns-7cc76f8d79-b9r9k\" (UID: \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\") " pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.672629 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-ovsdbserver-sb\") pod \"dnsmasq-dns-7cc76f8d79-b9r9k\" (UID: \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\") " pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.672929 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-config\") pod \"dnsmasq-dns-7cc76f8d79-b9r9k\" (UID: \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\") " pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.675487 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-ovsdbserver-nb\") pod \"dnsmasq-dns-7cc76f8d79-b9r9k\" (UID: \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\") " pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.690538 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdf9n\" (UniqueName: \"kubernetes.io/projected/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-kube-api-access-fdf9n\") pod \"dnsmasq-dns-7cc76f8d79-b9r9k\" (UID: \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\") " pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.775982 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.781261 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.812856 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" Feb 19 15:31:21 crc kubenswrapper[4810]: I0219 15:31:21.943561 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-k7bkw"] Feb 19 15:31:22 crc kubenswrapper[4810]: I0219 15:31:22.061930 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hk2fs"] Feb 19 15:31:22 crc kubenswrapper[4810]: I0219 15:31:22.067553 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hk2fs" Feb 19 15:31:22 crc kubenswrapper[4810]: I0219 15:31:22.072056 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 19 15:31:22 crc kubenswrapper[4810]: I0219 15:31:22.073219 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 19 15:31:22 crc kubenswrapper[4810]: I0219 15:31:22.076772 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hk2fs"] Feb 19 15:31:22 crc kubenswrapper[4810]: I0219 15:31:22.109483 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpxgr\" (UniqueName: \"kubernetes.io/projected/5f1a5ee7-3792-4f35-a967-80fb96c7df10-kube-api-access-qpxgr\") pod \"nova-cell1-conductor-db-sync-hk2fs\" (UID: \"5f1a5ee7-3792-4f35-a967-80fb96c7df10\") " pod="openstack/nova-cell1-conductor-db-sync-hk2fs" Feb 19 15:31:22 crc kubenswrapper[4810]: I0219 15:31:22.110071 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f1a5ee7-3792-4f35-a967-80fb96c7df10-config-data\") pod \"nova-cell1-conductor-db-sync-hk2fs\" (UID: \"5f1a5ee7-3792-4f35-a967-80fb96c7df10\") " pod="openstack/nova-cell1-conductor-db-sync-hk2fs" Feb 19 15:31:22 crc kubenswrapper[4810]: I0219 15:31:22.110163 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f1a5ee7-3792-4f35-a967-80fb96c7df10-scripts\") pod \"nova-cell1-conductor-db-sync-hk2fs\" (UID: \"5f1a5ee7-3792-4f35-a967-80fb96c7df10\") " pod="openstack/nova-cell1-conductor-db-sync-hk2fs" Feb 19 15:31:22 crc kubenswrapper[4810]: I0219 15:31:22.110399 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1a5ee7-3792-4f35-a967-80fb96c7df10-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hk2fs\" (UID: \"5f1a5ee7-3792-4f35-a967-80fb96c7df10\") " pod="openstack/nova-cell1-conductor-db-sync-hk2fs" Feb 19 15:31:22 crc kubenswrapper[4810]: I0219 15:31:22.136898 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 15:31:22 crc kubenswrapper[4810]: I0219 15:31:22.157611 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d9e6d16b-a7c2-4a73-866e-6e068e910d82","Type":"ContainerStarted","Data":"8358d2e845bdbf47646ae6969a74848954fb0b3cc77ce5979954030d2a57fe33"} Feb 19 15:31:22 crc kubenswrapper[4810]: I0219 15:31:22.159462 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-k7bkw" event={"ID":"299a53ac-e7e5-47a3-bf65-df5624b77717","Type":"ContainerStarted","Data":"692e978a354f5e17fbfce0a777f2418f1242f2cdbcdb95bec183a7615ee2fc90"} Feb 19 15:31:22 crc kubenswrapper[4810]: I0219 15:31:22.214845 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpxgr\" (UniqueName: \"kubernetes.io/projected/5f1a5ee7-3792-4f35-a967-80fb96c7df10-kube-api-access-qpxgr\") pod \"nova-cell1-conductor-db-sync-hk2fs\" (UID: \"5f1a5ee7-3792-4f35-a967-80fb96c7df10\") " pod="openstack/nova-cell1-conductor-db-sync-hk2fs" Feb 19 15:31:22 crc kubenswrapper[4810]: I0219 15:31:22.215627 4810 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f1a5ee7-3792-4f35-a967-80fb96c7df10-scripts\") pod \"nova-cell1-conductor-db-sync-hk2fs\" (UID: \"5f1a5ee7-3792-4f35-a967-80fb96c7df10\") " pod="openstack/nova-cell1-conductor-db-sync-hk2fs" Feb 19 15:31:22 crc kubenswrapper[4810]: I0219 15:31:22.215648 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f1a5ee7-3792-4f35-a967-80fb96c7df10-config-data\") pod \"nova-cell1-conductor-db-sync-hk2fs\" (UID: \"5f1a5ee7-3792-4f35-a967-80fb96c7df10\") " pod="openstack/nova-cell1-conductor-db-sync-hk2fs" Feb 19 15:31:22 crc kubenswrapper[4810]: I0219 15:31:22.215716 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1a5ee7-3792-4f35-a967-80fb96c7df10-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hk2fs\" (UID: \"5f1a5ee7-3792-4f35-a967-80fb96c7df10\") " pod="openstack/nova-cell1-conductor-db-sync-hk2fs" Feb 19 15:31:22 crc kubenswrapper[4810]: I0219 15:31:22.216290 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 15:31:22 crc kubenswrapper[4810]: I0219 15:31:22.231665 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f1a5ee7-3792-4f35-a967-80fb96c7df10-scripts\") pod \"nova-cell1-conductor-db-sync-hk2fs\" (UID: \"5f1a5ee7-3792-4f35-a967-80fb96c7df10\") " pod="openstack/nova-cell1-conductor-db-sync-hk2fs" Feb 19 15:31:22 crc kubenswrapper[4810]: I0219 15:31:22.232171 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1a5ee7-3792-4f35-a967-80fb96c7df10-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hk2fs\" (UID: \"5f1a5ee7-3792-4f35-a967-80fb96c7df10\") " pod="openstack/nova-cell1-conductor-db-sync-hk2fs" Feb 19 15:31:22 crc kubenswrapper[4810]: I0219 15:31:22.232617 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f1a5ee7-3792-4f35-a967-80fb96c7df10-config-data\") pod \"nova-cell1-conductor-db-sync-hk2fs\" (UID: \"5f1a5ee7-3792-4f35-a967-80fb96c7df10\") " pod="openstack/nova-cell1-conductor-db-sync-hk2fs" Feb 19 15:31:22 crc kubenswrapper[4810]: I0219 15:31:22.236539 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpxgr\" (UniqueName: \"kubernetes.io/projected/5f1a5ee7-3792-4f35-a967-80fb96c7df10-kube-api-access-qpxgr\") pod \"nova-cell1-conductor-db-sync-hk2fs\" (UID: \"5f1a5ee7-3792-4f35-a967-80fb96c7df10\") " pod="openstack/nova-cell1-conductor-db-sync-hk2fs" Feb 19 15:31:22 crc kubenswrapper[4810]: I0219 15:31:22.391228 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 15:31:22 crc kubenswrapper[4810]: I0219 15:31:22.499048 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hk2fs" Feb 19 15:31:22 crc kubenswrapper[4810]: I0219 15:31:22.531130 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 15:31:22 crc kubenswrapper[4810]: I0219 15:31:22.554802 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cc76f8d79-b9r9k"] Feb 19 15:31:22 crc kubenswrapper[4810]: W0219 15:31:22.564128 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd13abd15_5b9b_4e00_984f_9dabbe51ddbc.slice/crio-3782f704a1deb319907b16927aac1f196131400ef56fc6f3b4943495c1de3d0b WatchSource:0}: Error finding container 3782f704a1deb319907b16927aac1f196131400ef56fc6f3b4943495c1de3d0b: Status 404 returned error can't find the container with id 3782f704a1deb319907b16927aac1f196131400ef56fc6f3b4943495c1de3d0b Feb 19 15:31:23 crc kubenswrapper[4810]: I0219 15:31:23.009547 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hk2fs"] Feb 19 15:31:23 crc kubenswrapper[4810]: I0219 15:31:23.178271 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-k7bkw" event={"ID":"299a53ac-e7e5-47a3-bf65-df5624b77717","Type":"ContainerStarted","Data":"9c8648b58dedd6b14f6832bd1d2f895ecfd4e781a2433a653d4f48b76efb9fef"} Feb 19 15:31:23 crc kubenswrapper[4810]: I0219 15:31:23.182442 4810 generic.go:334] "Generic (PLEG): container finished" podID="d13abd15-5b9b-4e00-984f-9dabbe51ddbc" containerID="f762a8738d3f47e401158c999773d4f19a8cf6bd6c7936ab82cb0c741248ad3e" exitCode=0 Feb 19 15:31:23 crc kubenswrapper[4810]: I0219 15:31:23.183228 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" event={"ID":"d13abd15-5b9b-4e00-984f-9dabbe51ddbc","Type":"ContainerDied","Data":"f762a8738d3f47e401158c999773d4f19a8cf6bd6c7936ab82cb0c741248ad3e"} Feb 19 15:31:23 crc kubenswrapper[4810]: I0219 15:31:23.183252 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" event={"ID":"d13abd15-5b9b-4e00-984f-9dabbe51ddbc","Type":"ContainerStarted","Data":"3782f704a1deb319907b16927aac1f196131400ef56fc6f3b4943495c1de3d0b"} Feb 19 15:31:23 crc kubenswrapper[4810]: I0219 15:31:23.186231 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b4c8d587-b429-415f-96f6-628924fed084","Type":"ContainerStarted","Data":"4774e9328b610c917f9fd35141fcadf466b7543107fa862abe860e7744c56cb8"} Feb 19 15:31:23 crc kubenswrapper[4810]: I0219 15:31:23.187727 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1e8674be-9f7f-438c-afcb-529178b5fa9a","Type":"ContainerStarted","Data":"a598cb61cea96142cc131cec1371fcd736e3d9f078e8978d34c8ba18fa043df1"} Feb 19 15:31:23 crc kubenswrapper[4810]: I0219 15:31:23.203645 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8841a6af-789a-4dd9-81ed-3afc45b255e4","Type":"ContainerStarted","Data":"edb4da08df823fefbaa5fd91c5229d2a05bf23c6e464beaac0148e5443f3fbaf"} Feb 19 15:31:23 crc kubenswrapper[4810]: I0219 15:31:23.204435 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-k7bkw" podStartSLOduration=3.20441259 podStartE2EDuration="3.20441259s" podCreationTimestamp="2026-02-19 15:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:31:23.198565321 +0000 UTC m=+1312.680595445" watchObservedRunningTime="2026-02-19 15:31:23.20441259 +0000 UTC m=+1312.686442724" Feb 19 15:31:24 crc kubenswrapper[4810]: W0219 15:31:24.081398 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f1a5ee7_3792_4f35_a967_80fb96c7df10.slice/crio-901147ab356b116732cf4aeab1e9b5ea5ce5785575b20bc7bda82db966cdb603 WatchSource:0}: Error finding container 901147ab356b116732cf4aeab1e9b5ea5ce5785575b20bc7bda82db966cdb603: Status 404 returned error can't find the container with id 901147ab356b116732cf4aeab1e9b5ea5ce5785575b20bc7bda82db966cdb603 Feb 19 15:31:24 crc kubenswrapper[4810]: I0219 15:31:24.221970 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hk2fs" event={"ID":"5f1a5ee7-3792-4f35-a967-80fb96c7df10","Type":"ContainerStarted","Data":"901147ab356b116732cf4aeab1e9b5ea5ce5785575b20bc7bda82db966cdb603"} Feb 19 15:31:24 crc kubenswrapper[4810]: I0219 15:31:24.674866 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 15:31:24 crc kubenswrapper[4810]: I0219 15:31:24.733154 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 15:31:26 crc kubenswrapper[4810]: I0219 15:31:26.318992 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b4c8d587-b429-415f-96f6-628924fed084","Type":"ContainerStarted","Data":"b7e5ea663f35522cf345ee4992587f93bc18717e336dd09c7687290a99a11aa3"} Feb 19 15:31:26 crc kubenswrapper[4810]: I0219 15:31:26.321175 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hk2fs" event={"ID":"5f1a5ee7-3792-4f35-a967-80fb96c7df10","Type":"ContainerStarted","Data":"45859d708bbdd95af868748506ae358c82e96df75fa08cfe41661e0323e54c01"} Feb 19 15:31:26 crc kubenswrapper[4810]: I0219 15:31:26.319051 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b4c8d587-b429-415f-96f6-628924fed084" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://b7e5ea663f35522cf345ee4992587f93bc18717e336dd09c7687290a99a11aa3" gracePeriod=30 Feb 19 15:31:26 crc kubenswrapper[4810]: I0219 15:31:26.327510 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1e8674be-9f7f-438c-afcb-529178b5fa9a","Type":"ContainerStarted","Data":"67d7b1b2fe05fdf05040100d731bde7bfd4f1ee47f3a5b6f3aa77c49f45ebc0c"} Feb 19 15:31:26 crc kubenswrapper[4810]: I0219 15:31:26.327570 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1e8674be-9f7f-438c-afcb-529178b5fa9a","Type":"ContainerStarted","Data":"bf2b72419abc5b9d1dc324265338d9686d9c4dd72a0204301ce8a03dc9ab3fd8"} Feb 19 15:31:26 crc kubenswrapper[4810]: I0219 15:31:26.327619 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1e8674be-9f7f-438c-afcb-529178b5fa9a" containerName="nova-metadata-log" containerID="cri-o://bf2b72419abc5b9d1dc324265338d9686d9c4dd72a0204301ce8a03dc9ab3fd8" gracePeriod=30 Feb 19 15:31:26 crc kubenswrapper[4810]: I0219 15:31:26.327651 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="1e8674be-9f7f-438c-afcb-529178b5fa9a" containerName="nova-metadata-metadata" containerID="cri-o://67d7b1b2fe05fdf05040100d731bde7bfd4f1ee47f3a5b6f3aa77c49f45ebc0c" gracePeriod=30 Feb 19 15:31:26 crc kubenswrapper[4810]: I0219 15:31:26.334682 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8841a6af-789a-4dd9-81ed-3afc45b255e4","Type":"ContainerStarted","Data":"c77557ffc93a6c8ce5aa2b9530c80c11dbadbea58a331e6c0ce538bfc266ec9b"} Feb 19 15:31:26 crc kubenswrapper[4810]: I0219 15:31:26.334737 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8841a6af-789a-4dd9-81ed-3afc45b255e4","Type":"ContainerStarted","Data":"926de79f3f2ffb9fa64a65d58129228f7afa6cf976838a6de05084ca0e762fe7"} Feb 19 15:31:26 crc kubenswrapper[4810]: I0219 15:31:26.342952 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d9e6d16b-a7c2-4a73-866e-6e068e910d82","Type":"ContainerStarted","Data":"6b9c2cfe7570d6ebf3ef1994a77aa29a1ffc94791dbe2f685217ccc99624a14f"} Feb 19 15:31:26 crc kubenswrapper[4810]: I0219 15:31:26.352460 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" event={"ID":"d13abd15-5b9b-4e00-984f-9dabbe51ddbc","Type":"ContainerStarted","Data":"968b59a59d30b2d92d2973ce319917f2d174a7454d4003dff6b2c557a24c3a76"} Feb 19 15:31:26 crc kubenswrapper[4810]: I0219 15:31:26.352803 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" Feb 19 15:31:26 crc kubenswrapper[4810]: I0219 15:31:26.368120 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.530871095 podStartE2EDuration="5.368104209s" podCreationTimestamp="2026-02-19 15:31:21 +0000 UTC" firstStartedPulling="2026-02-19 15:31:22.231168255 +0000 UTC m=+1311.713198379" lastFinishedPulling="2026-02-19 15:31:25.068401369 +0000 UTC m=+1314.550431493" observedRunningTime="2026-02-19 15:31:26.359653614 +0000 UTC m=+1315.841683738" watchObservedRunningTime="2026-02-19 15:31:26.368104209 +0000 UTC m=+1315.850134333" Feb 19 15:31:26 crc kubenswrapper[4810]: I0219 15:31:26.377100 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.87216789 podStartE2EDuration="5.377080918s" podCreationTimestamp="2026-02-19 15:31:21 +0000 UTC" firstStartedPulling="2026-02-19 15:31:22.561583523 +0000 UTC m=+1312.043613647" lastFinishedPulling="2026-02-19 15:31:25.066496551 +0000 UTC m=+1314.548526675" observedRunningTime="2026-02-19 15:31:26.374787439 +0000 UTC m=+1315.856817593" watchObservedRunningTime="2026-02-19 15:31:26.377080918 +0000 UTC m=+1315.859111052" Feb 19 15:31:26 crc kubenswrapper[4810]: I0219 15:31:26.409806 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.748686288 podStartE2EDuration="5.409789s" podCreationTimestamp="2026-02-19 15:31:21 +0000 UTC" firstStartedPulling="2026-02-19 15:31:22.406603159 +0000 UTC m=+1311.888633283" lastFinishedPulling="2026-02-19 15:31:25.067705871 +0000 UTC m=+1314.549735995" observedRunningTime="2026-02-19 15:31:26.404086405 +0000 UTC m=+1315.886116529" watchObservedRunningTime="2026-02-19 15:31:26.409789 +0000 UTC m=+1315.891819134" Feb 19 15:31:26 crc kubenswrapper[4810]: I0219 15:31:26.433777 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-scheduler-0" podStartSLOduration=2.5172332280000003 podStartE2EDuration="5.43376291s" podCreationTimestamp="2026-02-19 15:31:21 +0000 UTC" firstStartedPulling="2026-02-19 15:31:22.140027576 +0000 UTC m=+1311.622057700" lastFinishedPulling="2026-02-19 15:31:25.056557258 +0000 UTC m=+1314.538587382" observedRunningTime="2026-02-19 15:31:26.430589339 +0000 UTC m=+1315.912619463" watchObservedRunningTime="2026-02-19 15:31:26.43376291 +0000 UTC m=+1315.915793034" Feb 19 15:31:26 crc kubenswrapper[4810]: I0219 15:31:26.472529 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-hk2fs" podStartSLOduration=4.472506926 podStartE2EDuration="4.472506926s" podCreationTimestamp="2026-02-19 15:31:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:31:26.470101345 +0000 UTC m=+1315.952131479" watchObservedRunningTime="2026-02-19 15:31:26.472506926 +0000 UTC m=+1315.954537050" Feb 19 15:31:26 crc kubenswrapper[4810]: I0219 15:31:26.479308 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" podStartSLOduration=5.479291638 podStartE2EDuration="5.479291638s" podCreationTimestamp="2026-02-19 15:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:31:26.456650282 +0000 UTC m=+1315.938680426" watchObservedRunningTime="2026-02-19 15:31:26.479291638 +0000 UTC m=+1315.961321762" Feb 19 15:31:26 crc kubenswrapper[4810]: I0219 15:31:26.497819 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 15:31:26 crc kubenswrapper[4810]: I0219 15:31:26.586976 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:26 crc kubenswrapper[4810]: I0219 15:31:26.781823 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 15:31:26 crc kubenswrapper[4810]: I0219 15:31:26.781880 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 15:31:27 crc kubenswrapper[4810]: I0219 15:31:27.371144 4810 generic.go:334] "Generic (PLEG): container finished" podID="1e8674be-9f7f-438c-afcb-529178b5fa9a" containerID="67d7b1b2fe05fdf05040100d731bde7bfd4f1ee47f3a5b6f3aa77c49f45ebc0c" exitCode=0 Feb 19 15:31:27 crc kubenswrapper[4810]: I0219 15:31:27.371474 4810 generic.go:334] "Generic (PLEG): container finished" podID="1e8674be-9f7f-438c-afcb-529178b5fa9a" containerID="bf2b72419abc5b9d1dc324265338d9686d9c4dd72a0204301ce8a03dc9ab3fd8" exitCode=143 Feb 19 15:31:27 crc kubenswrapper[4810]: I0219 15:31:27.371218 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1e8674be-9f7f-438c-afcb-529178b5fa9a","Type":"ContainerDied","Data":"67d7b1b2fe05fdf05040100d731bde7bfd4f1ee47f3a5b6f3aa77c49f45ebc0c"} Feb 19 15:31:27 crc kubenswrapper[4810]: I0219 15:31:27.372092 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1e8674be-9f7f-438c-afcb-529178b5fa9a","Type":"ContainerDied","Data":"bf2b72419abc5b9d1dc324265338d9686d9c4dd72a0204301ce8a03dc9ab3fd8"} Feb 19 15:31:27 crc kubenswrapper[4810]: I0219 15:31:27.372114 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"1e8674be-9f7f-438c-afcb-529178b5fa9a","Type":"ContainerDied","Data":"a598cb61cea96142cc131cec1371fcd736e3d9f078e8978d34c8ba18fa043df1"} Feb 19 15:31:27 crc kubenswrapper[4810]: I0219 15:31:27.372126 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a598cb61cea96142cc131cec1371fcd736e3d9f078e8978d34c8ba18fa043df1" Feb 19 15:31:27 crc kubenswrapper[4810]: I0219 15:31:27.498628 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 15:31:27 crc kubenswrapper[4810]: I0219 15:31:27.558543 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2szh6\" (UniqueName: \"kubernetes.io/projected/1e8674be-9f7f-438c-afcb-529178b5fa9a-kube-api-access-2szh6\") pod \"1e8674be-9f7f-438c-afcb-529178b5fa9a\" (UID: \"1e8674be-9f7f-438c-afcb-529178b5fa9a\") " Feb 19 15:31:27 crc kubenswrapper[4810]: I0219 15:31:27.558719 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e8674be-9f7f-438c-afcb-529178b5fa9a-combined-ca-bundle\") pod \"1e8674be-9f7f-438c-afcb-529178b5fa9a\" (UID: \"1e8674be-9f7f-438c-afcb-529178b5fa9a\") " Feb 19 15:31:27 crc kubenswrapper[4810]: I0219 15:31:27.558851 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e8674be-9f7f-438c-afcb-529178b5fa9a-logs\") pod \"1e8674be-9f7f-438c-afcb-529178b5fa9a\" (UID: \"1e8674be-9f7f-438c-afcb-529178b5fa9a\") " Feb 19 15:31:27 crc kubenswrapper[4810]: I0219 15:31:27.558906 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e8674be-9f7f-438c-afcb-529178b5fa9a-config-data\") pod \"1e8674be-9f7f-438c-afcb-529178b5fa9a\" (UID: \"1e8674be-9f7f-438c-afcb-529178b5fa9a\") " Feb 19 15:31:27 crc kubenswrapper[4810]: I0219 15:31:27.559278 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e8674be-9f7f-438c-afcb-529178b5fa9a-logs" (OuterVolumeSpecName: "logs") pod "1e8674be-9f7f-438c-afcb-529178b5fa9a" (UID: "1e8674be-9f7f-438c-afcb-529178b5fa9a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:31:27 crc kubenswrapper[4810]: I0219 15:31:27.559829 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e8674be-9f7f-438c-afcb-529178b5fa9a-logs\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:27 crc kubenswrapper[4810]: I0219 15:31:27.565190 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e8674be-9f7f-438c-afcb-529178b5fa9a-kube-api-access-2szh6" (OuterVolumeSpecName: "kube-api-access-2szh6") pod "1e8674be-9f7f-438c-afcb-529178b5fa9a" (UID: "1e8674be-9f7f-438c-afcb-529178b5fa9a"). InnerVolumeSpecName "kube-api-access-2szh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:31:27 crc kubenswrapper[4810]: I0219 15:31:27.598434 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e8674be-9f7f-438c-afcb-529178b5fa9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e8674be-9f7f-438c-afcb-529178b5fa9a" (UID: "1e8674be-9f7f-438c-afcb-529178b5fa9a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:27 crc kubenswrapper[4810]: I0219 15:31:27.600987 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e8674be-9f7f-438c-afcb-529178b5fa9a-config-data" (OuterVolumeSpecName: "config-data") pod "1e8674be-9f7f-438c-afcb-529178b5fa9a" (UID: "1e8674be-9f7f-438c-afcb-529178b5fa9a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:27 crc kubenswrapper[4810]: I0219 15:31:27.661679 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e8674be-9f7f-438c-afcb-529178b5fa9a-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:27 crc kubenswrapper[4810]: I0219 15:31:27.661721 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2szh6\" (UniqueName: \"kubernetes.io/projected/1e8674be-9f7f-438c-afcb-529178b5fa9a-kube-api-access-2szh6\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:27 crc kubenswrapper[4810]: I0219 15:31:27.661733 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e8674be-9f7f-438c-afcb-529178b5fa9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.387103 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.464231 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.488156 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.497467 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 15:31:28 crc kubenswrapper[4810]: E0219 15:31:28.497976 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e8674be-9f7f-438c-afcb-529178b5fa9a" containerName="nova-metadata-log" Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.498000 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e8674be-9f7f-438c-afcb-529178b5fa9a" containerName="nova-metadata-log" Feb 19 15:31:28 crc kubenswrapper[4810]: E0219 15:31:28.498038 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e8674be-9f7f-438c-afcb-529178b5fa9a" containerName="nova-metadata-metadata" Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.498048 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e8674be-9f7f-438c-afcb-529178b5fa9a" containerName="nova-metadata-metadata" Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.498273 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e8674be-9f7f-438c-afcb-529178b5fa9a" containerName="nova-metadata-metadata" Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.498305 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e8674be-9f7f-438c-afcb-529178b5fa9a" containerName="nova-metadata-log" Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.499593 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.502892 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.505890 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.542595 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.580840 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbpsq\" (UniqueName: \"kubernetes.io/projected/975e5394-f9a1-428e-90e9-6e1ea9c757e1-kube-api-access-pbpsq\") pod \"nova-metadata-0\" (UID: \"975e5394-f9a1-428e-90e9-6e1ea9c757e1\") " pod="openstack/nova-metadata-0" Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.581142 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/975e5394-f9a1-428e-90e9-6e1ea9c757e1-logs\") pod \"nova-metadata-0\" (UID: \"975e5394-f9a1-428e-90e9-6e1ea9c757e1\") " pod="openstack/nova-metadata-0" Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.581188 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/975e5394-f9a1-428e-90e9-6e1ea9c757e1-config-data\") pod \"nova-metadata-0\" (UID: \"975e5394-f9a1-428e-90e9-6e1ea9c757e1\") " pod="openstack/nova-metadata-0" Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.581480 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/975e5394-f9a1-428e-90e9-6e1ea9c757e1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"975e5394-f9a1-428e-90e9-6e1ea9c757e1\") " pod="openstack/nova-metadata-0" Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.581652 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/975e5394-f9a1-428e-90e9-6e1ea9c757e1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"975e5394-f9a1-428e-90e9-6e1ea9c757e1\") " pod="openstack/nova-metadata-0" Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.683191 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/975e5394-f9a1-428e-90e9-6e1ea9c757e1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"975e5394-f9a1-428e-90e9-6e1ea9c757e1\") " pod="openstack/nova-metadata-0" Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.683292 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/975e5394-f9a1-428e-90e9-6e1ea9c757e1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"975e5394-f9a1-428e-90e9-6e1ea9c757e1\") " pod="openstack/nova-metadata-0" Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.683379 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbpsq\" (UniqueName: \"kubernetes.io/projected/975e5394-f9a1-428e-90e9-6e1ea9c757e1-kube-api-access-pbpsq\") pod \"nova-metadata-0\" (UID: 
\"975e5394-f9a1-428e-90e9-6e1ea9c757e1\") " pod="openstack/nova-metadata-0" Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.683425 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/975e5394-f9a1-428e-90e9-6e1ea9c757e1-logs\") pod \"nova-metadata-0\" (UID: \"975e5394-f9a1-428e-90e9-6e1ea9c757e1\") " pod="openstack/nova-metadata-0" Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.683466 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/975e5394-f9a1-428e-90e9-6e1ea9c757e1-config-data\") pod \"nova-metadata-0\" (UID: \"975e5394-f9a1-428e-90e9-6e1ea9c757e1\") " pod="openstack/nova-metadata-0" Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.684959 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/975e5394-f9a1-428e-90e9-6e1ea9c757e1-logs\") pod \"nova-metadata-0\" (UID: \"975e5394-f9a1-428e-90e9-6e1ea9c757e1\") " pod="openstack/nova-metadata-0" Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.689280 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/975e5394-f9a1-428e-90e9-6e1ea9c757e1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"975e5394-f9a1-428e-90e9-6e1ea9c757e1\") " pod="openstack/nova-metadata-0" Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.690188 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/975e5394-f9a1-428e-90e9-6e1ea9c757e1-config-data\") pod \"nova-metadata-0\" (UID: \"975e5394-f9a1-428e-90e9-6e1ea9c757e1\") " pod="openstack/nova-metadata-0" Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.690437 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/975e5394-f9a1-428e-90e9-6e1ea9c757e1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"975e5394-f9a1-428e-90e9-6e1ea9c757e1\") " pod="openstack/nova-metadata-0" Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.708396 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbpsq\" (UniqueName: \"kubernetes.io/projected/975e5394-f9a1-428e-90e9-6e1ea9c757e1-kube-api-access-pbpsq\") pod \"nova-metadata-0\" (UID: \"975e5394-f9a1-428e-90e9-6e1ea9c757e1\") " pod="openstack/nova-metadata-0" Feb 19 15:31:28 crc kubenswrapper[4810]: I0219 15:31:28.827078 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 15:31:29 crc kubenswrapper[4810]: I0219 15:31:29.347281 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 15:31:29 crc kubenswrapper[4810]: I0219 15:31:29.405443 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"975e5394-f9a1-428e-90e9-6e1ea9c757e1","Type":"ContainerStarted","Data":"0be93d0eb552143c9c43a40adef0cc77ef54b17f48e67effc28863e9557f11b4"} Feb 19 15:31:29 crc kubenswrapper[4810]: I0219 15:31:29.450550 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e8674be-9f7f-438c-afcb-529178b5fa9a" path="/var/lib/kubelet/pods/1e8674be-9f7f-438c-afcb-529178b5fa9a/volumes" Feb 19 15:31:30 crc kubenswrapper[4810]: I0219 15:31:30.431434 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"975e5394-f9a1-428e-90e9-6e1ea9c757e1","Type":"ContainerStarted","Data":"9c0c8232f43851bc134a132b71a1fe65041c1ed090af3645c3b5f0de510160c6"} Feb 19 15:31:30 crc kubenswrapper[4810]: I0219 15:31:30.431774 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"975e5394-f9a1-428e-90e9-6e1ea9c757e1","Type":"ContainerStarted","Data":"f66cc085b0b58fa637f1e108fa5e008ecf9029e272faad0d2e511d170502fca9"} Feb 19 15:31:30 crc kubenswrapper[4810]: I0219 15:31:30.456479 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.456457717 podStartE2EDuration="2.456457717s" podCreationTimestamp="2026-02-19 15:31:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:31:30.455438501 +0000 UTC m=+1319.937468625" watchObservedRunningTime="2026-02-19 15:31:30.456457717 +0000 UTC m=+1319.938487841" Feb 19 15:31:31 crc kubenswrapper[4810]: I0219 15:31:31.456045 4810 generic.go:334] "Generic (PLEG): container finished" podID="299a53ac-e7e5-47a3-bf65-df5624b77717" containerID="9c8648b58dedd6b14f6832bd1d2f895ecfd4e781a2433a653d4f48b76efb9fef" exitCode=0 Feb 19 15:31:31 crc kubenswrapper[4810]: I0219 15:31:31.459517 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-k7bkw" event={"ID":"299a53ac-e7e5-47a3-bf65-df5624b77717","Type":"ContainerDied","Data":"9c8648b58dedd6b14f6832bd1d2f895ecfd4e781a2433a653d4f48b76efb9fef"} Feb 19 15:31:31 crc kubenswrapper[4810]: I0219 15:31:31.498613 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 19 15:31:31 crc kubenswrapper[4810]: I0219 15:31:31.526885 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 19 15:31:31 crc kubenswrapper[4810]: I0219 15:31:31.777400 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 15:31:31 crc kubenswrapper[4810]: I0219 15:31:31.777439 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 15:31:31 crc kubenswrapper[4810]: I0219 15:31:31.815060 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" Feb 19 15:31:31 crc kubenswrapper[4810]: I0219 15:31:31.883956 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c58b86477-9tbw7"] Feb 19 15:31:31 crc kubenswrapper[4810]: 
I0219 15:31:31.884520 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" podUID="4a48946e-058c-4395-bbad-5effb50b2228" containerName="dnsmasq-dns" containerID="cri-o://c249e977cefa0135aa004a3d9624e2b5787cc21f20239233e79602a670cf0acb" gracePeriod=10 Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.482047 4810 generic.go:334] "Generic (PLEG): container finished" podID="4a48946e-058c-4395-bbad-5effb50b2228" containerID="c249e977cefa0135aa004a3d9624e2b5787cc21f20239233e79602a670cf0acb" exitCode=0 Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.482694 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" event={"ID":"4a48946e-058c-4395-bbad-5effb50b2228","Type":"ContainerDied","Data":"c249e977cefa0135aa004a3d9624e2b5787cc21f20239233e79602a670cf0acb"} Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.482721 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" event={"ID":"4a48946e-058c-4395-bbad-5effb50b2228","Type":"ContainerDied","Data":"bec8edc98672f19301835694ed5c49318c10a4c5634dfa9bc2728f6b7541a7a3"} Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.482732 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bec8edc98672f19301835694ed5c49318c10a4c5634dfa9bc2728f6b7541a7a3" Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.488151 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.534082 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.567313 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-dns-swift-storage-0\") pod \"4a48946e-058c-4395-bbad-5effb50b2228\" (UID: \"4a48946e-058c-4395-bbad-5effb50b2228\") " Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.567472 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-ovsdbserver-sb\") pod \"4a48946e-058c-4395-bbad-5effb50b2228\" (UID: \"4a48946e-058c-4395-bbad-5effb50b2228\") " Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.567503 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-ovsdbserver-nb\") pod \"4a48946e-058c-4395-bbad-5effb50b2228\" (UID: \"4a48946e-058c-4395-bbad-5effb50b2228\") " Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.567578 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-dns-svc\") pod \"4a48946e-058c-4395-bbad-5effb50b2228\" (UID: \"4a48946e-058c-4395-bbad-5effb50b2228\") " Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.567632 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2hfc\" (UniqueName: \"kubernetes.io/projected/4a48946e-058c-4395-bbad-5effb50b2228-kube-api-access-v2hfc\") pod \"4a48946e-058c-4395-bbad-5effb50b2228\" (UID: \"4a48946e-058c-4395-bbad-5effb50b2228\") 
" Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.567659 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-config\") pod \"4a48946e-058c-4395-bbad-5effb50b2228\" (UID: \"4a48946e-058c-4395-bbad-5effb50b2228\") " Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.618290 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a48946e-058c-4395-bbad-5effb50b2228-kube-api-access-v2hfc" (OuterVolumeSpecName: "kube-api-access-v2hfc") pod "4a48946e-058c-4395-bbad-5effb50b2228" (UID: "4a48946e-058c-4395-bbad-5effb50b2228"). InnerVolumeSpecName "kube-api-access-v2hfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.646862 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4a48946e-058c-4395-bbad-5effb50b2228" (UID: "4a48946e-058c-4395-bbad-5effb50b2228"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.646878 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4a48946e-058c-4395-bbad-5effb50b2228" (UID: "4a48946e-058c-4395-bbad-5effb50b2228"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.649272 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4a48946e-058c-4395-bbad-5effb50b2228" (UID: "4a48946e-058c-4395-bbad-5effb50b2228"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.672008 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.672038 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.672047 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.672057 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2hfc\" (UniqueName: \"kubernetes.io/projected/4a48946e-058c-4395-bbad-5effb50b2228-kube-api-access-v2hfc\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.681126 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4a48946e-058c-4395-bbad-5effb50b2228" (UID: "4a48946e-058c-4395-bbad-5effb50b2228"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.687362 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-config" (OuterVolumeSpecName: "config") pod "4a48946e-058c-4395-bbad-5effb50b2228" (UID: "4a48946e-058c-4395-bbad-5effb50b2228"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.774054 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.774387 4810 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a48946e-058c-4395-bbad-5effb50b2228-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.816736 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-k7bkw" Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.859604 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8841a6af-789a-4dd9-81ed-3afc45b255e4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.215:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.859645 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8841a6af-789a-4dd9-81ed-3afc45b255e4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.215:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.876191 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/299a53ac-e7e5-47a3-bf65-df5624b77717-config-data\") pod \"299a53ac-e7e5-47a3-bf65-df5624b77717\" (UID: \"299a53ac-e7e5-47a3-bf65-df5624b77717\") " Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.876256 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/299a53ac-e7e5-47a3-bf65-df5624b77717-combined-ca-bundle\") pod \"299a53ac-e7e5-47a3-bf65-df5624b77717\" (UID: \"299a53ac-e7e5-47a3-bf65-df5624b77717\") " Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.876418 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/299a53ac-e7e5-47a3-bf65-df5624b77717-scripts\") pod \"299a53ac-e7e5-47a3-bf65-df5624b77717\" (UID: \"299a53ac-e7e5-47a3-bf65-df5624b77717\") " Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.876507 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhsjr\" (UniqueName: \"kubernetes.io/projected/299a53ac-e7e5-47a3-bf65-df5624b77717-kube-api-access-zhsjr\") pod \"299a53ac-e7e5-47a3-bf65-df5624b77717\" (UID: \"299a53ac-e7e5-47a3-bf65-df5624b77717\") " Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.880800 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/299a53ac-e7e5-47a3-bf65-df5624b77717-kube-api-access-zhsjr" (OuterVolumeSpecName: "kube-api-access-zhsjr") pod "299a53ac-e7e5-47a3-bf65-df5624b77717" (UID: "299a53ac-e7e5-47a3-bf65-df5624b77717"). InnerVolumeSpecName "kube-api-access-zhsjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.880966 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/299a53ac-e7e5-47a3-bf65-df5624b77717-scripts" (OuterVolumeSpecName: "scripts") pod "299a53ac-e7e5-47a3-bf65-df5624b77717" (UID: "299a53ac-e7e5-47a3-bf65-df5624b77717"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.907036 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/299a53ac-e7e5-47a3-bf65-df5624b77717-config-data" (OuterVolumeSpecName: "config-data") pod "299a53ac-e7e5-47a3-bf65-df5624b77717" (UID: "299a53ac-e7e5-47a3-bf65-df5624b77717"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.914663 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/299a53ac-e7e5-47a3-bf65-df5624b77717-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "299a53ac-e7e5-47a3-bf65-df5624b77717" (UID: "299a53ac-e7e5-47a3-bf65-df5624b77717"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.979736 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhsjr\" (UniqueName: \"kubernetes.io/projected/299a53ac-e7e5-47a3-bf65-df5624b77717-kube-api-access-zhsjr\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.979773 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/299a53ac-e7e5-47a3-bf65-df5624b77717-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.979783 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/299a53ac-e7e5-47a3-bf65-df5624b77717-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:32 crc kubenswrapper[4810]: I0219 15:31:32.979792 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/299a53ac-e7e5-47a3-bf65-df5624b77717-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:33 crc kubenswrapper[4810]: I0219 15:31:33.494936 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-k7bkw" event={"ID":"299a53ac-e7e5-47a3-bf65-df5624b77717","Type":"ContainerDied","Data":"692e978a354f5e17fbfce0a777f2418f1242f2cdbcdb95bec183a7615ee2fc90"} Feb 19 15:31:33 crc kubenswrapper[4810]: I0219 15:31:33.494983 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="692e978a354f5e17fbfce0a777f2418f1242f2cdbcdb95bec183a7615ee2fc90" Feb 19 15:31:33 crc kubenswrapper[4810]: I0219 15:31:33.495060 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-k7bkw" Feb 19 15:31:33 crc kubenswrapper[4810]: I0219 15:31:33.495100 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c58b86477-9tbw7" Feb 19 15:31:33 crc kubenswrapper[4810]: I0219 15:31:33.522502 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c58b86477-9tbw7"] Feb 19 15:31:33 crc kubenswrapper[4810]: I0219 15:31:33.558153 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c58b86477-9tbw7"] Feb 19 15:31:33 crc kubenswrapper[4810]: I0219 15:31:33.685612 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 15:31:33 crc kubenswrapper[4810]: I0219 15:31:33.688333 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8841a6af-789a-4dd9-81ed-3afc45b255e4" containerName="nova-api-log" containerID="cri-o://926de79f3f2ffb9fa64a65d58129228f7afa6cf976838a6de05084ca0e762fe7" gracePeriod=30 Feb 19 15:31:33 crc kubenswrapper[4810]: I0219 15:31:33.689044 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8841a6af-789a-4dd9-81ed-3afc45b255e4" containerName="nova-api-api" containerID="cri-o://c77557ffc93a6c8ce5aa2b9530c80c11dbadbea58a331e6c0ce538bfc266ec9b" gracePeriod=30 Feb 19 15:31:33 crc kubenswrapper[4810]: I0219 15:31:33.705047 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 15:31:33 crc kubenswrapper[4810]: I0219 15:31:33.705357 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="975e5394-f9a1-428e-90e9-6e1ea9c757e1" containerName="nova-metadata-log" containerID="cri-o://f66cc085b0b58fa637f1e108fa5e008ecf9029e272faad0d2e511d170502fca9" gracePeriod=30 Feb 19 15:31:33 crc kubenswrapper[4810]: I0219 15:31:33.705798 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="975e5394-f9a1-428e-90e9-6e1ea9c757e1" containerName="nova-metadata-metadata" containerID="cri-o://9c0c8232f43851bc134a132b71a1fe65041c1ed090af3645c3b5f0de510160c6" gracePeriod=30 Feb 19 15:31:33 crc kubenswrapper[4810]: I0219 15:31:33.718208 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 15:31:33 crc kubenswrapper[4810]: I0219 15:31:33.827825 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 15:31:33 crc kubenswrapper[4810]: I0219 15:31:33.827863 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.256293 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.303525 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/975e5394-f9a1-428e-90e9-6e1ea9c757e1-combined-ca-bundle\") pod \"975e5394-f9a1-428e-90e9-6e1ea9c757e1\" (UID: \"975e5394-f9a1-428e-90e9-6e1ea9c757e1\") " Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.303640 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbpsq\" (UniqueName: \"kubernetes.io/projected/975e5394-f9a1-428e-90e9-6e1ea9c757e1-kube-api-access-pbpsq\") pod \"975e5394-f9a1-428e-90e9-6e1ea9c757e1\" (UID: \"975e5394-f9a1-428e-90e9-6e1ea9c757e1\") " Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.303726 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/975e5394-f9a1-428e-90e9-6e1ea9c757e1-nova-metadata-tls-certs\") pod \"975e5394-f9a1-428e-90e9-6e1ea9c757e1\" (UID: \"975e5394-f9a1-428e-90e9-6e1ea9c757e1\") " Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.303760 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/975e5394-f9a1-428e-90e9-6e1ea9c757e1-logs\") pod \"975e5394-f9a1-428e-90e9-6e1ea9c757e1\" (UID: \"975e5394-f9a1-428e-90e9-6e1ea9c757e1\") " Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.303804 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/975e5394-f9a1-428e-90e9-6e1ea9c757e1-config-data\") pod \"975e5394-f9a1-428e-90e9-6e1ea9c757e1\" (UID: \"975e5394-f9a1-428e-90e9-6e1ea9c757e1\") " Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.305347 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/975e5394-f9a1-428e-90e9-6e1ea9c757e1-logs" (OuterVolumeSpecName: "logs") pod "975e5394-f9a1-428e-90e9-6e1ea9c757e1" (UID: "975e5394-f9a1-428e-90e9-6e1ea9c757e1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.308799 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/975e5394-f9a1-428e-90e9-6e1ea9c757e1-kube-api-access-pbpsq" (OuterVolumeSpecName: "kube-api-access-pbpsq") pod "975e5394-f9a1-428e-90e9-6e1ea9c757e1" (UID: "975e5394-f9a1-428e-90e9-6e1ea9c757e1"). InnerVolumeSpecName "kube-api-access-pbpsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.336094 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/975e5394-f9a1-428e-90e9-6e1ea9c757e1-config-data" (OuterVolumeSpecName: "config-data") pod "975e5394-f9a1-428e-90e9-6e1ea9c757e1" (UID: "975e5394-f9a1-428e-90e9-6e1ea9c757e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.354275 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/975e5394-f9a1-428e-90e9-6e1ea9c757e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "975e5394-f9a1-428e-90e9-6e1ea9c757e1" (UID: "975e5394-f9a1-428e-90e9-6e1ea9c757e1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.369081 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/975e5394-f9a1-428e-90e9-6e1ea9c757e1-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "975e5394-f9a1-428e-90e9-6e1ea9c757e1" (UID: "975e5394-f9a1-428e-90e9-6e1ea9c757e1"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.406780 4810 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/975e5394-f9a1-428e-90e9-6e1ea9c757e1-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.406816 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/975e5394-f9a1-428e-90e9-6e1ea9c757e1-logs\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.406831 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/975e5394-f9a1-428e-90e9-6e1ea9c757e1-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.406843 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/975e5394-f9a1-428e-90e9-6e1ea9c757e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.406854 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbpsq\" (UniqueName: \"kubernetes.io/projected/975e5394-f9a1-428e-90e9-6e1ea9c757e1-kube-api-access-pbpsq\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.512111 4810 generic.go:334] "Generic (PLEG): container finished" podID="8841a6af-789a-4dd9-81ed-3afc45b255e4" containerID="926de79f3f2ffb9fa64a65d58129228f7afa6cf976838a6de05084ca0e762fe7" exitCode=143 Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.512182 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8841a6af-789a-4dd9-81ed-3afc45b255e4","Type":"ContainerDied","Data":"926de79f3f2ffb9fa64a65d58129228f7afa6cf976838a6de05084ca0e762fe7"} Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.514029 4810 generic.go:334] "Generic (PLEG): container finished" podID="975e5394-f9a1-428e-90e9-6e1ea9c757e1" containerID="9c0c8232f43851bc134a132b71a1fe65041c1ed090af3645c3b5f0de510160c6" exitCode=0 Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.514054 4810 generic.go:334] "Generic (PLEG): container finished" podID="975e5394-f9a1-428e-90e9-6e1ea9c757e1" containerID="f66cc085b0b58fa637f1e108fa5e008ecf9029e272faad0d2e511d170502fca9" exitCode=143 Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.514086 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"975e5394-f9a1-428e-90e9-6e1ea9c757e1","Type":"ContainerDied","Data":"9c0c8232f43851bc134a132b71a1fe65041c1ed090af3645c3b5f0de510160c6"} Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.514104 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"975e5394-f9a1-428e-90e9-6e1ea9c757e1","Type":"ContainerDied","Data":"f66cc085b0b58fa637f1e108fa5e008ecf9029e272faad0d2e511d170502fca9"} Feb 19 15:31:34 crc 
kubenswrapper[4810]: I0219 15:31:34.514116 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"975e5394-f9a1-428e-90e9-6e1ea9c757e1","Type":"ContainerDied","Data":"0be93d0eb552143c9c43a40adef0cc77ef54b17f48e67effc28863e9557f11b4"} Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.514133 4810 scope.go:117] "RemoveContainer" containerID="9c0c8232f43851bc134a132b71a1fe65041c1ed090af3645c3b5f0de510160c6" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.514272 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.519551 4810 generic.go:334] "Generic (PLEG): container finished" podID="5f1a5ee7-3792-4f35-a967-80fb96c7df10" containerID="45859d708bbdd95af868748506ae358c82e96df75fa08cfe41661e0323e54c01" exitCode=0 Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.519629 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hk2fs" event={"ID":"5f1a5ee7-3792-4f35-a967-80fb96c7df10","Type":"ContainerDied","Data":"45859d708bbdd95af868748506ae358c82e96df75fa08cfe41661e0323e54c01"} Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.519739 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d9e6d16b-a7c2-4a73-866e-6e068e910d82" containerName="nova-scheduler-scheduler" containerID="cri-o://6b9c2cfe7570d6ebf3ef1994a77aa29a1ffc94791dbe2f685217ccc99624a14f" gracePeriod=30 Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.549967 4810 scope.go:117] "RemoveContainer" containerID="f66cc085b0b58fa637f1e108fa5e008ecf9029e272faad0d2e511d170502fca9" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.572271 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.581130 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.586500 4810 scope.go:117] "RemoveContainer" containerID="9c0c8232f43851bc134a132b71a1fe65041c1ed090af3645c3b5f0de510160c6" Feb 19 15:31:34 crc kubenswrapper[4810]: E0219 15:31:34.587043 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c0c8232f43851bc134a132b71a1fe65041c1ed090af3645c3b5f0de510160c6\": container with ID starting with 9c0c8232f43851bc134a132b71a1fe65041c1ed090af3645c3b5f0de510160c6 not found: ID does not exist" containerID="9c0c8232f43851bc134a132b71a1fe65041c1ed090af3645c3b5f0de510160c6" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.587098 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c0c8232f43851bc134a132b71a1fe65041c1ed090af3645c3b5f0de510160c6"} err="failed to get container status \"9c0c8232f43851bc134a132b71a1fe65041c1ed090af3645c3b5f0de510160c6\": rpc error: code = NotFound desc = could not find container \"9c0c8232f43851bc134a132b71a1fe65041c1ed090af3645c3b5f0de510160c6\": container with ID starting with 9c0c8232f43851bc134a132b71a1fe65041c1ed090af3645c3b5f0de510160c6 not found: ID does not exist" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.587132 4810 scope.go:117] "RemoveContainer" containerID="f66cc085b0b58fa637f1e108fa5e008ecf9029e272faad0d2e511d170502fca9" Feb 19 15:31:34 crc kubenswrapper[4810]: E0219 15:31:34.587614 4810 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f66cc085b0b58fa637f1e108fa5e008ecf9029e272faad0d2e511d170502fca9\": container with ID starting with f66cc085b0b58fa637f1e108fa5e008ecf9029e272faad0d2e511d170502fca9 not found: ID does not exist" containerID="f66cc085b0b58fa637f1e108fa5e008ecf9029e272faad0d2e511d170502fca9" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.587635 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f66cc085b0b58fa637f1e108fa5e008ecf9029e272faad0d2e511d170502fca9"} err="failed to get container status \"f66cc085b0b58fa637f1e108fa5e008ecf9029e272faad0d2e511d170502fca9\": rpc error: code = NotFound desc = could not find container \"f66cc085b0b58fa637f1e108fa5e008ecf9029e272faad0d2e511d170502fca9\": container with ID starting with f66cc085b0b58fa637f1e108fa5e008ecf9029e272faad0d2e511d170502fca9 not found: ID does not exist" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.587649 4810 scope.go:117] "RemoveContainer" containerID="9c0c8232f43851bc134a132b71a1fe65041c1ed090af3645c3b5f0de510160c6" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.588074 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c0c8232f43851bc134a132b71a1fe65041c1ed090af3645c3b5f0de510160c6"} err="failed to get container status \"9c0c8232f43851bc134a132b71a1fe65041c1ed090af3645c3b5f0de510160c6\": rpc error: code = NotFound desc = could not find container \"9c0c8232f43851bc134a132b71a1fe65041c1ed090af3645c3b5f0de510160c6\": container with ID starting with 9c0c8232f43851bc134a132b71a1fe65041c1ed090af3645c3b5f0de510160c6 not found: ID does not exist" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.588110 4810 scope.go:117] "RemoveContainer" containerID="f66cc085b0b58fa637f1e108fa5e008ecf9029e272faad0d2e511d170502fca9" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.588364 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f66cc085b0b58fa637f1e108fa5e008ecf9029e272faad0d2e511d170502fca9"} err="failed to get container status \"f66cc085b0b58fa637f1e108fa5e008ecf9029e272faad0d2e511d170502fca9\": rpc error: code = NotFound desc = could not find container \"f66cc085b0b58fa637f1e108fa5e008ecf9029e272faad0d2e511d170502fca9\": container with ID starting with f66cc085b0b58fa637f1e108fa5e008ecf9029e272faad0d2e511d170502fca9 not found: ID does not exist" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.616545 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 15:31:34 crc kubenswrapper[4810]: E0219 15:31:34.616999 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a48946e-058c-4395-bbad-5effb50b2228" containerName="init" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.617015 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a48946e-058c-4395-bbad-5effb50b2228" containerName="init" Feb 19 15:31:34 crc kubenswrapper[4810]: E0219 15:31:34.617030 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a48946e-058c-4395-bbad-5effb50b2228" containerName="dnsmasq-dns" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.617036 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a48946e-058c-4395-bbad-5effb50b2228" containerName="dnsmasq-dns" Feb 19 15:31:34 crc kubenswrapper[4810]: E0219 15:31:34.617048 4810 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="975e5394-f9a1-428e-90e9-6e1ea9c757e1" containerName="nova-metadata-log" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.617054 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="975e5394-f9a1-428e-90e9-6e1ea9c757e1" containerName="nova-metadata-log" Feb 19 15:31:34 crc kubenswrapper[4810]: E0219 15:31:34.617067 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="299a53ac-e7e5-47a3-bf65-df5624b77717" containerName="nova-manage" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.617073 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="299a53ac-e7e5-47a3-bf65-df5624b77717" containerName="nova-manage" Feb 19 15:31:34 crc kubenswrapper[4810]: E0219 15:31:34.617098 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="975e5394-f9a1-428e-90e9-6e1ea9c757e1" containerName="nova-metadata-metadata" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.617104 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="975e5394-f9a1-428e-90e9-6e1ea9c757e1" containerName="nova-metadata-metadata" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.617275 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="975e5394-f9a1-428e-90e9-6e1ea9c757e1" containerName="nova-metadata-metadata" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.617287 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a48946e-058c-4395-bbad-5effb50b2228" containerName="dnsmasq-dns" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.617297 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="975e5394-f9a1-428e-90e9-6e1ea9c757e1" containerName="nova-metadata-log" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.617307 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="299a53ac-e7e5-47a3-bf65-df5624b77717" containerName="nova-manage" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.618383 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.620690 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.621821 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.626835 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.715778 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deba4978-0921-4c62-9452-9a47fe24feb7-config-data\") pod \"nova-metadata-0\" (UID: \"deba4978-0921-4c62-9452-9a47fe24feb7\") " pod="openstack/nova-metadata-0" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.715934 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4dgr\" (UniqueName: \"kubernetes.io/projected/deba4978-0921-4c62-9452-9a47fe24feb7-kube-api-access-d4dgr\") pod \"nova-metadata-0\" (UID: \"deba4978-0921-4c62-9452-9a47fe24feb7\") " pod="openstack/nova-metadata-0" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.716114 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/deba4978-0921-4c62-9452-9a47fe24feb7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"deba4978-0921-4c62-9452-9a47fe24feb7\") " pod="openstack/nova-metadata-0" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.716380 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deba4978-0921-4c62-9452-9a47fe24feb7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"deba4978-0921-4c62-9452-9a47fe24feb7\") " pod="openstack/nova-metadata-0" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.716400 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/deba4978-0921-4c62-9452-9a47fe24feb7-logs\") pod \"nova-metadata-0\" (UID: \"deba4978-0921-4c62-9452-9a47fe24feb7\") " pod="openstack/nova-metadata-0" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.818146 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4dgr\" (UniqueName: \"kubernetes.io/projected/deba4978-0921-4c62-9452-9a47fe24feb7-kube-api-access-d4dgr\") pod \"nova-metadata-0\" (UID: \"deba4978-0921-4c62-9452-9a47fe24feb7\") " pod="openstack/nova-metadata-0" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.818405 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/deba4978-0921-4c62-9452-9a47fe24feb7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"deba4978-0921-4c62-9452-9a47fe24feb7\") " pod="openstack/nova-metadata-0" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.818497 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deba4978-0921-4c62-9452-9a47fe24feb7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"deba4978-0921-4c62-9452-9a47fe24feb7\") " pod="openstack/nova-metadata-0" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.818528 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/deba4978-0921-4c62-9452-9a47fe24feb7-logs\") pod \"nova-metadata-0\" (UID: \"deba4978-0921-4c62-9452-9a47fe24feb7\") " pod="openstack/nova-metadata-0" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.818614 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deba4978-0921-4c62-9452-9a47fe24feb7-config-data\") pod \"nova-metadata-0\" (UID: \"deba4978-0921-4c62-9452-9a47fe24feb7\") " pod="openstack/nova-metadata-0" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.819093 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/deba4978-0921-4c62-9452-9a47fe24feb7-logs\") pod \"nova-metadata-0\" (UID: \"deba4978-0921-4c62-9452-9a47fe24feb7\") " pod="openstack/nova-metadata-0" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.824722 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/deba4978-0921-4c62-9452-9a47fe24feb7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"deba4978-0921-4c62-9452-9a47fe24feb7\") " pod="openstack/nova-metadata-0" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.827181 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deba4978-0921-4c62-9452-9a47fe24feb7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"deba4978-0921-4c62-9452-9a47fe24feb7\") " pod="openstack/nova-metadata-0" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.827494 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deba4978-0921-4c62-9452-9a47fe24feb7-config-data\") pod \"nova-metadata-0\" (UID: \"deba4978-0921-4c62-9452-9a47fe24feb7\") " pod="openstack/nova-metadata-0" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.846584 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4dgr\" (UniqueName: \"kubernetes.io/projected/deba4978-0921-4c62-9452-9a47fe24feb7-kube-api-access-d4dgr\") pod \"nova-metadata-0\" (UID: \"deba4978-0921-4c62-9452-9a47fe24feb7\") " pod="openstack/nova-metadata-0" Feb 19 15:31:34 crc kubenswrapper[4810]: I0219 15:31:34.934116 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 15:31:35 crc kubenswrapper[4810]: I0219 15:31:35.449478 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a48946e-058c-4395-bbad-5effb50b2228" path="/var/lib/kubelet/pods/4a48946e-058c-4395-bbad-5effb50b2228/volumes" Feb 19 15:31:35 crc kubenswrapper[4810]: I0219 15:31:35.450553 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="975e5394-f9a1-428e-90e9-6e1ea9c757e1" path="/var/lib/kubelet/pods/975e5394-f9a1-428e-90e9-6e1ea9c757e1/volumes" Feb 19 15:31:35 crc kubenswrapper[4810]: I0219 15:31:35.484984 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 15:31:35 crc kubenswrapper[4810]: I0219 15:31:35.532152 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"deba4978-0921-4c62-9452-9a47fe24feb7","Type":"ContainerStarted","Data":"9c0cc70373e1f5deebdb6c556b83e6477c32297bae250fc5ee8217259477409e"} Feb 19 15:31:35 crc kubenswrapper[4810]: I0219 15:31:35.848701 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hk2fs" Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.049078 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f1a5ee7-3792-4f35-a967-80fb96c7df10-config-data\") pod \"5f1a5ee7-3792-4f35-a967-80fb96c7df10\" (UID: \"5f1a5ee7-3792-4f35-a967-80fb96c7df10\") " Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.049177 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f1a5ee7-3792-4f35-a967-80fb96c7df10-scripts\") pod \"5f1a5ee7-3792-4f35-a967-80fb96c7df10\" (UID: \"5f1a5ee7-3792-4f35-a967-80fb96c7df10\") " Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.049453 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1a5ee7-3792-4f35-a967-80fb96c7df10-combined-ca-bundle\") pod \"5f1a5ee7-3792-4f35-a967-80fb96c7df10\" (UID: \"5f1a5ee7-3792-4f35-a967-80fb96c7df10\") " Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.049594 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpxgr\" (UniqueName: \"kubernetes.io/projected/5f1a5ee7-3792-4f35-a967-80fb96c7df10-kube-api-access-qpxgr\") pod \"5f1a5ee7-3792-4f35-a967-80fb96c7df10\" (UID: \"5f1a5ee7-3792-4f35-a967-80fb96c7df10\") " Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.053915 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f1a5ee7-3792-4f35-a967-80fb96c7df10-kube-api-access-qpxgr" (OuterVolumeSpecName: "kube-api-access-qpxgr") pod "5f1a5ee7-3792-4f35-a967-80fb96c7df10" (UID: "5f1a5ee7-3792-4f35-a967-80fb96c7df10"). InnerVolumeSpecName "kube-api-access-qpxgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.054049 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f1a5ee7-3792-4f35-a967-80fb96c7df10-scripts" (OuterVolumeSpecName: "scripts") pod "5f1a5ee7-3792-4f35-a967-80fb96c7df10" (UID: "5f1a5ee7-3792-4f35-a967-80fb96c7df10"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.079666 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f1a5ee7-3792-4f35-a967-80fb96c7df10-config-data" (OuterVolumeSpecName: "config-data") pod "5f1a5ee7-3792-4f35-a967-80fb96c7df10" (UID: "5f1a5ee7-3792-4f35-a967-80fb96c7df10"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.094521 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f1a5ee7-3792-4f35-a967-80fb96c7df10-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f1a5ee7-3792-4f35-a967-80fb96c7df10" (UID: "5f1a5ee7-3792-4f35-a967-80fb96c7df10"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.152312 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1a5ee7-3792-4f35-a967-80fb96c7df10-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.152381 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpxgr\" (UniqueName: \"kubernetes.io/projected/5f1a5ee7-3792-4f35-a967-80fb96c7df10-kube-api-access-qpxgr\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.152397 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f1a5ee7-3792-4f35-a967-80fb96c7df10-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.152408 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f1a5ee7-3792-4f35-a967-80fb96c7df10-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:36 crc kubenswrapper[4810]: E0219 15:31:36.499769 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b9c2cfe7570d6ebf3ef1994a77aa29a1ffc94791dbe2f685217ccc99624a14f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 15:31:36 crc kubenswrapper[4810]: E0219 15:31:36.502143 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b9c2cfe7570d6ebf3ef1994a77aa29a1ffc94791dbe2f685217ccc99624a14f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 15:31:36 crc kubenswrapper[4810]: E0219 15:31:36.503844 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b9c2cfe7570d6ebf3ef1994a77aa29a1ffc94791dbe2f685217ccc99624a14f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 15:31:36 crc kubenswrapper[4810]: E0219 15:31:36.503933 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d9e6d16b-a7c2-4a73-866e-6e068e910d82" containerName="nova-scheduler-scheduler" Feb 
Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.547212 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"deba4978-0921-4c62-9452-9a47fe24feb7","Type":"ContainerStarted","Data":"3cd69082a7ac9af779601af9a54ce467c63853d8132eb18f4aa9ff5eb2315f95"}
Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.547260 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"deba4978-0921-4c62-9452-9a47fe24feb7","Type":"ContainerStarted","Data":"acc612b198e2bb735041a74b3bf438f971ed76411bde0e76a3b0dd4d3a920c17"}
Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.560934 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hk2fs" event={"ID":"5f1a5ee7-3792-4f35-a967-80fb96c7df10","Type":"ContainerDied","Data":"901147ab356b116732cf4aeab1e9b5ea5ce5785575b20bc7bda82db966cdb603"}
Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.560983 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="901147ab356b116732cf4aeab1e9b5ea5ce5785575b20bc7bda82db966cdb603"
Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.560986 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hk2fs"
Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.588675 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.58864837 podStartE2EDuration="2.58864837s" podCreationTimestamp="2026-02-19 15:31:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:31:36.584855923 +0000 UTC m=+1326.066886047" watchObservedRunningTime="2026-02-19 15:31:36.58864837 +0000 UTC m=+1326.070678534"
Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.628348 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 19 15:31:36 crc kubenswrapper[4810]: E0219 15:31:36.628804 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f1a5ee7-3792-4f35-a967-80fb96c7df10" containerName="nova-cell1-conductor-db-sync"
Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.628822 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f1a5ee7-3792-4f35-a967-80fb96c7df10" containerName="nova-cell1-conductor-db-sync"
Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.628995 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f1a5ee7-3792-4f35-a967-80fb96c7df10" containerName="nova-cell1-conductor-db-sync"
Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.629658 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
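[annotation] The podStartSLOduration=2.58864837 above is just watchObservedRunningTime minus podCreationTimestamp (the image-pull fields are zero because nothing was pulled); the m=+1326.07 suffix is the monotonic-clock offset since kubelet start. A worked check of the arithmetic:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied from the log entry above.
	layout := "2006-01-02 15:04:05.999999999 -0700 MST"
	created, _ := time.Parse(layout, "2026-02-19 15:31:34 +0000 UTC")
	running, _ := time.Parse(layout, "2026-02-19 15:31:36.58864837 +0000 UTC")
	fmt.Println(running.Sub(created)) // 2.58864837s, matching podStartSLOduration
}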
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.631657 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.638021 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.681172 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93aa728-7924-4a75-ad48-cc174764cf3e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f93aa728-7924-4a75-ad48-cc174764cf3e\") " pod="openstack/nova-cell1-conductor-0" Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.681272 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f93aa728-7924-4a75-ad48-cc174764cf3e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f93aa728-7924-4a75-ad48-cc174764cf3e\") " pod="openstack/nova-cell1-conductor-0" Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.681307 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rlvj\" (UniqueName: \"kubernetes.io/projected/f93aa728-7924-4a75-ad48-cc174764cf3e-kube-api-access-7rlvj\") pod \"nova-cell1-conductor-0\" (UID: \"f93aa728-7924-4a75-ad48-cc174764cf3e\") " pod="openstack/nova-cell1-conductor-0" Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.782841 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93aa728-7924-4a75-ad48-cc174764cf3e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f93aa728-7924-4a75-ad48-cc174764cf3e\") " pod="openstack/nova-cell1-conductor-0" Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.782907 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f93aa728-7924-4a75-ad48-cc174764cf3e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f93aa728-7924-4a75-ad48-cc174764cf3e\") " pod="openstack/nova-cell1-conductor-0" Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.782927 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rlvj\" (UniqueName: \"kubernetes.io/projected/f93aa728-7924-4a75-ad48-cc174764cf3e-kube-api-access-7rlvj\") pod \"nova-cell1-conductor-0\" (UID: \"f93aa728-7924-4a75-ad48-cc174764cf3e\") " pod="openstack/nova-cell1-conductor-0" Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.791241 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f93aa728-7924-4a75-ad48-cc174764cf3e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f93aa728-7924-4a75-ad48-cc174764cf3e\") " pod="openstack/nova-cell1-conductor-0" Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.801851 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93aa728-7924-4a75-ad48-cc174764cf3e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f93aa728-7924-4a75-ad48-cc174764cf3e\") " pod="openstack/nova-cell1-conductor-0" Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.807578 4810 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rlvj\" (UniqueName: \"kubernetes.io/projected/f93aa728-7924-4a75-ad48-cc174764cf3e-kube-api-access-7rlvj\") pod \"nova-cell1-conductor-0\" (UID: \"f93aa728-7924-4a75-ad48-cc174764cf3e\") " pod="openstack/nova-cell1-conductor-0" Feb 19 15:31:36 crc kubenswrapper[4810]: I0219 15:31:36.973660 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.313453 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.398374 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8841a6af-789a-4dd9-81ed-3afc45b255e4-combined-ca-bundle\") pod \"8841a6af-789a-4dd9-81ed-3afc45b255e4\" (UID: \"8841a6af-789a-4dd9-81ed-3afc45b255e4\") " Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.398445 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-945nq\" (UniqueName: \"kubernetes.io/projected/8841a6af-789a-4dd9-81ed-3afc45b255e4-kube-api-access-945nq\") pod \"8841a6af-789a-4dd9-81ed-3afc45b255e4\" (UID: \"8841a6af-789a-4dd9-81ed-3afc45b255e4\") " Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.398488 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8841a6af-789a-4dd9-81ed-3afc45b255e4-config-data\") pod \"8841a6af-789a-4dd9-81ed-3afc45b255e4\" (UID: \"8841a6af-789a-4dd9-81ed-3afc45b255e4\") " Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.398527 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8841a6af-789a-4dd9-81ed-3afc45b255e4-logs\") pod \"8841a6af-789a-4dd9-81ed-3afc45b255e4\" (UID: \"8841a6af-789a-4dd9-81ed-3afc45b255e4\") " Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.399509 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8841a6af-789a-4dd9-81ed-3afc45b255e4-logs" (OuterVolumeSpecName: "logs") pod "8841a6af-789a-4dd9-81ed-3afc45b255e4" (UID: "8841a6af-789a-4dd9-81ed-3afc45b255e4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.405475 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8841a6af-789a-4dd9-81ed-3afc45b255e4-kube-api-access-945nq" (OuterVolumeSpecName: "kube-api-access-945nq") pod "8841a6af-789a-4dd9-81ed-3afc45b255e4" (UID: "8841a6af-789a-4dd9-81ed-3afc45b255e4"). InnerVolumeSpecName "kube-api-access-945nq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.440624 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8841a6af-789a-4dd9-81ed-3afc45b255e4-config-data" (OuterVolumeSpecName: "config-data") pod "8841a6af-789a-4dd9-81ed-3afc45b255e4" (UID: "8841a6af-789a-4dd9-81ed-3afc45b255e4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.456127 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8841a6af-789a-4dd9-81ed-3afc45b255e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8841a6af-789a-4dd9-81ed-3afc45b255e4" (UID: "8841a6af-789a-4dd9-81ed-3afc45b255e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.475117 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.502645 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8841a6af-789a-4dd9-81ed-3afc45b255e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.503014 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-945nq\" (UniqueName: \"kubernetes.io/projected/8841a6af-789a-4dd9-81ed-3afc45b255e4-kube-api-access-945nq\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.503106 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8841a6af-789a-4dd9-81ed-3afc45b255e4-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.503215 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8841a6af-789a-4dd9-81ed-3afc45b255e4-logs\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.570855 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f93aa728-7924-4a75-ad48-cc174764cf3e","Type":"ContainerStarted","Data":"b6afbb2a6ee072828554f7309688075e596e5f0f978b9e24813ca66e6fdef0b9"} Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.572511 4810 generic.go:334] "Generic (PLEG): container finished" podID="8841a6af-789a-4dd9-81ed-3afc45b255e4" containerID="c77557ffc93a6c8ce5aa2b9530c80c11dbadbea58a331e6c0ce538bfc266ec9b" exitCode=0 Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.572567 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.572581 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8841a6af-789a-4dd9-81ed-3afc45b255e4","Type":"ContainerDied","Data":"c77557ffc93a6c8ce5aa2b9530c80c11dbadbea58a331e6c0ce538bfc266ec9b"} Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.572606 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8841a6af-789a-4dd9-81ed-3afc45b255e4","Type":"ContainerDied","Data":"edb4da08df823fefbaa5fd91c5229d2a05bf23c6e464beaac0148e5443f3fbaf"} Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.572624 4810 scope.go:117] "RemoveContainer" containerID="c77557ffc93a6c8ce5aa2b9530c80c11dbadbea58a331e6c0ce538bfc266ec9b" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.576252 4810 generic.go:334] "Generic (PLEG): container finished" podID="d9e6d16b-a7c2-4a73-866e-6e068e910d82" containerID="6b9c2cfe7570d6ebf3ef1994a77aa29a1ffc94791dbe2f685217ccc99624a14f" exitCode=0 Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.576388 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d9e6d16b-a7c2-4a73-866e-6e068e910d82","Type":"ContainerDied","Data":"6b9c2cfe7570d6ebf3ef1994a77aa29a1ffc94791dbe2f685217ccc99624a14f"} Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.601078 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.609832 4810 scope.go:117] "RemoveContainer" containerID="926de79f3f2ffb9fa64a65d58129228f7afa6cf976838a6de05084ca0e762fe7" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.611399 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.635195 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 15:31:37 crc kubenswrapper[4810]: E0219 15:31:37.635798 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8841a6af-789a-4dd9-81ed-3afc45b255e4" containerName="nova-api-api" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.635816 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8841a6af-789a-4dd9-81ed-3afc45b255e4" containerName="nova-api-api" Feb 19 15:31:37 crc kubenswrapper[4810]: E0219 15:31:37.635842 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8841a6af-789a-4dd9-81ed-3afc45b255e4" containerName="nova-api-log" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.635852 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8841a6af-789a-4dd9-81ed-3afc45b255e4" containerName="nova-api-log" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.636081 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8841a6af-789a-4dd9-81ed-3afc45b255e4" containerName="nova-api-log" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.636116 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8841a6af-789a-4dd9-81ed-3afc45b255e4" containerName="nova-api-api" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.637459 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.639737 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.641979 4810 scope.go:117] "RemoveContainer" containerID="c77557ffc93a6c8ce5aa2b9530c80c11dbadbea58a331e6c0ce538bfc266ec9b" Feb 19 15:31:37 crc kubenswrapper[4810]: E0219 15:31:37.642396 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c77557ffc93a6c8ce5aa2b9530c80c11dbadbea58a331e6c0ce538bfc266ec9b\": container with ID starting with c77557ffc93a6c8ce5aa2b9530c80c11dbadbea58a331e6c0ce538bfc266ec9b not found: ID does not exist" containerID="c77557ffc93a6c8ce5aa2b9530c80c11dbadbea58a331e6c0ce538bfc266ec9b" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.642430 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c77557ffc93a6c8ce5aa2b9530c80c11dbadbea58a331e6c0ce538bfc266ec9b"} err="failed to get container status \"c77557ffc93a6c8ce5aa2b9530c80c11dbadbea58a331e6c0ce538bfc266ec9b\": rpc error: code = NotFound desc = could not find container \"c77557ffc93a6c8ce5aa2b9530c80c11dbadbea58a331e6c0ce538bfc266ec9b\": container with ID starting with c77557ffc93a6c8ce5aa2b9530c80c11dbadbea58a331e6c0ce538bfc266ec9b not found: ID does not exist" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.642455 4810 scope.go:117] "RemoveContainer" containerID="926de79f3f2ffb9fa64a65d58129228f7afa6cf976838a6de05084ca0e762fe7" Feb 19 15:31:37 crc kubenswrapper[4810]: E0219 15:31:37.644790 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"926de79f3f2ffb9fa64a65d58129228f7afa6cf976838a6de05084ca0e762fe7\": container with ID starting with 926de79f3f2ffb9fa64a65d58129228f7afa6cf976838a6de05084ca0e762fe7 not found: ID does not exist" containerID="926de79f3f2ffb9fa64a65d58129228f7afa6cf976838a6de05084ca0e762fe7" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.644818 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"926de79f3f2ffb9fa64a65d58129228f7afa6cf976838a6de05084ca0e762fe7"} err="failed to get container status \"926de79f3f2ffb9fa64a65d58129228f7afa6cf976838a6de05084ca0e762fe7\": rpc error: code = NotFound desc = could not find container \"926de79f3f2ffb9fa64a65d58129228f7afa6cf976838a6de05084ca0e762fe7\": container with ID starting with 926de79f3f2ffb9fa64a65d58129228f7afa6cf976838a6de05084ca0e762fe7 not found: ID does not exist" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.651840 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.717190 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db4be167-d56e-491f-851c-c21f30f63112-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"db4be167-d56e-491f-851c-c21f30f63112\") " pod="openstack/nova-api-0" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.717799 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v46gz\" (UniqueName: \"kubernetes.io/projected/db4be167-d56e-491f-851c-c21f30f63112-kube-api-access-v46gz\") pod \"nova-api-0\" (UID: 
\"db4be167-d56e-491f-851c-c21f30f63112\") " pod="openstack/nova-api-0" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.717837 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db4be167-d56e-491f-851c-c21f30f63112-logs\") pod \"nova-api-0\" (UID: \"db4be167-d56e-491f-851c-c21f30f63112\") " pod="openstack/nova-api-0" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.717923 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db4be167-d56e-491f-851c-c21f30f63112-config-data\") pod \"nova-api-0\" (UID: \"db4be167-d56e-491f-851c-c21f30f63112\") " pod="openstack/nova-api-0" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.756066 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.818711 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g5cs\" (UniqueName: \"kubernetes.io/projected/d9e6d16b-a7c2-4a73-866e-6e068e910d82-kube-api-access-7g5cs\") pod \"d9e6d16b-a7c2-4a73-866e-6e068e910d82\" (UID: \"d9e6d16b-a7c2-4a73-866e-6e068e910d82\") " Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.818811 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9e6d16b-a7c2-4a73-866e-6e068e910d82-combined-ca-bundle\") pod \"d9e6d16b-a7c2-4a73-866e-6e068e910d82\" (UID: \"d9e6d16b-a7c2-4a73-866e-6e068e910d82\") " Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.819024 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e6d16b-a7c2-4a73-866e-6e068e910d82-config-data\") pod \"d9e6d16b-a7c2-4a73-866e-6e068e910d82\" (UID: \"d9e6d16b-a7c2-4a73-866e-6e068e910d82\") " Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.819405 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v46gz\" (UniqueName: \"kubernetes.io/projected/db4be167-d56e-491f-851c-c21f30f63112-kube-api-access-v46gz\") pod \"nova-api-0\" (UID: \"db4be167-d56e-491f-851c-c21f30f63112\") " pod="openstack/nova-api-0" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.819453 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db4be167-d56e-491f-851c-c21f30f63112-logs\") pod \"nova-api-0\" (UID: \"db4be167-d56e-491f-851c-c21f30f63112\") " pod="openstack/nova-api-0" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.819525 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db4be167-d56e-491f-851c-c21f30f63112-config-data\") pod \"nova-api-0\" (UID: \"db4be167-d56e-491f-851c-c21f30f63112\") " pod="openstack/nova-api-0" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.819609 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db4be167-d56e-491f-851c-c21f30f63112-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"db4be167-d56e-491f-851c-c21f30f63112\") " pod="openstack/nova-api-0" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.821526 4810 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db4be167-d56e-491f-851c-c21f30f63112-logs\") pod \"nova-api-0\" (UID: \"db4be167-d56e-491f-851c-c21f30f63112\") " pod="openstack/nova-api-0" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.825971 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db4be167-d56e-491f-851c-c21f30f63112-config-data\") pod \"nova-api-0\" (UID: \"db4be167-d56e-491f-851c-c21f30f63112\") " pod="openstack/nova-api-0" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.826202 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db4be167-d56e-491f-851c-c21f30f63112-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"db4be167-d56e-491f-851c-c21f30f63112\") " pod="openstack/nova-api-0" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.826522 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9e6d16b-a7c2-4a73-866e-6e068e910d82-kube-api-access-7g5cs" (OuterVolumeSpecName: "kube-api-access-7g5cs") pod "d9e6d16b-a7c2-4a73-866e-6e068e910d82" (UID: "d9e6d16b-a7c2-4a73-866e-6e068e910d82"). InnerVolumeSpecName "kube-api-access-7g5cs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.837285 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v46gz\" (UniqueName: \"kubernetes.io/projected/db4be167-d56e-491f-851c-c21f30f63112-kube-api-access-v46gz\") pod \"nova-api-0\" (UID: \"db4be167-d56e-491f-851c-c21f30f63112\") " pod="openstack/nova-api-0" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.849255 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9e6d16b-a7c2-4a73-866e-6e068e910d82-config-data" (OuterVolumeSpecName: "config-data") pod "d9e6d16b-a7c2-4a73-866e-6e068e910d82" (UID: "d9e6d16b-a7c2-4a73-866e-6e068e910d82"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.849884 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9e6d16b-a7c2-4a73-866e-6e068e910d82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9e6d16b-a7c2-4a73-866e-6e068e910d82" (UID: "d9e6d16b-a7c2-4a73-866e-6e068e910d82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.920773 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e6d16b-a7c2-4a73-866e-6e068e910d82-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.920833 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g5cs\" (UniqueName: \"kubernetes.io/projected/d9e6d16b-a7c2-4a73-866e-6e068e910d82-kube-api-access-7g5cs\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.920843 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9e6d16b-a7c2-4a73-866e-6e068e910d82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:37 crc kubenswrapper[4810]: I0219 15:31:37.956467 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 15:31:38 crc kubenswrapper[4810]: I0219 15:31:38.427039 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 15:31:38 crc kubenswrapper[4810]: I0219 15:31:38.591604 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f93aa728-7924-4a75-ad48-cc174764cf3e","Type":"ContainerStarted","Data":"d781fd3f88e4222668b7ea2ba1070d00a14ad4ff9ae94d33e848586b5dfa2f84"} Feb 19 15:31:38 crc kubenswrapper[4810]: I0219 15:31:38.591760 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 19 15:31:38 crc kubenswrapper[4810]: I0219 15:31:38.598543 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db4be167-d56e-491f-851c-c21f30f63112","Type":"ContainerStarted","Data":"6b9772e1210b523a5eae56618cf676698a333f36d8eea10d60634c0ac175381c"} Feb 19 15:31:38 crc kubenswrapper[4810]: I0219 15:31:38.601342 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d9e6d16b-a7c2-4a73-866e-6e068e910d82","Type":"ContainerDied","Data":"8358d2e845bdbf47646ae6969a74848954fb0b3cc77ce5979954030d2a57fe33"} Feb 19 15:31:38 crc kubenswrapper[4810]: I0219 15:31:38.601396 4810 scope.go:117] "RemoveContainer" containerID="6b9c2cfe7570d6ebf3ef1994a77aa29a1ffc94791dbe2f685217ccc99624a14f" Feb 19 15:31:38 crc kubenswrapper[4810]: I0219 15:31:38.601409 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 15:31:38 crc kubenswrapper[4810]: I0219 15:31:38.615162 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.615128764 podStartE2EDuration="2.615128764s" podCreationTimestamp="2026-02-19 15:31:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:31:38.613563194 +0000 UTC m=+1328.095593428" watchObservedRunningTime="2026-02-19 15:31:38.615128764 +0000 UTC m=+1328.097158888" Feb 19 15:31:38 crc kubenswrapper[4810]: I0219 15:31:38.669437 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 15:31:38 crc kubenswrapper[4810]: I0219 15:31:38.704392 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 15:31:38 crc kubenswrapper[4810]: I0219 15:31:38.715544 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 15:31:38 crc kubenswrapper[4810]: E0219 15:31:38.716187 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9e6d16b-a7c2-4a73-866e-6e068e910d82" containerName="nova-scheduler-scheduler" Feb 19 15:31:38 crc kubenswrapper[4810]: I0219 15:31:38.716271 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9e6d16b-a7c2-4a73-866e-6e068e910d82" containerName="nova-scheduler-scheduler" Feb 19 15:31:38 crc kubenswrapper[4810]: I0219 15:31:38.716526 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9e6d16b-a7c2-4a73-866e-6e068e910d82" containerName="nova-scheduler-scheduler" Feb 19 15:31:38 crc kubenswrapper[4810]: I0219 15:31:38.717275 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 15:31:38 crc kubenswrapper[4810]: I0219 15:31:38.721075 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 15:31:38 crc kubenswrapper[4810]: I0219 15:31:38.736932 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 15:31:38 crc kubenswrapper[4810]: I0219 15:31:38.771863 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9776a876-19db-446c-a7bf-d6fe0111d7b8-config-data\") pod \"nova-scheduler-0\" (UID: \"9776a876-19db-446c-a7bf-d6fe0111d7b8\") " pod="openstack/nova-scheduler-0" Feb 19 15:31:38 crc kubenswrapper[4810]: I0219 15:31:38.771900 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9776a876-19db-446c-a7bf-d6fe0111d7b8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9776a876-19db-446c-a7bf-d6fe0111d7b8\") " pod="openstack/nova-scheduler-0" Feb 19 15:31:38 crc kubenswrapper[4810]: I0219 15:31:38.772046 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4rpb\" (UniqueName: \"kubernetes.io/projected/9776a876-19db-446c-a7bf-d6fe0111d7b8-kube-api-access-f4rpb\") pod \"nova-scheduler-0\" (UID: \"9776a876-19db-446c-a7bf-d6fe0111d7b8\") " pod="openstack/nova-scheduler-0" Feb 19 15:31:38 crc kubenswrapper[4810]: I0219 15:31:38.874222 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9776a876-19db-446c-a7bf-d6fe0111d7b8-config-data\") pod \"nova-scheduler-0\" (UID: \"9776a876-19db-446c-a7bf-d6fe0111d7b8\") " pod="openstack/nova-scheduler-0" Feb 19 15:31:38 crc kubenswrapper[4810]: I0219 15:31:38.874291 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9776a876-19db-446c-a7bf-d6fe0111d7b8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9776a876-19db-446c-a7bf-d6fe0111d7b8\") " pod="openstack/nova-scheduler-0" Feb 19 15:31:38 crc kubenswrapper[4810]: I0219 15:31:38.874489 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4rpb\" (UniqueName: \"kubernetes.io/projected/9776a876-19db-446c-a7bf-d6fe0111d7b8-kube-api-access-f4rpb\") pod \"nova-scheduler-0\" (UID: \"9776a876-19db-446c-a7bf-d6fe0111d7b8\") " pod="openstack/nova-scheduler-0" Feb 19 15:31:38 crc kubenswrapper[4810]: I0219 15:31:38.878732 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9776a876-19db-446c-a7bf-d6fe0111d7b8-config-data\") pod \"nova-scheduler-0\" (UID: \"9776a876-19db-446c-a7bf-d6fe0111d7b8\") " pod="openstack/nova-scheduler-0" Feb 19 15:31:38 crc kubenswrapper[4810]: I0219 15:31:38.879651 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9776a876-19db-446c-a7bf-d6fe0111d7b8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9776a876-19db-446c-a7bf-d6fe0111d7b8\") " pod="openstack/nova-scheduler-0" Feb 19 15:31:38 crc kubenswrapper[4810]: I0219 15:31:38.895366 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4rpb\" (UniqueName: 
\"kubernetes.io/projected/9776a876-19db-446c-a7bf-d6fe0111d7b8-kube-api-access-f4rpb\") pod \"nova-scheduler-0\" (UID: \"9776a876-19db-446c-a7bf-d6fe0111d7b8\") " pod="openstack/nova-scheduler-0" Feb 19 15:31:39 crc kubenswrapper[4810]: I0219 15:31:39.195085 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 15:31:39 crc kubenswrapper[4810]: I0219 15:31:39.452307 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8841a6af-789a-4dd9-81ed-3afc45b255e4" path="/var/lib/kubelet/pods/8841a6af-789a-4dd9-81ed-3afc45b255e4/volumes" Feb 19 15:31:39 crc kubenswrapper[4810]: I0219 15:31:39.453121 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9e6d16b-a7c2-4a73-866e-6e068e910d82" path="/var/lib/kubelet/pods/d9e6d16b-a7c2-4a73-866e-6e068e910d82/volumes" Feb 19 15:31:39 crc kubenswrapper[4810]: I0219 15:31:39.614907 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db4be167-d56e-491f-851c-c21f30f63112","Type":"ContainerStarted","Data":"c92ab671905f1460e2a7dccb331259f6a59c52684a2e0df2a88b275d79bf21c4"} Feb 19 15:31:39 crc kubenswrapper[4810]: I0219 15:31:39.614947 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db4be167-d56e-491f-851c-c21f30f63112","Type":"ContainerStarted","Data":"f3afc915b44f3696b28ac2a3768f5a557a2b647fe608482478f7471cf5736c24"} Feb 19 15:31:39 crc kubenswrapper[4810]: I0219 15:31:39.647403 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 15:31:39 crc kubenswrapper[4810]: I0219 15:31:39.653870 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.653857323 podStartE2EDuration="2.653857323s" podCreationTimestamp="2026-02-19 15:31:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:31:39.64113708 +0000 UTC m=+1329.123167204" watchObservedRunningTime="2026-02-19 15:31:39.653857323 +0000 UTC m=+1329.135887447" Feb 19 15:31:39 crc kubenswrapper[4810]: I0219 15:31:39.934804 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 15:31:39 crc kubenswrapper[4810]: I0219 15:31:39.935177 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 15:31:40 crc kubenswrapper[4810]: I0219 15:31:40.623270 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9776a876-19db-446c-a7bf-d6fe0111d7b8","Type":"ContainerStarted","Data":"9050cf4664528bc43d0a9883f3bb240ba2cdcad9a8907febfcebd4f66322d6a0"} Feb 19 15:31:40 crc kubenswrapper[4810]: I0219 15:31:40.623317 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9776a876-19db-446c-a7bf-d6fe0111d7b8","Type":"ContainerStarted","Data":"e26af5d090a2b6422428d4303d703f7d22f7771910d627d6bbd17faed1d8906a"} Feb 19 15:31:40 crc kubenswrapper[4810]: I0219 15:31:40.651350 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.651310574 podStartE2EDuration="2.651310574s" podCreationTimestamp="2026-02-19 15:31:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:31:40.645307581 +0000 UTC 
m=+1330.127337705" watchObservedRunningTime="2026-02-19 15:31:40.651310574 +0000 UTC m=+1330.133340698" Feb 19 15:31:40 crc kubenswrapper[4810]: I0219 15:31:40.728833 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 19 15:31:44 crc kubenswrapper[4810]: I0219 15:31:44.195939 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 15:31:44 crc kubenswrapper[4810]: I0219 15:31:44.389712 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 15:31:44 crc kubenswrapper[4810]: I0219 15:31:44.389925 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="097bc4d1-5648-4607-9c49-286e4bbbe553" containerName="kube-state-metrics" containerID="cri-o://35654a2e76fc1f65be05a171e2aeec58c5e73e3b78c5850da9115db247aae94f" gracePeriod=30 Feb 19 15:31:44 crc kubenswrapper[4810]: I0219 15:31:44.678682 4810 generic.go:334] "Generic (PLEG): container finished" podID="097bc4d1-5648-4607-9c49-286e4bbbe553" containerID="35654a2e76fc1f65be05a171e2aeec58c5e73e3b78c5850da9115db247aae94f" exitCode=2 Feb 19 15:31:44 crc kubenswrapper[4810]: I0219 15:31:44.678720 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"097bc4d1-5648-4607-9c49-286e4bbbe553","Type":"ContainerDied","Data":"35654a2e76fc1f65be05a171e2aeec58c5e73e3b78c5850da9115db247aae94f"} Feb 19 15:31:44 crc kubenswrapper[4810]: I0219 15:31:44.900746 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 15:31:44 crc kubenswrapper[4810]: I0219 15:31:44.934428 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 15:31:44 crc kubenswrapper[4810]: I0219 15:31:44.934481 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.011419 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdrqf\" (UniqueName: \"kubernetes.io/projected/097bc4d1-5648-4607-9c49-286e4bbbe553-kube-api-access-mdrqf\") pod \"097bc4d1-5648-4607-9c49-286e4bbbe553\" (UID: \"097bc4d1-5648-4607-9c49-286e4bbbe553\") " Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.018444 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/097bc4d1-5648-4607-9c49-286e4bbbe553-kube-api-access-mdrqf" (OuterVolumeSpecName: "kube-api-access-mdrqf") pod "097bc4d1-5648-4607-9c49-286e4bbbe553" (UID: "097bc4d1-5648-4607-9c49-286e4bbbe553"). InnerVolumeSpecName "kube-api-access-mdrqf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.113866 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdrqf\" (UniqueName: \"kubernetes.io/projected/097bc4d1-5648-4607-9c49-286e4bbbe553-kube-api-access-mdrqf\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.689098 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"097bc4d1-5648-4607-9c49-286e4bbbe553","Type":"ContainerDied","Data":"7d6bd84bead9eb4536dd357117c11ae4d96e35d9a18c032d07c64f477200a6eb"} Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.689153 4810 scope.go:117] "RemoveContainer" containerID="35654a2e76fc1f65be05a171e2aeec58c5e73e3b78c5850da9115db247aae94f" Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.689303 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.708748 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.724488 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.740024 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 15:31:45 crc kubenswrapper[4810]: E0219 15:31:45.740904 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="097bc4d1-5648-4607-9c49-286e4bbbe553" containerName="kube-state-metrics" Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.740932 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="097bc4d1-5648-4607-9c49-286e4bbbe553" containerName="kube-state-metrics" Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.741270 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="097bc4d1-5648-4607-9c49-286e4bbbe553" containerName="kube-state-metrics" Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.742528 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.744832 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.746244 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.750868 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.827200 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/9358dbee-2e5b-432d-98e0-6945d2e0d44b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"9358dbee-2e5b-432d-98e0-6945d2e0d44b\") " pod="openstack/kube-state-metrics-0" Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.827244 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqzx2\" (UniqueName: \"kubernetes.io/projected/9358dbee-2e5b-432d-98e0-6945d2e0d44b-kube-api-access-gqzx2\") pod \"kube-state-metrics-0\" (UID: \"9358dbee-2e5b-432d-98e0-6945d2e0d44b\") " pod="openstack/kube-state-metrics-0" Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.827343 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/9358dbee-2e5b-432d-98e0-6945d2e0d44b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"9358dbee-2e5b-432d-98e0-6945d2e0d44b\") " pod="openstack/kube-state-metrics-0" Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.827373 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9358dbee-2e5b-432d-98e0-6945d2e0d44b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"9358dbee-2e5b-432d-98e0-6945d2e0d44b\") " pod="openstack/kube-state-metrics-0" Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.929485 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/9358dbee-2e5b-432d-98e0-6945d2e0d44b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"9358dbee-2e5b-432d-98e0-6945d2e0d44b\") " pod="openstack/kube-state-metrics-0" Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.929874 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqzx2\" (UniqueName: \"kubernetes.io/projected/9358dbee-2e5b-432d-98e0-6945d2e0d44b-kube-api-access-gqzx2\") pod \"kube-state-metrics-0\" (UID: \"9358dbee-2e5b-432d-98e0-6945d2e0d44b\") " pod="openstack/kube-state-metrics-0" Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.930006 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/9358dbee-2e5b-432d-98e0-6945d2e0d44b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"9358dbee-2e5b-432d-98e0-6945d2e0d44b\") " pod="openstack/kube-state-metrics-0" Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.930035 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/9358dbee-2e5b-432d-98e0-6945d2e0d44b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"9358dbee-2e5b-432d-98e0-6945d2e0d44b\") " pod="openstack/kube-state-metrics-0" Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.935937 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/9358dbee-2e5b-432d-98e0-6945d2e0d44b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"9358dbee-2e5b-432d-98e0-6945d2e0d44b\") " pod="openstack/kube-state-metrics-0" Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.936085 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9358dbee-2e5b-432d-98e0-6945d2e0d44b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"9358dbee-2e5b-432d-98e0-6945d2e0d44b\") " pod="openstack/kube-state-metrics-0" Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.945951 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/9358dbee-2e5b-432d-98e0-6945d2e0d44b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"9358dbee-2e5b-432d-98e0-6945d2e0d44b\") " pod="openstack/kube-state-metrics-0" Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.946484 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="deba4978-0921-4c62-9452-9a47fe24feb7" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.220:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.946446 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="deba4978-0921-4c62-9452-9a47fe24feb7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.220:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 15:31:45 crc kubenswrapper[4810]: I0219 15:31:45.948671 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqzx2\" (UniqueName: \"kubernetes.io/projected/9358dbee-2e5b-432d-98e0-6945d2e0d44b-kube-api-access-gqzx2\") pod \"kube-state-metrics-0\" (UID: \"9358dbee-2e5b-432d-98e0-6945d2e0d44b\") " pod="openstack/kube-state-metrics-0" Feb 19 15:31:46 crc kubenswrapper[4810]: I0219 15:31:46.058593 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 15:31:46 crc kubenswrapper[4810]: I0219 15:31:46.540702 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:31:46 crc kubenswrapper[4810]: I0219 15:31:46.541229 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="af3ec395-6313-4094-9597-b52da27a0d7e" containerName="sg-core" containerID="cri-o://4925cc21f9c39a46d677b62b982c7df70c5e45eb8c42ab96787c52123262f91d" gracePeriod=30 Feb 19 15:31:46 crc kubenswrapper[4810]: I0219 15:31:46.541229 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="af3ec395-6313-4094-9597-b52da27a0d7e" containerName="ceilometer-notification-agent" containerID="cri-o://308ad8f8b8e7c54aece6c4e1cc25e71fd1252432f00bfbdb0df4940372379f98" gracePeriod=30 Feb 19 15:31:46 crc kubenswrapper[4810]: I0219 15:31:46.541229 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="af3ec395-6313-4094-9597-b52da27a0d7e" containerName="proxy-httpd" containerID="cri-o://2adc7f5bee46c61926054ab4e65ab8e1825b109d167cc80842a20a1750f29ed6" gracePeriod=30 Feb 19 15:31:46 crc kubenswrapper[4810]: I0219 15:31:46.541092 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="af3ec395-6313-4094-9597-b52da27a0d7e" containerName="ceilometer-central-agent" containerID="cri-o://6cec0822ad581a03d19d641977e9a7d547ef54d70d04ff9e2ccd6f292e9245c7" gracePeriod=30 Feb 19 15:31:46 crc kubenswrapper[4810]: W0219 15:31:46.580137 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9358dbee_2e5b_432d_98e0_6945d2e0d44b.slice/crio-f33881054638ff16625cc87cbd7f7d7fde7f7a6632ba65976becde1e415786d5 WatchSource:0}: Error finding container f33881054638ff16625cc87cbd7f7d7fde7f7a6632ba65976becde1e415786d5: Status 404 returned error can't find the container with id f33881054638ff16625cc87cbd7f7d7fde7f7a6632ba65976becde1e415786d5 Feb 19 15:31:46 crc kubenswrapper[4810]: I0219 15:31:46.595070 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 15:31:46 crc kubenswrapper[4810]: I0219 15:31:46.699223 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9358dbee-2e5b-432d-98e0-6945d2e0d44b","Type":"ContainerStarted","Data":"f33881054638ff16625cc87cbd7f7d7fde7f7a6632ba65976becde1e415786d5"} Feb 19 15:31:46 crc kubenswrapper[4810]: I0219 15:31:46.706215 4810 generic.go:334] "Generic (PLEG): container finished" podID="af3ec395-6313-4094-9597-b52da27a0d7e" containerID="2adc7f5bee46c61926054ab4e65ab8e1825b109d167cc80842a20a1750f29ed6" exitCode=0 Feb 19 15:31:46 crc kubenswrapper[4810]: I0219 15:31:46.706242 4810 generic.go:334] "Generic (PLEG): container finished" podID="af3ec395-6313-4094-9597-b52da27a0d7e" containerID="4925cc21f9c39a46d677b62b982c7df70c5e45eb8c42ab96787c52123262f91d" exitCode=2 Feb 19 15:31:46 crc kubenswrapper[4810]: I0219 15:31:46.706261 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af3ec395-6313-4094-9597-b52da27a0d7e","Type":"ContainerDied","Data":"2adc7f5bee46c61926054ab4e65ab8e1825b109d167cc80842a20a1750f29ed6"} Feb 19 15:31:46 crc kubenswrapper[4810]: I0219 15:31:46.706283 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"af3ec395-6313-4094-9597-b52da27a0d7e","Type":"ContainerDied","Data":"4925cc21f9c39a46d677b62b982c7df70c5e45eb8c42ab96787c52123262f91d"} Feb 19 15:31:47 crc kubenswrapper[4810]: I0219 15:31:47.044103 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 19 15:31:47 crc kubenswrapper[4810]: I0219 15:31:47.450465 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="097bc4d1-5648-4607-9c49-286e4bbbe553" path="/var/lib/kubelet/pods/097bc4d1-5648-4607-9c49-286e4bbbe553/volumes" Feb 19 15:31:47 crc kubenswrapper[4810]: I0219 15:31:47.718748 4810 generic.go:334] "Generic (PLEG): container finished" podID="af3ec395-6313-4094-9597-b52da27a0d7e" containerID="6cec0822ad581a03d19d641977e9a7d547ef54d70d04ff9e2ccd6f292e9245c7" exitCode=0 Feb 19 15:31:47 crc kubenswrapper[4810]: I0219 15:31:47.718827 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af3ec395-6313-4094-9597-b52da27a0d7e","Type":"ContainerDied","Data":"6cec0822ad581a03d19d641977e9a7d547ef54d70d04ff9e2ccd6f292e9245c7"} Feb 19 15:31:47 crc kubenswrapper[4810]: I0219 15:31:47.721475 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9358dbee-2e5b-432d-98e0-6945d2e0d44b","Type":"ContainerStarted","Data":"9cc90f1afe6da59c95622d76dbac631c1a19726a0bc5454289544aee0a783fc2"} Feb 19 15:31:47 crc kubenswrapper[4810]: I0219 15:31:47.721625 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 19 15:31:47 crc kubenswrapper[4810]: I0219 15:31:47.743483 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.167833396 podStartE2EDuration="2.743458652s" podCreationTimestamp="2026-02-19 15:31:45 +0000 UTC" firstStartedPulling="2026-02-19 15:31:46.582279287 +0000 UTC m=+1336.064309411" lastFinishedPulling="2026-02-19 15:31:47.157904533 +0000 UTC m=+1336.639934667" observedRunningTime="2026-02-19 15:31:47.736053424 +0000 UTC m=+1337.218083548" watchObservedRunningTime="2026-02-19 15:31:47.743458652 +0000 UTC m=+1337.225488776" Feb 19 15:31:47 crc kubenswrapper[4810]: I0219 15:31:47.957125 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 15:31:47 crc kubenswrapper[4810]: I0219 15:31:47.957168 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 15:31:49 crc kubenswrapper[4810]: I0219 15:31:49.040669 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="db4be167-d56e-491f-851c-c21f30f63112" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.222:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 15:31:49 crc kubenswrapper[4810]: I0219 15:31:49.040688 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="db4be167-d56e-491f-851c-c21f30f63112" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.222:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 15:31:49 crc kubenswrapper[4810]: I0219 15:31:49.196004 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 19 15:31:49 crc kubenswrapper[4810]: I0219 15:31:49.231687 4810 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 19 15:31:49 crc kubenswrapper[4810]: I0219 15:31:49.537175 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:31:49 crc kubenswrapper[4810]: I0219 15:31:49.537505 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:31:49 crc kubenswrapper[4810]: I0219 15:31:49.770826 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.224715 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.291387 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af3ec395-6313-4094-9597-b52da27a0d7e-run-httpd\") pod \"af3ec395-6313-4094-9597-b52da27a0d7e\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.291675 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npk42\" (UniqueName: \"kubernetes.io/projected/af3ec395-6313-4094-9597-b52da27a0d7e-kube-api-access-npk42\") pod \"af3ec395-6313-4094-9597-b52da27a0d7e\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.291802 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af3ec395-6313-4094-9597-b52da27a0d7e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "af3ec395-6313-4094-9597-b52da27a0d7e" (UID: "af3ec395-6313-4094-9597-b52da27a0d7e"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.291899 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af3ec395-6313-4094-9597-b52da27a0d7e-log-httpd\") pod \"af3ec395-6313-4094-9597-b52da27a0d7e\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.292013 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af3ec395-6313-4094-9597-b52da27a0d7e-sg-core-conf-yaml\") pod \"af3ec395-6313-4094-9597-b52da27a0d7e\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.292098 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af3ec395-6313-4094-9597-b52da27a0d7e-scripts\") pod \"af3ec395-6313-4094-9597-b52da27a0d7e\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.292250 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af3ec395-6313-4094-9597-b52da27a0d7e-config-data\") pod \"af3ec395-6313-4094-9597-b52da27a0d7e\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.292572 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af3ec395-6313-4094-9597-b52da27a0d7e-combined-ca-bundle\") pod \"af3ec395-6313-4094-9597-b52da27a0d7e\" (UID: \"af3ec395-6313-4094-9597-b52da27a0d7e\") " Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.293354 4810 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af3ec395-6313-4094-9597-b52da27a0d7e-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.292145 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af3ec395-6313-4094-9597-b52da27a0d7e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "af3ec395-6313-4094-9597-b52da27a0d7e" (UID: "af3ec395-6313-4094-9597-b52da27a0d7e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.303269 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af3ec395-6313-4094-9597-b52da27a0d7e-kube-api-access-npk42" (OuterVolumeSpecName: "kube-api-access-npk42") pod "af3ec395-6313-4094-9597-b52da27a0d7e" (UID: "af3ec395-6313-4094-9597-b52da27a0d7e"). InnerVolumeSpecName "kube-api-access-npk42". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.305161 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af3ec395-6313-4094-9597-b52da27a0d7e-scripts" (OuterVolumeSpecName: "scripts") pod "af3ec395-6313-4094-9597-b52da27a0d7e" (UID: "af3ec395-6313-4094-9597-b52da27a0d7e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.340664 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af3ec395-6313-4094-9597-b52da27a0d7e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "af3ec395-6313-4094-9597-b52da27a0d7e" (UID: "af3ec395-6313-4094-9597-b52da27a0d7e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.396020 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npk42\" (UniqueName: \"kubernetes.io/projected/af3ec395-6313-4094-9597-b52da27a0d7e-kube-api-access-npk42\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.396268 4810 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af3ec395-6313-4094-9597-b52da27a0d7e-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.396374 4810 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af3ec395-6313-4094-9597-b52da27a0d7e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.396474 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af3ec395-6313-4094-9597-b52da27a0d7e-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.408158 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af3ec395-6313-4094-9597-b52da27a0d7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af3ec395-6313-4094-9597-b52da27a0d7e" (UID: "af3ec395-6313-4094-9597-b52da27a0d7e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.429663 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af3ec395-6313-4094-9597-b52da27a0d7e-config-data" (OuterVolumeSpecName: "config-data") pod "af3ec395-6313-4094-9597-b52da27a0d7e" (UID: "af3ec395-6313-4094-9597-b52da27a0d7e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.498086 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af3ec395-6313-4094-9597-b52da27a0d7e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.498686 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af3ec395-6313-4094-9597-b52da27a0d7e-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.797543 4810 generic.go:334] "Generic (PLEG): container finished" podID="af3ec395-6313-4094-9597-b52da27a0d7e" containerID="308ad8f8b8e7c54aece6c4e1cc25e71fd1252432f00bfbdb0df4940372379f98" exitCode=0 Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.797588 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af3ec395-6313-4094-9597-b52da27a0d7e","Type":"ContainerDied","Data":"308ad8f8b8e7c54aece6c4e1cc25e71fd1252432f00bfbdb0df4940372379f98"} Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.797682 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.798235 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af3ec395-6313-4094-9597-b52da27a0d7e","Type":"ContainerDied","Data":"62ba2b7fb62123b253e68adfb6db44d11bb6e2d13e45773cd9a27fa6ec28a020"} Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.798264 4810 scope.go:117] "RemoveContainer" containerID="2adc7f5bee46c61926054ab4e65ab8e1825b109d167cc80842a20a1750f29ed6" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.826103 4810 scope.go:117] "RemoveContainer" containerID="4925cc21f9c39a46d677b62b982c7df70c5e45eb8c42ab96787c52123262f91d" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.844920 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.862805 4810 scope.go:117] "RemoveContainer" containerID="308ad8f8b8e7c54aece6c4e1cc25e71fd1252432f00bfbdb0df4940372379f98" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.865987 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.884298 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:31:54 crc kubenswrapper[4810]: E0219 15:31:54.884757 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af3ec395-6313-4094-9597-b52da27a0d7e" containerName="sg-core" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.884773 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="af3ec395-6313-4094-9597-b52da27a0d7e" containerName="sg-core" Feb 19 15:31:54 crc kubenswrapper[4810]: E0219 15:31:54.884796 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af3ec395-6313-4094-9597-b52da27a0d7e" containerName="ceilometer-notification-agent" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.884805 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="af3ec395-6313-4094-9597-b52da27a0d7e" containerName="ceilometer-notification-agent" Feb 19 15:31:54 crc kubenswrapper[4810]: E0219 15:31:54.884827 4810 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="af3ec395-6313-4094-9597-b52da27a0d7e" containerName="ceilometer-central-agent" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.884835 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="af3ec395-6313-4094-9597-b52da27a0d7e" containerName="ceilometer-central-agent" Feb 19 15:31:54 crc kubenswrapper[4810]: E0219 15:31:54.884856 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af3ec395-6313-4094-9597-b52da27a0d7e" containerName="proxy-httpd" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.884862 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="af3ec395-6313-4094-9597-b52da27a0d7e" containerName="proxy-httpd" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.885080 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="af3ec395-6313-4094-9597-b52da27a0d7e" containerName="ceilometer-notification-agent" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.885094 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="af3ec395-6313-4094-9597-b52da27a0d7e" containerName="sg-core" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.885118 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="af3ec395-6313-4094-9597-b52da27a0d7e" containerName="proxy-httpd" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.885128 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="af3ec395-6313-4094-9597-b52da27a0d7e" containerName="ceilometer-central-agent" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.887035 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.892847 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.893014 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.893044 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.893879 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.906607 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " pod="openstack/ceilometer-0" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.906692 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " pod="openstack/ceilometer-0" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.906732 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27x4n\" (UniqueName: \"kubernetes.io/projected/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-kube-api-access-27x4n\") pod \"ceilometer-0\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " pod="openstack/ceilometer-0" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.906774 4810 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-config-data\") pod \"ceilometer-0\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " pod="openstack/ceilometer-0" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.906795 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " pod="openstack/ceilometer-0" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.906846 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-log-httpd\") pod \"ceilometer-0\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " pod="openstack/ceilometer-0" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.906873 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-scripts\") pod \"ceilometer-0\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " pod="openstack/ceilometer-0" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.906927 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-run-httpd\") pod \"ceilometer-0\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " pod="openstack/ceilometer-0" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.941723 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.947351 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.954140 4810 scope.go:117] "RemoveContainer" containerID="6cec0822ad581a03d19d641977e9a7d547ef54d70d04ff9e2ccd6f292e9245c7" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.957010 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.985638 4810 scope.go:117] "RemoveContainer" containerID="2adc7f5bee46c61926054ab4e65ab8e1825b109d167cc80842a20a1750f29ed6" Feb 19 15:31:54 crc kubenswrapper[4810]: E0219 15:31:54.996807 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2adc7f5bee46c61926054ab4e65ab8e1825b109d167cc80842a20a1750f29ed6\": container with ID starting with 2adc7f5bee46c61926054ab4e65ab8e1825b109d167cc80842a20a1750f29ed6 not found: ID does not exist" containerID="2adc7f5bee46c61926054ab4e65ab8e1825b109d167cc80842a20a1750f29ed6" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.996853 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2adc7f5bee46c61926054ab4e65ab8e1825b109d167cc80842a20a1750f29ed6"} err="failed to get container status \"2adc7f5bee46c61926054ab4e65ab8e1825b109d167cc80842a20a1750f29ed6\": rpc error: code = NotFound desc = could not find container 
\"2adc7f5bee46c61926054ab4e65ab8e1825b109d167cc80842a20a1750f29ed6\": container with ID starting with 2adc7f5bee46c61926054ab4e65ab8e1825b109d167cc80842a20a1750f29ed6 not found: ID does not exist" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.996877 4810 scope.go:117] "RemoveContainer" containerID="4925cc21f9c39a46d677b62b982c7df70c5e45eb8c42ab96787c52123262f91d" Feb 19 15:31:54 crc kubenswrapper[4810]: E0219 15:31:54.998034 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4925cc21f9c39a46d677b62b982c7df70c5e45eb8c42ab96787c52123262f91d\": container with ID starting with 4925cc21f9c39a46d677b62b982c7df70c5e45eb8c42ab96787c52123262f91d not found: ID does not exist" containerID="4925cc21f9c39a46d677b62b982c7df70c5e45eb8c42ab96787c52123262f91d" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.998069 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4925cc21f9c39a46d677b62b982c7df70c5e45eb8c42ab96787c52123262f91d"} err="failed to get container status \"4925cc21f9c39a46d677b62b982c7df70c5e45eb8c42ab96787c52123262f91d\": rpc error: code = NotFound desc = could not find container \"4925cc21f9c39a46d677b62b982c7df70c5e45eb8c42ab96787c52123262f91d\": container with ID starting with 4925cc21f9c39a46d677b62b982c7df70c5e45eb8c42ab96787c52123262f91d not found: ID does not exist" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.998085 4810 scope.go:117] "RemoveContainer" containerID="308ad8f8b8e7c54aece6c4e1cc25e71fd1252432f00bfbdb0df4940372379f98" Feb 19 15:31:54 crc kubenswrapper[4810]: E0219 15:31:54.998530 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"308ad8f8b8e7c54aece6c4e1cc25e71fd1252432f00bfbdb0df4940372379f98\": container with ID starting with 308ad8f8b8e7c54aece6c4e1cc25e71fd1252432f00bfbdb0df4940372379f98 not found: ID does not exist" containerID="308ad8f8b8e7c54aece6c4e1cc25e71fd1252432f00bfbdb0df4940372379f98" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.998572 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"308ad8f8b8e7c54aece6c4e1cc25e71fd1252432f00bfbdb0df4940372379f98"} err="failed to get container status \"308ad8f8b8e7c54aece6c4e1cc25e71fd1252432f00bfbdb0df4940372379f98\": rpc error: code = NotFound desc = could not find container \"308ad8f8b8e7c54aece6c4e1cc25e71fd1252432f00bfbdb0df4940372379f98\": container with ID starting with 308ad8f8b8e7c54aece6c4e1cc25e71fd1252432f00bfbdb0df4940372379f98 not found: ID does not exist" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.998600 4810 scope.go:117] "RemoveContainer" containerID="6cec0822ad581a03d19d641977e9a7d547ef54d70d04ff9e2ccd6f292e9245c7" Feb 19 15:31:54 crc kubenswrapper[4810]: E0219 15:31:54.998867 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cec0822ad581a03d19d641977e9a7d547ef54d70d04ff9e2ccd6f292e9245c7\": container with ID starting with 6cec0822ad581a03d19d641977e9a7d547ef54d70d04ff9e2ccd6f292e9245c7 not found: ID does not exist" containerID="6cec0822ad581a03d19d641977e9a7d547ef54d70d04ff9e2ccd6f292e9245c7" Feb 19 15:31:54 crc kubenswrapper[4810]: I0219 15:31:54.998890 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cec0822ad581a03d19d641977e9a7d547ef54d70d04ff9e2ccd6f292e9245c7"} 
err="failed to get container status \"6cec0822ad581a03d19d641977e9a7d547ef54d70d04ff9e2ccd6f292e9245c7\": rpc error: code = NotFound desc = could not find container \"6cec0822ad581a03d19d641977e9a7d547ef54d70d04ff9e2ccd6f292e9245c7\": container with ID starting with 6cec0822ad581a03d19d641977e9a7d547ef54d70d04ff9e2ccd6f292e9245c7 not found: ID does not exist" Feb 19 15:31:55 crc kubenswrapper[4810]: I0219 15:31:55.008665 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-scripts\") pod \"ceilometer-0\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " pod="openstack/ceilometer-0" Feb 19 15:31:55 crc kubenswrapper[4810]: I0219 15:31:55.008735 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-run-httpd\") pod \"ceilometer-0\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " pod="openstack/ceilometer-0" Feb 19 15:31:55 crc kubenswrapper[4810]: I0219 15:31:55.008796 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " pod="openstack/ceilometer-0" Feb 19 15:31:55 crc kubenswrapper[4810]: I0219 15:31:55.008856 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " pod="openstack/ceilometer-0" Feb 19 15:31:55 crc kubenswrapper[4810]: I0219 15:31:55.008884 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27x4n\" (UniqueName: \"kubernetes.io/projected/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-kube-api-access-27x4n\") pod \"ceilometer-0\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " pod="openstack/ceilometer-0" Feb 19 15:31:55 crc kubenswrapper[4810]: I0219 15:31:55.008940 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " pod="openstack/ceilometer-0" Feb 19 15:31:55 crc kubenswrapper[4810]: I0219 15:31:55.008963 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-config-data\") pod \"ceilometer-0\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " pod="openstack/ceilometer-0" Feb 19 15:31:55 crc kubenswrapper[4810]: I0219 15:31:55.009029 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-log-httpd\") pod \"ceilometer-0\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " pod="openstack/ceilometer-0" Feb 19 15:31:55 crc kubenswrapper[4810]: I0219 15:31:55.009219 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-run-httpd\") pod \"ceilometer-0\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " pod="openstack/ceilometer-0" Feb 19 
15:31:55 crc kubenswrapper[4810]: I0219 15:31:55.009505 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-log-httpd\") pod \"ceilometer-0\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " pod="openstack/ceilometer-0" Feb 19 15:31:55 crc kubenswrapper[4810]: I0219 15:31:55.014001 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " pod="openstack/ceilometer-0" Feb 19 15:31:55 crc kubenswrapper[4810]: I0219 15:31:55.015200 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-scripts\") pod \"ceilometer-0\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " pod="openstack/ceilometer-0" Feb 19 15:31:55 crc kubenswrapper[4810]: I0219 15:31:55.015819 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " pod="openstack/ceilometer-0" Feb 19 15:31:55 crc kubenswrapper[4810]: I0219 15:31:55.016146 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " pod="openstack/ceilometer-0" Feb 19 15:31:55 crc kubenswrapper[4810]: I0219 15:31:55.017053 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-config-data\") pod \"ceilometer-0\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " pod="openstack/ceilometer-0" Feb 19 15:31:55 crc kubenswrapper[4810]: I0219 15:31:55.024517 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27x4n\" (UniqueName: \"kubernetes.io/projected/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-kube-api-access-27x4n\") pod \"ceilometer-0\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " pod="openstack/ceilometer-0" Feb 19 15:31:55 crc kubenswrapper[4810]: I0219 15:31:55.243563 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 15:31:55 crc kubenswrapper[4810]: I0219 15:31:55.464088 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af3ec395-6313-4094-9597-b52da27a0d7e" path="/var/lib/kubelet/pods/af3ec395-6313-4094-9597-b52da27a0d7e/volumes" Feb 19 15:31:55 crc kubenswrapper[4810]: W0219 15:31:55.739499 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2594f4d_4c2f_4fc1_bda2_98a148e09b20.slice/crio-c1f4a9d4ae39c71c4f7ab55984edc9f42ceec7ca748ce3c8a863c795e3882a8e WatchSource:0}: Error finding container c1f4a9d4ae39c71c4f7ab55984edc9f42ceec7ca748ce3c8a863c795e3882a8e: Status 404 returned error can't find the container with id c1f4a9d4ae39c71c4f7ab55984edc9f42ceec7ca748ce3c8a863c795e3882a8e Feb 19 15:31:55 crc kubenswrapper[4810]: I0219 15:31:55.739508 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:31:55 crc kubenswrapper[4810]: I0219 15:31:55.823115 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2594f4d-4c2f-4fc1-bda2-98a148e09b20","Type":"ContainerStarted","Data":"c1f4a9d4ae39c71c4f7ab55984edc9f42ceec7ca748ce3c8a863c795e3882a8e"} Feb 19 15:31:55 crc kubenswrapper[4810]: I0219 15:31:55.833254 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.075560 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.655133 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.755114 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4c8d587-b429-415f-96f6-628924fed084-config-data\") pod \"b4c8d587-b429-415f-96f6-628924fed084\" (UID: \"b4c8d587-b429-415f-96f6-628924fed084\") " Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.755551 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkj9r\" (UniqueName: \"kubernetes.io/projected/b4c8d587-b429-415f-96f6-628924fed084-kube-api-access-qkj9r\") pod \"b4c8d587-b429-415f-96f6-628924fed084\" (UID: \"b4c8d587-b429-415f-96f6-628924fed084\") " Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.755636 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4c8d587-b429-415f-96f6-628924fed084-combined-ca-bundle\") pod \"b4c8d587-b429-415f-96f6-628924fed084\" (UID: \"b4c8d587-b429-415f-96f6-628924fed084\") " Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.762107 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4c8d587-b429-415f-96f6-628924fed084-kube-api-access-qkj9r" (OuterVolumeSpecName: "kube-api-access-qkj9r") pod "b4c8d587-b429-415f-96f6-628924fed084" (UID: "b4c8d587-b429-415f-96f6-628924fed084"). InnerVolumeSpecName "kube-api-access-qkj9r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.785580 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4c8d587-b429-415f-96f6-628924fed084-config-data" (OuterVolumeSpecName: "config-data") pod "b4c8d587-b429-415f-96f6-628924fed084" (UID: "b4c8d587-b429-415f-96f6-628924fed084"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.795108 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4c8d587-b429-415f-96f6-628924fed084-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4c8d587-b429-415f-96f6-628924fed084" (UID: "b4c8d587-b429-415f-96f6-628924fed084"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.835401 4810 generic.go:334] "Generic (PLEG): container finished" podID="b4c8d587-b429-415f-96f6-628924fed084" containerID="b7e5ea663f35522cf345ee4992587f93bc18717e336dd09c7687290a99a11aa3" exitCode=137 Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.835450 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.835495 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b4c8d587-b429-415f-96f6-628924fed084","Type":"ContainerDied","Data":"b7e5ea663f35522cf345ee4992587f93bc18717e336dd09c7687290a99a11aa3"} Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.835528 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b4c8d587-b429-415f-96f6-628924fed084","Type":"ContainerDied","Data":"4774e9328b610c917f9fd35141fcadf466b7543107fa862abe860e7744c56cb8"} Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.835563 4810 scope.go:117] "RemoveContainer" containerID="b7e5ea663f35522cf345ee4992587f93bc18717e336dd09c7687290a99a11aa3" Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.838167 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2594f4d-4c2f-4fc1-bda2-98a148e09b20","Type":"ContainerStarted","Data":"d89267d6dac559d2a521953fb65466a02a167ef0f42f9337d1bc8f64f8e801a5"} Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.838256 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2594f4d-4c2f-4fc1-bda2-98a148e09b20","Type":"ContainerStarted","Data":"b146e7e942b25643f6a8f23e7901f7c71009b2241073e62495a7a1015b18112c"} Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.857781 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4c8d587-b429-415f-96f6-628924fed084-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.857820 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkj9r\" (UniqueName: \"kubernetes.io/projected/b4c8d587-b429-415f-96f6-628924fed084-kube-api-access-qkj9r\") on node \"crc\" DevicePath \"\"" Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.857832 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4c8d587-b429-415f-96f6-628924fed084-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.870611 4810 scope.go:117] "RemoveContainer" containerID="b7e5ea663f35522cf345ee4992587f93bc18717e336dd09c7687290a99a11aa3" Feb 19 15:31:56 crc kubenswrapper[4810]: E0219 15:31:56.871159 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7e5ea663f35522cf345ee4992587f93bc18717e336dd09c7687290a99a11aa3\": container with ID starting with b7e5ea663f35522cf345ee4992587f93bc18717e336dd09c7687290a99a11aa3 not found: ID does not exist" containerID="b7e5ea663f35522cf345ee4992587f93bc18717e336dd09c7687290a99a11aa3" Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.871214 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7e5ea663f35522cf345ee4992587f93bc18717e336dd09c7687290a99a11aa3"} err="failed to get container status \"b7e5ea663f35522cf345ee4992587f93bc18717e336dd09c7687290a99a11aa3\": rpc error: code = NotFound desc = could not find container \"b7e5ea663f35522cf345ee4992587f93bc18717e336dd09c7687290a99a11aa3\": container with ID starting with b7e5ea663f35522cf345ee4992587f93bc18717e336dd09c7687290a99a11aa3 not found: ID does not exist" Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.915377 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.937463 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.948276 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 15:31:56 crc kubenswrapper[4810]: E0219 15:31:56.948797 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4c8d587-b429-415f-96f6-628924fed084" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.948823 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4c8d587-b429-415f-96f6-628924fed084" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.949078 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4c8d587-b429-415f-96f6-628924fed084" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.949903 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.958204 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.958462 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.958636 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 19 15:31:56 crc kubenswrapper[4810]: I0219 15:31:56.959188 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 15:31:57 crc kubenswrapper[4810]: I0219 15:31:57.061048 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96jc2\" (UniqueName: \"kubernetes.io/projected/5a7915d4-6c3f-4bc7-b21d-7d51b675640f-kube-api-access-96jc2\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a7915d4-6c3f-4bc7-b21d-7d51b675640f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:57 crc kubenswrapper[4810]: I0219 15:31:57.061221 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a7915d4-6c3f-4bc7-b21d-7d51b675640f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a7915d4-6c3f-4bc7-b21d-7d51b675640f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:57 crc kubenswrapper[4810]: I0219 15:31:57.061270 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a7915d4-6c3f-4bc7-b21d-7d51b675640f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a7915d4-6c3f-4bc7-b21d-7d51b675640f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:57 crc kubenswrapper[4810]: I0219 15:31:57.061292 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a7915d4-6c3f-4bc7-b21d-7d51b675640f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a7915d4-6c3f-4bc7-b21d-7d51b675640f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:57 crc kubenswrapper[4810]: I0219 15:31:57.061381 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a7915d4-6c3f-4bc7-b21d-7d51b675640f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a7915d4-6c3f-4bc7-b21d-7d51b675640f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:57 crc kubenswrapper[4810]: I0219 15:31:57.163026 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a7915d4-6c3f-4bc7-b21d-7d51b675640f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a7915d4-6c3f-4bc7-b21d-7d51b675640f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:57 crc kubenswrapper[4810]: I0219 15:31:57.163083 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a7915d4-6c3f-4bc7-b21d-7d51b675640f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a7915d4-6c3f-4bc7-b21d-7d51b675640f\") " 
pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:57 crc kubenswrapper[4810]: I0219 15:31:57.163155 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a7915d4-6c3f-4bc7-b21d-7d51b675640f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a7915d4-6c3f-4bc7-b21d-7d51b675640f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:57 crc kubenswrapper[4810]: I0219 15:31:57.163208 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96jc2\" (UniqueName: \"kubernetes.io/projected/5a7915d4-6c3f-4bc7-b21d-7d51b675640f-kube-api-access-96jc2\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a7915d4-6c3f-4bc7-b21d-7d51b675640f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:57 crc kubenswrapper[4810]: I0219 15:31:57.163717 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a7915d4-6c3f-4bc7-b21d-7d51b675640f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a7915d4-6c3f-4bc7-b21d-7d51b675640f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:57 crc kubenswrapper[4810]: I0219 15:31:57.167383 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a7915d4-6c3f-4bc7-b21d-7d51b675640f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a7915d4-6c3f-4bc7-b21d-7d51b675640f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:57 crc kubenswrapper[4810]: I0219 15:31:57.167585 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a7915d4-6c3f-4bc7-b21d-7d51b675640f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a7915d4-6c3f-4bc7-b21d-7d51b675640f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:57 crc kubenswrapper[4810]: I0219 15:31:57.167853 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a7915d4-6c3f-4bc7-b21d-7d51b675640f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a7915d4-6c3f-4bc7-b21d-7d51b675640f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:57 crc kubenswrapper[4810]: I0219 15:31:57.168646 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a7915d4-6c3f-4bc7-b21d-7d51b675640f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a7915d4-6c3f-4bc7-b21d-7d51b675640f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:57 crc kubenswrapper[4810]: I0219 15:31:57.183104 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96jc2\" (UniqueName: \"kubernetes.io/projected/5a7915d4-6c3f-4bc7-b21d-7d51b675640f-kube-api-access-96jc2\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a7915d4-6c3f-4bc7-b21d-7d51b675640f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:57 crc kubenswrapper[4810]: I0219 15:31:57.284206 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:31:57 crc kubenswrapper[4810]: I0219 15:31:57.452290 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4c8d587-b429-415f-96f6-628924fed084" path="/var/lib/kubelet/pods/b4c8d587-b429-415f-96f6-628924fed084/volumes" Feb 19 15:31:57 crc kubenswrapper[4810]: I0219 15:31:57.764035 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 15:31:57 crc kubenswrapper[4810]: W0219 15:31:57.764488 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a7915d4_6c3f_4bc7_b21d_7d51b675640f.slice/crio-c62dbfb3fef551c47f6ceabbedf6ce08e35cc8c56ffcc4089fb9748fdc5b32b3 WatchSource:0}: Error finding container c62dbfb3fef551c47f6ceabbedf6ce08e35cc8c56ffcc4089fb9748fdc5b32b3: Status 404 returned error can't find the container with id c62dbfb3fef551c47f6ceabbedf6ce08e35cc8c56ffcc4089fb9748fdc5b32b3 Feb 19 15:31:57 crc kubenswrapper[4810]: I0219 15:31:57.848187 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5a7915d4-6c3f-4bc7-b21d-7d51b675640f","Type":"ContainerStarted","Data":"c62dbfb3fef551c47f6ceabbedf6ce08e35cc8c56ffcc4089fb9748fdc5b32b3"} Feb 19 15:31:57 crc kubenswrapper[4810]: I0219 15:31:57.857345 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2594f4d-4c2f-4fc1-bda2-98a148e09b20","Type":"ContainerStarted","Data":"2267682407e573c3802691ada945d211fd7e4b15c6d2e2d22cf717dea7754b85"} Feb 19 15:31:57 crc kubenswrapper[4810]: I0219 15:31:57.964919 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 15:31:57 crc kubenswrapper[4810]: I0219 15:31:57.965912 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 15:31:57 crc kubenswrapper[4810]: I0219 15:31:57.976135 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 15:31:57 crc kubenswrapper[4810]: I0219 15:31:57.996918 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 15:31:58 crc kubenswrapper[4810]: I0219 15:31:58.867567 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5a7915d4-6c3f-4bc7-b21d-7d51b675640f","Type":"ContainerStarted","Data":"bfe000bc0b31dfce709607fd585b87aa375cc4ed4721af49130544ac2515c529"} Feb 19 15:31:58 crc kubenswrapper[4810]: I0219 15:31:58.867803 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 15:31:58 crc kubenswrapper[4810]: I0219 15:31:58.879279 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 15:31:58 crc kubenswrapper[4810]: I0219 15:31:58.889895 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.889873281 podStartE2EDuration="2.889873281s" podCreationTimestamp="2026-02-19 15:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:31:58.883050297 +0000 UTC m=+1348.365080441" watchObservedRunningTime="2026-02-19 15:31:58.889873281 +0000 UTC m=+1348.371903415" Feb 19 15:31:59 crc kubenswrapper[4810]: I0219 15:31:59.081466 4810 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7dc9fb8849-t2gx5"] Feb 19 15:31:59 crc kubenswrapper[4810]: I0219 15:31:59.082978 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" Feb 19 15:31:59 crc kubenswrapper[4810]: I0219 15:31:59.105220 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9xn9\" (UniqueName: \"kubernetes.io/projected/275a98c0-8e6a-4587-8628-54f70b836615-kube-api-access-t9xn9\") pod \"dnsmasq-dns-7dc9fb8849-t2gx5\" (UID: \"275a98c0-8e6a-4587-8628-54f70b836615\") " pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" Feb 19 15:31:59 crc kubenswrapper[4810]: I0219 15:31:59.106438 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-ovsdbserver-nb\") pod \"dnsmasq-dns-7dc9fb8849-t2gx5\" (UID: \"275a98c0-8e6a-4587-8628-54f70b836615\") " pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" Feb 19 15:31:59 crc kubenswrapper[4810]: I0219 15:31:59.106631 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-config\") pod \"dnsmasq-dns-7dc9fb8849-t2gx5\" (UID: \"275a98c0-8e6a-4587-8628-54f70b836615\") " pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" Feb 19 15:31:59 crc kubenswrapper[4810]: I0219 15:31:59.106759 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-ovsdbserver-sb\") pod \"dnsmasq-dns-7dc9fb8849-t2gx5\" (UID: \"275a98c0-8e6a-4587-8628-54f70b836615\") " pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" Feb 19 15:31:59 crc kubenswrapper[4810]: I0219 15:31:59.106834 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-dns-swift-storage-0\") pod \"dnsmasq-dns-7dc9fb8849-t2gx5\" (UID: \"275a98c0-8e6a-4587-8628-54f70b836615\") " pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" Feb 19 15:31:59 crc kubenswrapper[4810]: I0219 15:31:59.106949 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-dns-svc\") pod \"dnsmasq-dns-7dc9fb8849-t2gx5\" (UID: \"275a98c0-8e6a-4587-8628-54f70b836615\") " pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" Feb 19 15:31:59 crc kubenswrapper[4810]: I0219 15:31:59.127574 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7dc9fb8849-t2gx5"] Feb 19 15:31:59 crc kubenswrapper[4810]: I0219 15:31:59.209801 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-config\") pod \"dnsmasq-dns-7dc9fb8849-t2gx5\" (UID: \"275a98c0-8e6a-4587-8628-54f70b836615\") " pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" Feb 19 15:31:59 crc kubenswrapper[4810]: I0219 15:31:59.210486 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-ovsdbserver-sb\") pod \"dnsmasq-dns-7dc9fb8849-t2gx5\" 
(UID: \"275a98c0-8e6a-4587-8628-54f70b836615\") " pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" Feb 19 15:31:59 crc kubenswrapper[4810]: I0219 15:31:59.210601 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-dns-swift-storage-0\") pod \"dnsmasq-dns-7dc9fb8849-t2gx5\" (UID: \"275a98c0-8e6a-4587-8628-54f70b836615\") " pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" Feb 19 15:31:59 crc kubenswrapper[4810]: I0219 15:31:59.210677 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-dns-svc\") pod \"dnsmasq-dns-7dc9fb8849-t2gx5\" (UID: \"275a98c0-8e6a-4587-8628-54f70b836615\") " pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" Feb 19 15:31:59 crc kubenswrapper[4810]: I0219 15:31:59.210786 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9xn9\" (UniqueName: \"kubernetes.io/projected/275a98c0-8e6a-4587-8628-54f70b836615-kube-api-access-t9xn9\") pod \"dnsmasq-dns-7dc9fb8849-t2gx5\" (UID: \"275a98c0-8e6a-4587-8628-54f70b836615\") " pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" Feb 19 15:31:59 crc kubenswrapper[4810]: I0219 15:31:59.210868 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-ovsdbserver-nb\") pod \"dnsmasq-dns-7dc9fb8849-t2gx5\" (UID: \"275a98c0-8e6a-4587-8628-54f70b836615\") " pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" Feb 19 15:31:59 crc kubenswrapper[4810]: I0219 15:31:59.212014 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-dns-svc\") pod \"dnsmasq-dns-7dc9fb8849-t2gx5\" (UID: \"275a98c0-8e6a-4587-8628-54f70b836615\") " pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" Feb 19 15:31:59 crc kubenswrapper[4810]: I0219 15:31:59.212906 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-config\") pod \"dnsmasq-dns-7dc9fb8849-t2gx5\" (UID: \"275a98c0-8e6a-4587-8628-54f70b836615\") " pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" Feb 19 15:31:59 crc kubenswrapper[4810]: I0219 15:31:59.219060 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-dns-swift-storage-0\") pod \"dnsmasq-dns-7dc9fb8849-t2gx5\" (UID: \"275a98c0-8e6a-4587-8628-54f70b836615\") " pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" Feb 19 15:31:59 crc kubenswrapper[4810]: I0219 15:31:59.219149 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-ovsdbserver-nb\") pod \"dnsmasq-dns-7dc9fb8849-t2gx5\" (UID: \"275a98c0-8e6a-4587-8628-54f70b836615\") " pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" Feb 19 15:31:59 crc kubenswrapper[4810]: I0219 15:31:59.219288 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-ovsdbserver-sb\") pod \"dnsmasq-dns-7dc9fb8849-t2gx5\" (UID: \"275a98c0-8e6a-4587-8628-54f70b836615\") " pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" Feb 19 
15:31:59 crc kubenswrapper[4810]: I0219 15:31:59.233268 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9xn9\" (UniqueName: \"kubernetes.io/projected/275a98c0-8e6a-4587-8628-54f70b836615-kube-api-access-t9xn9\") pod \"dnsmasq-dns-7dc9fb8849-t2gx5\" (UID: \"275a98c0-8e6a-4587-8628-54f70b836615\") " pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" Feb 19 15:31:59 crc kubenswrapper[4810]: I0219 15:31:59.416003 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" Feb 19 15:32:00 crc kubenswrapper[4810]: I0219 15:31:59.897474 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2594f4d-4c2f-4fc1-bda2-98a148e09b20","Type":"ContainerStarted","Data":"9125ed9339214110f69897810b0e48bb3128158f2c4b9ccb9353a19533ba604d"} Feb 19 15:32:00 crc kubenswrapper[4810]: I0219 15:31:59.898527 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 15:32:00 crc kubenswrapper[4810]: I0219 15:31:59.926980 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.493768033 podStartE2EDuration="5.92696275s" podCreationTimestamp="2026-02-19 15:31:54 +0000 UTC" firstStartedPulling="2026-02-19 15:31:55.742109337 +0000 UTC m=+1345.224139461" lastFinishedPulling="2026-02-19 15:31:59.175304054 +0000 UTC m=+1348.657334178" observedRunningTime="2026-02-19 15:31:59.920619309 +0000 UTC m=+1349.402649433" watchObservedRunningTime="2026-02-19 15:31:59.92696275 +0000 UTC m=+1349.408992864" Feb 19 15:32:00 crc kubenswrapper[4810]: W0219 15:32:00.020804 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod275a98c0_8e6a_4587_8628_54f70b836615.slice/crio-8c49db3d1d18d97ef799aa2fe03c13f1f635369a4b0477795ad010b6baa6d941 WatchSource:0}: Error finding container 8c49db3d1d18d97ef799aa2fe03c13f1f635369a4b0477795ad010b6baa6d941: Status 404 returned error can't find the container with id 8c49db3d1d18d97ef799aa2fe03c13f1f635369a4b0477795ad010b6baa6d941 Feb 19 15:32:00 crc kubenswrapper[4810]: I0219 15:32:00.021247 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7dc9fb8849-t2gx5"] Feb 19 15:32:00 crc kubenswrapper[4810]: I0219 15:32:00.907471 4810 generic.go:334] "Generic (PLEG): container finished" podID="275a98c0-8e6a-4587-8628-54f70b836615" containerID="87b0b7637ffa6ceedf744033bed56aa64044ea8ed1fd841f1129813c92a9043f" exitCode=0 Feb 19 15:32:00 crc kubenswrapper[4810]: I0219 15:32:00.908542 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" event={"ID":"275a98c0-8e6a-4587-8628-54f70b836615","Type":"ContainerDied","Data":"87b0b7637ffa6ceedf744033bed56aa64044ea8ed1fd841f1129813c92a9043f"} Feb 19 15:32:00 crc kubenswrapper[4810]: I0219 15:32:00.908590 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" event={"ID":"275a98c0-8e6a-4587-8628-54f70b836615","Type":"ContainerStarted","Data":"8c49db3d1d18d97ef799aa2fe03c13f1f635369a4b0477795ad010b6baa6d941"} Feb 19 15:32:01 crc kubenswrapper[4810]: I0219 15:32:01.668865 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 15:32:01 crc kubenswrapper[4810]: I0219 15:32:01.918705 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" 
event={"ID":"275a98c0-8e6a-4587-8628-54f70b836615","Type":"ContainerStarted","Data":"ec7d0dde11f4d91dfc1709e9d373983bc5e0931480a8ae6b522e10ab70a1f7fc"} Feb 19 15:32:01 crc kubenswrapper[4810]: I0219 15:32:01.918863 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="db4be167-d56e-491f-851c-c21f30f63112" containerName="nova-api-log" containerID="cri-o://f3afc915b44f3696b28ac2a3768f5a557a2b647fe608482478f7471cf5736c24" gracePeriod=30 Feb 19 15:32:01 crc kubenswrapper[4810]: I0219 15:32:01.919349 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="db4be167-d56e-491f-851c-c21f30f63112" containerName="nova-api-api" containerID="cri-o://c92ab671905f1460e2a7dccb331259f6a59c52684a2e0df2a88b275d79bf21c4" gracePeriod=30 Feb 19 15:32:01 crc kubenswrapper[4810]: I0219 15:32:01.946444 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" podStartSLOduration=2.946428885 podStartE2EDuration="2.946428885s" podCreationTimestamp="2026-02-19 15:31:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:32:01.94114137 +0000 UTC m=+1351.423171514" watchObservedRunningTime="2026-02-19 15:32:01.946428885 +0000 UTC m=+1351.428459009" Feb 19 15:32:02 crc kubenswrapper[4810]: I0219 15:32:02.284713 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:32:02 crc kubenswrapper[4810]: I0219 15:32:02.928878 4810 generic.go:334] "Generic (PLEG): container finished" podID="db4be167-d56e-491f-851c-c21f30f63112" containerID="f3afc915b44f3696b28ac2a3768f5a557a2b647fe608482478f7471cf5736c24" exitCode=143 Feb 19 15:32:02 crc kubenswrapper[4810]: I0219 15:32:02.930104 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db4be167-d56e-491f-851c-c21f30f63112","Type":"ContainerDied","Data":"f3afc915b44f3696b28ac2a3768f5a557a2b647fe608482478f7471cf5736c24"} Feb 19 15:32:02 crc kubenswrapper[4810]: I0219 15:32:02.930138 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" Feb 19 15:32:03 crc kubenswrapper[4810]: I0219 15:32:03.318951 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:32:03 crc kubenswrapper[4810]: I0219 15:32:03.319210 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f2594f4d-4c2f-4fc1-bda2-98a148e09b20" containerName="ceilometer-central-agent" containerID="cri-o://b146e7e942b25643f6a8f23e7901f7c71009b2241073e62495a7a1015b18112c" gracePeriod=30 Feb 19 15:32:03 crc kubenswrapper[4810]: I0219 15:32:03.319652 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f2594f4d-4c2f-4fc1-bda2-98a148e09b20" containerName="proxy-httpd" containerID="cri-o://9125ed9339214110f69897810b0e48bb3128158f2c4b9ccb9353a19533ba604d" gracePeriod=30 Feb 19 15:32:03 crc kubenswrapper[4810]: I0219 15:32:03.319695 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f2594f4d-4c2f-4fc1-bda2-98a148e09b20" containerName="sg-core" containerID="cri-o://2267682407e573c3802691ada945d211fd7e4b15c6d2e2d22cf717dea7754b85" gracePeriod=30 Feb 19 15:32:03 crc kubenswrapper[4810]: I0219 15:32:03.319724 
4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f2594f4d-4c2f-4fc1-bda2-98a148e09b20" containerName="ceilometer-notification-agent" containerID="cri-o://d89267d6dac559d2a521953fb65466a02a167ef0f42f9337d1bc8f64f8e801a5" gracePeriod=30 Feb 19 15:32:03 crc kubenswrapper[4810]: I0219 15:32:03.944263 4810 generic.go:334] "Generic (PLEG): container finished" podID="db4be167-d56e-491f-851c-c21f30f63112" containerID="c92ab671905f1460e2a7dccb331259f6a59c52684a2e0df2a88b275d79bf21c4" exitCode=0 Feb 19 15:32:03 crc kubenswrapper[4810]: I0219 15:32:03.944298 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db4be167-d56e-491f-851c-c21f30f63112","Type":"ContainerDied","Data":"c92ab671905f1460e2a7dccb331259f6a59c52684a2e0df2a88b275d79bf21c4"} Feb 19 15:32:03 crc kubenswrapper[4810]: I0219 15:32:03.947708 4810 generic.go:334] "Generic (PLEG): container finished" podID="f2594f4d-4c2f-4fc1-bda2-98a148e09b20" containerID="9125ed9339214110f69897810b0e48bb3128158f2c4b9ccb9353a19533ba604d" exitCode=0 Feb 19 15:32:03 crc kubenswrapper[4810]: I0219 15:32:03.947734 4810 generic.go:334] "Generic (PLEG): container finished" podID="f2594f4d-4c2f-4fc1-bda2-98a148e09b20" containerID="2267682407e573c3802691ada945d211fd7e4b15c6d2e2d22cf717dea7754b85" exitCode=2 Feb 19 15:32:03 crc kubenswrapper[4810]: I0219 15:32:03.947744 4810 generic.go:334] "Generic (PLEG): container finished" podID="f2594f4d-4c2f-4fc1-bda2-98a148e09b20" containerID="d89267d6dac559d2a521953fb65466a02a167ef0f42f9337d1bc8f64f8e801a5" exitCode=0 Feb 19 15:32:03 crc kubenswrapper[4810]: I0219 15:32:03.947737 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2594f4d-4c2f-4fc1-bda2-98a148e09b20","Type":"ContainerDied","Data":"9125ed9339214110f69897810b0e48bb3128158f2c4b9ccb9353a19533ba604d"} Feb 19 15:32:03 crc kubenswrapper[4810]: I0219 15:32:03.947853 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2594f4d-4c2f-4fc1-bda2-98a148e09b20","Type":"ContainerDied","Data":"2267682407e573c3802691ada945d211fd7e4b15c6d2e2d22cf717dea7754b85"} Feb 19 15:32:03 crc kubenswrapper[4810]: I0219 15:32:03.947866 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2594f4d-4c2f-4fc1-bda2-98a148e09b20","Type":"ContainerDied","Data":"d89267d6dac559d2a521953fb65466a02a167ef0f42f9337d1bc8f64f8e801a5"} Feb 19 15:32:04 crc kubenswrapper[4810]: I0219 15:32:04.031412 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 15:32:04 crc kubenswrapper[4810]: I0219 15:32:04.207554 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db4be167-d56e-491f-851c-c21f30f63112-config-data\") pod \"db4be167-d56e-491f-851c-c21f30f63112\" (UID: \"db4be167-d56e-491f-851c-c21f30f63112\") " Feb 19 15:32:04 crc kubenswrapper[4810]: I0219 15:32:04.207852 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db4be167-d56e-491f-851c-c21f30f63112-combined-ca-bundle\") pod \"db4be167-d56e-491f-851c-c21f30f63112\" (UID: \"db4be167-d56e-491f-851c-c21f30f63112\") " Feb 19 15:32:04 crc kubenswrapper[4810]: I0219 15:32:04.207980 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v46gz\" (UniqueName: \"kubernetes.io/projected/db4be167-d56e-491f-851c-c21f30f63112-kube-api-access-v46gz\") pod \"db4be167-d56e-491f-851c-c21f30f63112\" (UID: \"db4be167-d56e-491f-851c-c21f30f63112\") " Feb 19 15:32:04 crc kubenswrapper[4810]: I0219 15:32:04.208052 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db4be167-d56e-491f-851c-c21f30f63112-logs\") pod \"db4be167-d56e-491f-851c-c21f30f63112\" (UID: \"db4be167-d56e-491f-851c-c21f30f63112\") " Feb 19 15:32:04 crc kubenswrapper[4810]: I0219 15:32:04.209578 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db4be167-d56e-491f-851c-c21f30f63112-logs" (OuterVolumeSpecName: "logs") pod "db4be167-d56e-491f-851c-c21f30f63112" (UID: "db4be167-d56e-491f-851c-c21f30f63112"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:32:04 crc kubenswrapper[4810]: I0219 15:32:04.213164 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db4be167-d56e-491f-851c-c21f30f63112-kube-api-access-v46gz" (OuterVolumeSpecName: "kube-api-access-v46gz") pod "db4be167-d56e-491f-851c-c21f30f63112" (UID: "db4be167-d56e-491f-851c-c21f30f63112"). InnerVolumeSpecName "kube-api-access-v46gz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:32:04 crc kubenswrapper[4810]: I0219 15:32:04.243418 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db4be167-d56e-491f-851c-c21f30f63112-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db4be167-d56e-491f-851c-c21f30f63112" (UID: "db4be167-d56e-491f-851c-c21f30f63112"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:32:04 crc kubenswrapper[4810]: I0219 15:32:04.255079 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db4be167-d56e-491f-851c-c21f30f63112-config-data" (OuterVolumeSpecName: "config-data") pod "db4be167-d56e-491f-851c-c21f30f63112" (UID: "db4be167-d56e-491f-851c-c21f30f63112"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:32:04 crc kubenswrapper[4810]: I0219 15:32:04.311276 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v46gz\" (UniqueName: \"kubernetes.io/projected/db4be167-d56e-491f-851c-c21f30f63112-kube-api-access-v46gz\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:04 crc kubenswrapper[4810]: I0219 15:32:04.311309 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db4be167-d56e-491f-851c-c21f30f63112-logs\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:04 crc kubenswrapper[4810]: I0219 15:32:04.311320 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db4be167-d56e-491f-851c-c21f30f63112-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:04 crc kubenswrapper[4810]: I0219 15:32:04.311394 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db4be167-d56e-491f-851c-c21f30f63112-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:04 crc kubenswrapper[4810]: I0219 15:32:04.986832 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db4be167-d56e-491f-851c-c21f30f63112","Type":"ContainerDied","Data":"6b9772e1210b523a5eae56618cf676698a333f36d8eea10d60634c0ac175381c"} Feb 19 15:32:04 crc kubenswrapper[4810]: I0219 15:32:04.986904 4810 scope.go:117] "RemoveContainer" containerID="c92ab671905f1460e2a7dccb331259f6a59c52684a2e0df2a88b275d79bf21c4" Feb 19 15:32:04 crc kubenswrapper[4810]: I0219 15:32:04.987092 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.067384 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.067951 4810 scope.go:117] "RemoveContainer" containerID="f3afc915b44f3696b28ac2a3768f5a557a2b647fe608482478f7471cf5736c24" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.105126 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.139882 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 15:32:05 crc kubenswrapper[4810]: E0219 15:32:05.140393 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db4be167-d56e-491f-851c-c21f30f63112" containerName="nova-api-api" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.140411 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="db4be167-d56e-491f-851c-c21f30f63112" containerName="nova-api-api" Feb 19 15:32:05 crc kubenswrapper[4810]: E0219 15:32:05.140438 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db4be167-d56e-491f-851c-c21f30f63112" containerName="nova-api-log" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.140444 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="db4be167-d56e-491f-851c-c21f30f63112" containerName="nova-api-log" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.140650 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="db4be167-d56e-491f-851c-c21f30f63112" containerName="nova-api-log" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.140664 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="db4be167-d56e-491f-851c-c21f30f63112" 
containerName="nova-api-api" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.141668 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.145244 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.150290 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.150547 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.153998 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.227788 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1120f3c4-1323-4ffe-8798-b15e58615278-config-data\") pod \"nova-api-0\" (UID: \"1120f3c4-1323-4ffe-8798-b15e58615278\") " pod="openstack/nova-api-0" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.227860 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1120f3c4-1323-4ffe-8798-b15e58615278-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1120f3c4-1323-4ffe-8798-b15e58615278\") " pod="openstack/nova-api-0" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.227893 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1120f3c4-1323-4ffe-8798-b15e58615278-logs\") pod \"nova-api-0\" (UID: \"1120f3c4-1323-4ffe-8798-b15e58615278\") " pod="openstack/nova-api-0" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.227925 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfvq8\" (UniqueName: \"kubernetes.io/projected/1120f3c4-1323-4ffe-8798-b15e58615278-kube-api-access-bfvq8\") pod \"nova-api-0\" (UID: \"1120f3c4-1323-4ffe-8798-b15e58615278\") " pod="openstack/nova-api-0" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.228206 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1120f3c4-1323-4ffe-8798-b15e58615278-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1120f3c4-1323-4ffe-8798-b15e58615278\") " pod="openstack/nova-api-0" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.228351 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1120f3c4-1323-4ffe-8798-b15e58615278-public-tls-certs\") pod \"nova-api-0\" (UID: \"1120f3c4-1323-4ffe-8798-b15e58615278\") " pod="openstack/nova-api-0" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.330962 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1120f3c4-1323-4ffe-8798-b15e58615278-public-tls-certs\") pod \"nova-api-0\" (UID: \"1120f3c4-1323-4ffe-8798-b15e58615278\") " pod="openstack/nova-api-0" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.331228 4810 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1120f3c4-1323-4ffe-8798-b15e58615278-config-data\") pod \"nova-api-0\" (UID: \"1120f3c4-1323-4ffe-8798-b15e58615278\") " pod="openstack/nova-api-0" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.331299 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1120f3c4-1323-4ffe-8798-b15e58615278-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1120f3c4-1323-4ffe-8798-b15e58615278\") " pod="openstack/nova-api-0" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.331543 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1120f3c4-1323-4ffe-8798-b15e58615278-logs\") pod \"nova-api-0\" (UID: \"1120f3c4-1323-4ffe-8798-b15e58615278\") " pod="openstack/nova-api-0" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.331604 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfvq8\" (UniqueName: \"kubernetes.io/projected/1120f3c4-1323-4ffe-8798-b15e58615278-kube-api-access-bfvq8\") pod \"nova-api-0\" (UID: \"1120f3c4-1323-4ffe-8798-b15e58615278\") " pod="openstack/nova-api-0" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.331709 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1120f3c4-1323-4ffe-8798-b15e58615278-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1120f3c4-1323-4ffe-8798-b15e58615278\") " pod="openstack/nova-api-0" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.337317 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1120f3c4-1323-4ffe-8798-b15e58615278-logs\") pod \"nova-api-0\" (UID: \"1120f3c4-1323-4ffe-8798-b15e58615278\") " pod="openstack/nova-api-0" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.338941 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1120f3c4-1323-4ffe-8798-b15e58615278-public-tls-certs\") pod \"nova-api-0\" (UID: \"1120f3c4-1323-4ffe-8798-b15e58615278\") " pod="openstack/nova-api-0" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.342352 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1120f3c4-1323-4ffe-8798-b15e58615278-config-data\") pod \"nova-api-0\" (UID: \"1120f3c4-1323-4ffe-8798-b15e58615278\") " pod="openstack/nova-api-0" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.344001 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1120f3c4-1323-4ffe-8798-b15e58615278-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1120f3c4-1323-4ffe-8798-b15e58615278\") " pod="openstack/nova-api-0" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.346277 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1120f3c4-1323-4ffe-8798-b15e58615278-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1120f3c4-1323-4ffe-8798-b15e58615278\") " pod="openstack/nova-api-0" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.354851 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfvq8\" (UniqueName: 
\"kubernetes.io/projected/1120f3c4-1323-4ffe-8798-b15e58615278-kube-api-access-bfvq8\") pod \"nova-api-0\" (UID: \"1120f3c4-1323-4ffe-8798-b15e58615278\") " pod="openstack/nova-api-0" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.451828 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db4be167-d56e-491f-851c-c21f30f63112" path="/var/lib/kubelet/pods/db4be167-d56e-491f-851c-c21f30f63112/volumes" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.465514 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.734235 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.843728 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27x4n\" (UniqueName: \"kubernetes.io/projected/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-kube-api-access-27x4n\") pod \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.843810 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-log-httpd\") pod \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.843873 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-ceilometer-tls-certs\") pod \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.843918 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-scripts\") pod \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.843957 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-config-data\") pod \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.844030 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-sg-core-conf-yaml\") pod \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.844057 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-combined-ca-bundle\") pod \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\" (UID: \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.844090 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-run-httpd\") pod \"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\" (UID: 
\"f2594f4d-4c2f-4fc1-bda2-98a148e09b20\") " Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.844465 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f2594f4d-4c2f-4fc1-bda2-98a148e09b20" (UID: "f2594f4d-4c2f-4fc1-bda2-98a148e09b20"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.844505 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f2594f4d-4c2f-4fc1-bda2-98a148e09b20" (UID: "f2594f4d-4c2f-4fc1-bda2-98a148e09b20"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.849018 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-kube-api-access-27x4n" (OuterVolumeSpecName: "kube-api-access-27x4n") pod "f2594f4d-4c2f-4fc1-bda2-98a148e09b20" (UID: "f2594f4d-4c2f-4fc1-bda2-98a148e09b20"). InnerVolumeSpecName "kube-api-access-27x4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.852544 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-scripts" (OuterVolumeSpecName: "scripts") pod "f2594f4d-4c2f-4fc1-bda2-98a148e09b20" (UID: "f2594f4d-4c2f-4fc1-bda2-98a148e09b20"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.873436 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f2594f4d-4c2f-4fc1-bda2-98a148e09b20" (UID: "f2594f4d-4c2f-4fc1-bda2-98a148e09b20"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.904382 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "f2594f4d-4c2f-4fc1-bda2-98a148e09b20" (UID: "f2594f4d-4c2f-4fc1-bda2-98a148e09b20"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.948691 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27x4n\" (UniqueName: \"kubernetes.io/projected/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-kube-api-access-27x4n\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.949025 4810 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.949046 4810 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.949063 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.949081 4810 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.949098 4810 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.953559 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2594f4d-4c2f-4fc1-bda2-98a148e09b20" (UID: "f2594f4d-4c2f-4fc1-bda2-98a148e09b20"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.977002 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 15:32:05 crc kubenswrapper[4810]: I0219 15:32:05.978255 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-config-data" (OuterVolumeSpecName: "config-data") pod "f2594f4d-4c2f-4fc1-bda2-98a148e09b20" (UID: "f2594f4d-4c2f-4fc1-bda2-98a148e09b20"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:32:05 crc kubenswrapper[4810]: W0219 15:32:05.981377 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1120f3c4_1323_4ffe_8798_b15e58615278.slice/crio-5c8bcb9cab03480eccfcad397fcab4db26af8c7b42bbdd88a46bf05b9ed6f929 WatchSource:0}: Error finding container 5c8bcb9cab03480eccfcad397fcab4db26af8c7b42bbdd88a46bf05b9ed6f929: Status 404 returned error can't find the container with id 5c8bcb9cab03480eccfcad397fcab4db26af8c7b42bbdd88a46bf05b9ed6f929 Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.003023 4810 generic.go:334] "Generic (PLEG): container finished" podID="f2594f4d-4c2f-4fc1-bda2-98a148e09b20" containerID="b146e7e942b25643f6a8f23e7901f7c71009b2241073e62495a7a1015b18112c" exitCode=0 Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.003086 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2594f4d-4c2f-4fc1-bda2-98a148e09b20","Type":"ContainerDied","Data":"b146e7e942b25643f6a8f23e7901f7c71009b2241073e62495a7a1015b18112c"} Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.003114 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2594f4d-4c2f-4fc1-bda2-98a148e09b20","Type":"ContainerDied","Data":"c1f4a9d4ae39c71c4f7ab55984edc9f42ceec7ca748ce3c8a863c795e3882a8e"} Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.003134 4810 scope.go:117] "RemoveContainer" containerID="9125ed9339214110f69897810b0e48bb3128158f2c4b9ccb9353a19533ba604d" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.003298 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.013207 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1120f3c4-1323-4ffe-8798-b15e58615278","Type":"ContainerStarted","Data":"5c8bcb9cab03480eccfcad397fcab4db26af8c7b42bbdd88a46bf05b9ed6f929"} Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.051419 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.051452 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2594f4d-4c2f-4fc1-bda2-98a148e09b20-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.056694 4810 scope.go:117] "RemoveContainer" containerID="2267682407e573c3802691ada945d211fd7e4b15c6d2e2d22cf717dea7754b85" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.058310 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.067393 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.076856 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:32:06 crc kubenswrapper[4810]: E0219 15:32:06.077263 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2594f4d-4c2f-4fc1-bda2-98a148e09b20" containerName="proxy-httpd" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.077278 4810 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f2594f4d-4c2f-4fc1-bda2-98a148e09b20" containerName="proxy-httpd" Feb 19 15:32:06 crc kubenswrapper[4810]: E0219 15:32:06.077497 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2594f4d-4c2f-4fc1-bda2-98a148e09b20" containerName="sg-core" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.077514 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2594f4d-4c2f-4fc1-bda2-98a148e09b20" containerName="sg-core" Feb 19 15:32:06 crc kubenswrapper[4810]: E0219 15:32:06.077564 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2594f4d-4c2f-4fc1-bda2-98a148e09b20" containerName="ceilometer-notification-agent" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.077572 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2594f4d-4c2f-4fc1-bda2-98a148e09b20" containerName="ceilometer-notification-agent" Feb 19 15:32:06 crc kubenswrapper[4810]: E0219 15:32:06.077585 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2594f4d-4c2f-4fc1-bda2-98a148e09b20" containerName="ceilometer-central-agent" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.077592 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2594f4d-4c2f-4fc1-bda2-98a148e09b20" containerName="ceilometer-central-agent" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.077861 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2594f4d-4c2f-4fc1-bda2-98a148e09b20" containerName="proxy-httpd" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.077881 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2594f4d-4c2f-4fc1-bda2-98a148e09b20" containerName="ceilometer-central-agent" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.077893 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2594f4d-4c2f-4fc1-bda2-98a148e09b20" containerName="sg-core" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.077909 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2594f4d-4c2f-4fc1-bda2-98a148e09b20" containerName="ceilometer-notification-agent" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.079936 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.084803 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.085033 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.095757 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.123416 4810 scope.go:117] "RemoveContainer" containerID="d89267d6dac559d2a521953fb65466a02a167ef0f42f9337d1bc8f64f8e801a5" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.126722 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.165468 4810 scope.go:117] "RemoveContainer" containerID="b146e7e942b25643f6a8f23e7901f7c71009b2241073e62495a7a1015b18112c" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.204258 4810 scope.go:117] "RemoveContainer" containerID="9125ed9339214110f69897810b0e48bb3128158f2c4b9ccb9353a19533ba604d" Feb 19 15:32:06 crc kubenswrapper[4810]: E0219 15:32:06.205088 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9125ed9339214110f69897810b0e48bb3128158f2c4b9ccb9353a19533ba604d\": container with ID starting with 9125ed9339214110f69897810b0e48bb3128158f2c4b9ccb9353a19533ba604d not found: ID does not exist" containerID="9125ed9339214110f69897810b0e48bb3128158f2c4b9ccb9353a19533ba604d" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.205134 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9125ed9339214110f69897810b0e48bb3128158f2c4b9ccb9353a19533ba604d"} err="failed to get container status \"9125ed9339214110f69897810b0e48bb3128158f2c4b9ccb9353a19533ba604d\": rpc error: code = NotFound desc = could not find container \"9125ed9339214110f69897810b0e48bb3128158f2c4b9ccb9353a19533ba604d\": container with ID starting with 9125ed9339214110f69897810b0e48bb3128158f2c4b9ccb9353a19533ba604d not found: ID does not exist" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.205154 4810 scope.go:117] "RemoveContainer" containerID="2267682407e573c3802691ada945d211fd7e4b15c6d2e2d22cf717dea7754b85" Feb 19 15:32:06 crc kubenswrapper[4810]: E0219 15:32:06.206353 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2267682407e573c3802691ada945d211fd7e4b15c6d2e2d22cf717dea7754b85\": container with ID starting with 2267682407e573c3802691ada945d211fd7e4b15c6d2e2d22cf717dea7754b85 not found: ID does not exist" containerID="2267682407e573c3802691ada945d211fd7e4b15c6d2e2d22cf717dea7754b85" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.206392 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2267682407e573c3802691ada945d211fd7e4b15c6d2e2d22cf717dea7754b85"} err="failed to get container status \"2267682407e573c3802691ada945d211fd7e4b15c6d2e2d22cf717dea7754b85\": rpc error: code = NotFound desc = could not find container \"2267682407e573c3802691ada945d211fd7e4b15c6d2e2d22cf717dea7754b85\": container with ID starting with 2267682407e573c3802691ada945d211fd7e4b15c6d2e2d22cf717dea7754b85 not found: ID does not exist" Feb 19 15:32:06 
crc kubenswrapper[4810]: I0219 15:32:06.206405 4810 scope.go:117] "RemoveContainer" containerID="d89267d6dac559d2a521953fb65466a02a167ef0f42f9337d1bc8f64f8e801a5" Feb 19 15:32:06 crc kubenswrapper[4810]: E0219 15:32:06.206779 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d89267d6dac559d2a521953fb65466a02a167ef0f42f9337d1bc8f64f8e801a5\": container with ID starting with d89267d6dac559d2a521953fb65466a02a167ef0f42f9337d1bc8f64f8e801a5 not found: ID does not exist" containerID="d89267d6dac559d2a521953fb65466a02a167ef0f42f9337d1bc8f64f8e801a5" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.206821 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d89267d6dac559d2a521953fb65466a02a167ef0f42f9337d1bc8f64f8e801a5"} err="failed to get container status \"d89267d6dac559d2a521953fb65466a02a167ef0f42f9337d1bc8f64f8e801a5\": rpc error: code = NotFound desc = could not find container \"d89267d6dac559d2a521953fb65466a02a167ef0f42f9337d1bc8f64f8e801a5\": container with ID starting with d89267d6dac559d2a521953fb65466a02a167ef0f42f9337d1bc8f64f8e801a5 not found: ID does not exist" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.206837 4810 scope.go:117] "RemoveContainer" containerID="b146e7e942b25643f6a8f23e7901f7c71009b2241073e62495a7a1015b18112c" Feb 19 15:32:06 crc kubenswrapper[4810]: E0219 15:32:06.207499 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b146e7e942b25643f6a8f23e7901f7c71009b2241073e62495a7a1015b18112c\": container with ID starting with b146e7e942b25643f6a8f23e7901f7c71009b2241073e62495a7a1015b18112c not found: ID does not exist" containerID="b146e7e942b25643f6a8f23e7901f7c71009b2241073e62495a7a1015b18112c" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.207558 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b146e7e942b25643f6a8f23e7901f7c71009b2241073e62495a7a1015b18112c"} err="failed to get container status \"b146e7e942b25643f6a8f23e7901f7c71009b2241073e62495a7a1015b18112c\": rpc error: code = NotFound desc = could not find container \"b146e7e942b25643f6a8f23e7901f7c71009b2241073e62495a7a1015b18112c\": container with ID starting with b146e7e942b25643f6a8f23e7901f7c71009b2241073e62495a7a1015b18112c not found: ID does not exist" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.254789 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7d91a4d-5b61-404e-a58b-cb426722f883-log-httpd\") pod \"ceilometer-0\" (UID: \"e7d91a4d-5b61-404e-a58b-cb426722f883\") " pod="openstack/ceilometer-0" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.254979 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7d91a4d-5b61-404e-a58b-cb426722f883-run-httpd\") pod \"ceilometer-0\" (UID: \"e7d91a4d-5b61-404e-a58b-cb426722f883\") " pod="openstack/ceilometer-0" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.255032 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js5t4\" (UniqueName: \"kubernetes.io/projected/e7d91a4d-5b61-404e-a58b-cb426722f883-kube-api-access-js5t4\") pod \"ceilometer-0\" (UID: \"e7d91a4d-5b61-404e-a58b-cb426722f883\") " 
pod="openstack/ceilometer-0" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.255055 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d91a4d-5b61-404e-a58b-cb426722f883-scripts\") pod \"ceilometer-0\" (UID: \"e7d91a4d-5b61-404e-a58b-cb426722f883\") " pod="openstack/ceilometer-0" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.255208 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d91a4d-5b61-404e-a58b-cb426722f883-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e7d91a4d-5b61-404e-a58b-cb426722f883\") " pod="openstack/ceilometer-0" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.255383 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d91a4d-5b61-404e-a58b-cb426722f883-config-data\") pod \"ceilometer-0\" (UID: \"e7d91a4d-5b61-404e-a58b-cb426722f883\") " pod="openstack/ceilometer-0" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.255446 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7d91a4d-5b61-404e-a58b-cb426722f883-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e7d91a4d-5b61-404e-a58b-cb426722f883\") " pod="openstack/ceilometer-0" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.256028 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d91a4d-5b61-404e-a58b-cb426722f883-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e7d91a4d-5b61-404e-a58b-cb426722f883\") " pod="openstack/ceilometer-0" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.358083 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d91a4d-5b61-404e-a58b-cb426722f883-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e7d91a4d-5b61-404e-a58b-cb426722f883\") " pod="openstack/ceilometer-0" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.358183 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d91a4d-5b61-404e-a58b-cb426722f883-config-data\") pod \"ceilometer-0\" (UID: \"e7d91a4d-5b61-404e-a58b-cb426722f883\") " pod="openstack/ceilometer-0" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.358215 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7d91a4d-5b61-404e-a58b-cb426722f883-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e7d91a4d-5b61-404e-a58b-cb426722f883\") " pod="openstack/ceilometer-0" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.358298 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d91a4d-5b61-404e-a58b-cb426722f883-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e7d91a4d-5b61-404e-a58b-cb426722f883\") " pod="openstack/ceilometer-0" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.358365 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e7d91a4d-5b61-404e-a58b-cb426722f883-log-httpd\") pod \"ceilometer-0\" (UID: \"e7d91a4d-5b61-404e-a58b-cb426722f883\") " pod="openstack/ceilometer-0" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.358462 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7d91a4d-5b61-404e-a58b-cb426722f883-run-httpd\") pod \"ceilometer-0\" (UID: \"e7d91a4d-5b61-404e-a58b-cb426722f883\") " pod="openstack/ceilometer-0" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.358494 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js5t4\" (UniqueName: \"kubernetes.io/projected/e7d91a4d-5b61-404e-a58b-cb426722f883-kube-api-access-js5t4\") pod \"ceilometer-0\" (UID: \"e7d91a4d-5b61-404e-a58b-cb426722f883\") " pod="openstack/ceilometer-0" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.358515 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d91a4d-5b61-404e-a58b-cb426722f883-scripts\") pod \"ceilometer-0\" (UID: \"e7d91a4d-5b61-404e-a58b-cb426722f883\") " pod="openstack/ceilometer-0" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.359050 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7d91a4d-5b61-404e-a58b-cb426722f883-log-httpd\") pod \"ceilometer-0\" (UID: \"e7d91a4d-5b61-404e-a58b-cb426722f883\") " pod="openstack/ceilometer-0" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.359444 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7d91a4d-5b61-404e-a58b-cb426722f883-run-httpd\") pod \"ceilometer-0\" (UID: \"e7d91a4d-5b61-404e-a58b-cb426722f883\") " pod="openstack/ceilometer-0" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.361770 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d91a4d-5b61-404e-a58b-cb426722f883-config-data\") pod \"ceilometer-0\" (UID: \"e7d91a4d-5b61-404e-a58b-cb426722f883\") " pod="openstack/ceilometer-0" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.362311 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d91a4d-5b61-404e-a58b-cb426722f883-scripts\") pod \"ceilometer-0\" (UID: \"e7d91a4d-5b61-404e-a58b-cb426722f883\") " pod="openstack/ceilometer-0" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.362670 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7d91a4d-5b61-404e-a58b-cb426722f883-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e7d91a4d-5b61-404e-a58b-cb426722f883\") " pod="openstack/ceilometer-0" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.362843 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d91a4d-5b61-404e-a58b-cb426722f883-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e7d91a4d-5b61-404e-a58b-cb426722f883\") " pod="openstack/ceilometer-0" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.369386 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d91a4d-5b61-404e-a58b-cb426722f883-ceilometer-tls-certs\") pod 
\"ceilometer-0\" (UID: \"e7d91a4d-5b61-404e-a58b-cb426722f883\") " pod="openstack/ceilometer-0" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.380054 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js5t4\" (UniqueName: \"kubernetes.io/projected/e7d91a4d-5b61-404e-a58b-cb426722f883-kube-api-access-js5t4\") pod \"ceilometer-0\" (UID: \"e7d91a4d-5b61-404e-a58b-cb426722f883\") " pod="openstack/ceilometer-0" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.455856 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 15:32:06 crc kubenswrapper[4810]: I0219 15:32:06.951284 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 15:32:07 crc kubenswrapper[4810]: I0219 15:32:07.024209 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1120f3c4-1323-4ffe-8798-b15e58615278","Type":"ContainerStarted","Data":"83dcf28c966753013525c88cc66000052f7461b478620e9cca2052eda9edb662"} Feb 19 15:32:07 crc kubenswrapper[4810]: I0219 15:32:07.025204 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1120f3c4-1323-4ffe-8798-b15e58615278","Type":"ContainerStarted","Data":"bfdb849b75abf04adb5f644e93c3f65364ac03de4b6998e5eb22cf3fb4118b78"} Feb 19 15:32:07 crc kubenswrapper[4810]: I0219 15:32:07.032042 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7d91a4d-5b61-404e-a58b-cb426722f883","Type":"ContainerStarted","Data":"61cf75ecf53b55c33c98212c5ad0ab385ad4652e0c69073e9d20e36a8f190b5e"} Feb 19 15:32:07 crc kubenswrapper[4810]: I0219 15:32:07.057791 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.057767812 podStartE2EDuration="2.057767812s" podCreationTimestamp="2026-02-19 15:32:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:32:07.050368364 +0000 UTC m=+1356.532398508" watchObservedRunningTime="2026-02-19 15:32:07.057767812 +0000 UTC m=+1356.539797936" Feb 19 15:32:07 crc kubenswrapper[4810]: I0219 15:32:07.284943 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:32:07 crc kubenswrapper[4810]: I0219 15:32:07.320102 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:32:07 crc kubenswrapper[4810]: I0219 15:32:07.451677 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2594f4d-4c2f-4fc1-bda2-98a148e09b20" path="/var/lib/kubelet/pods/f2594f4d-4c2f-4fc1-bda2-98a148e09b20/volumes" Feb 19 15:32:08 crc kubenswrapper[4810]: I0219 15:32:08.044797 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7d91a4d-5b61-404e-a58b-cb426722f883","Type":"ContainerStarted","Data":"ac6990e103b1b508ec962576c241925fc6ac704d611478cd1f6d2a4840873610"} Feb 19 15:32:08 crc kubenswrapper[4810]: I0219 15:32:08.045147 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7d91a4d-5b61-404e-a58b-cb426722f883","Type":"ContainerStarted","Data":"e65dc6917279ed73ebbc95d0132561f41dffb02e119473230cd2e9f6204ca6ba"} Feb 19 15:32:08 crc kubenswrapper[4810]: I0219 15:32:08.063455 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 19 15:32:08 crc kubenswrapper[4810]: I0219 15:32:08.204281 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-hf7qg"] Feb 19 15:32:08 crc kubenswrapper[4810]: I0219 15:32:08.205637 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hf7qg" Feb 19 15:32:08 crc kubenswrapper[4810]: I0219 15:32:08.207685 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 19 15:32:08 crc kubenswrapper[4810]: I0219 15:32:08.207844 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 19 15:32:08 crc kubenswrapper[4810]: I0219 15:32:08.223374 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hf7qg"] Feb 19 15:32:08 crc kubenswrapper[4810]: I0219 15:32:08.297036 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cftj\" (UniqueName: \"kubernetes.io/projected/f834f671-3add-4bfc-8152-596d66e90f22-kube-api-access-2cftj\") pod \"nova-cell1-cell-mapping-hf7qg\" (UID: \"f834f671-3add-4bfc-8152-596d66e90f22\") " pod="openstack/nova-cell1-cell-mapping-hf7qg" Feb 19 15:32:08 crc kubenswrapper[4810]: I0219 15:32:08.297132 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f834f671-3add-4bfc-8152-596d66e90f22-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hf7qg\" (UID: \"f834f671-3add-4bfc-8152-596d66e90f22\") " pod="openstack/nova-cell1-cell-mapping-hf7qg" Feb 19 15:32:08 crc kubenswrapper[4810]: I0219 15:32:08.297173 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f834f671-3add-4bfc-8152-596d66e90f22-scripts\") pod \"nova-cell1-cell-mapping-hf7qg\" (UID: \"f834f671-3add-4bfc-8152-596d66e90f22\") " pod="openstack/nova-cell1-cell-mapping-hf7qg" Feb 19 15:32:08 crc kubenswrapper[4810]: I0219 15:32:08.297241 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f834f671-3add-4bfc-8152-596d66e90f22-config-data\") pod \"nova-cell1-cell-mapping-hf7qg\" (UID: \"f834f671-3add-4bfc-8152-596d66e90f22\") " pod="openstack/nova-cell1-cell-mapping-hf7qg" Feb 19 15:32:08 crc kubenswrapper[4810]: I0219 15:32:08.398712 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f834f671-3add-4bfc-8152-596d66e90f22-config-data\") pod \"nova-cell1-cell-mapping-hf7qg\" (UID: \"f834f671-3add-4bfc-8152-596d66e90f22\") " pod="openstack/nova-cell1-cell-mapping-hf7qg" Feb 19 15:32:08 crc kubenswrapper[4810]: I0219 15:32:08.398818 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cftj\" (UniqueName: \"kubernetes.io/projected/f834f671-3add-4bfc-8152-596d66e90f22-kube-api-access-2cftj\") pod \"nova-cell1-cell-mapping-hf7qg\" (UID: \"f834f671-3add-4bfc-8152-596d66e90f22\") " pod="openstack/nova-cell1-cell-mapping-hf7qg" Feb 19 15:32:08 crc kubenswrapper[4810]: I0219 15:32:08.398881 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f834f671-3add-4bfc-8152-596d66e90f22-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hf7qg\" (UID: \"f834f671-3add-4bfc-8152-596d66e90f22\") " pod="openstack/nova-cell1-cell-mapping-hf7qg" Feb 19 15:32:08 crc kubenswrapper[4810]: I0219 15:32:08.398917 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f834f671-3add-4bfc-8152-596d66e90f22-scripts\") pod \"nova-cell1-cell-mapping-hf7qg\" (UID: \"f834f671-3add-4bfc-8152-596d66e90f22\") " pod="openstack/nova-cell1-cell-mapping-hf7qg" Feb 19 15:32:08 crc kubenswrapper[4810]: I0219 15:32:08.403893 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f834f671-3add-4bfc-8152-596d66e90f22-config-data\") pod \"nova-cell1-cell-mapping-hf7qg\" (UID: \"f834f671-3add-4bfc-8152-596d66e90f22\") " pod="openstack/nova-cell1-cell-mapping-hf7qg" Feb 19 15:32:08 crc kubenswrapper[4810]: I0219 15:32:08.404067 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f834f671-3add-4bfc-8152-596d66e90f22-scripts\") pod \"nova-cell1-cell-mapping-hf7qg\" (UID: \"f834f671-3add-4bfc-8152-596d66e90f22\") " pod="openstack/nova-cell1-cell-mapping-hf7qg" Feb 19 15:32:08 crc kubenswrapper[4810]: I0219 15:32:08.404555 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f834f671-3add-4bfc-8152-596d66e90f22-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hf7qg\" (UID: \"f834f671-3add-4bfc-8152-596d66e90f22\") " pod="openstack/nova-cell1-cell-mapping-hf7qg" Feb 19 15:32:08 crc kubenswrapper[4810]: I0219 15:32:08.420717 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cftj\" (UniqueName: \"kubernetes.io/projected/f834f671-3add-4bfc-8152-596d66e90f22-kube-api-access-2cftj\") pod \"nova-cell1-cell-mapping-hf7qg\" (UID: \"f834f671-3add-4bfc-8152-596d66e90f22\") " pod="openstack/nova-cell1-cell-mapping-hf7qg" Feb 19 15:32:08 crc kubenswrapper[4810]: I0219 15:32:08.523694 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hf7qg" Feb 19 15:32:08 crc kubenswrapper[4810]: W0219 15:32:08.995009 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf834f671_3add_4bfc_8152_596d66e90f22.slice/crio-fed110f1ba52d2ca42d9ab496f60368740e3e263fc2110696241568bca816238 WatchSource:0}: Error finding container fed110f1ba52d2ca42d9ab496f60368740e3e263fc2110696241568bca816238: Status 404 returned error can't find the container with id fed110f1ba52d2ca42d9ab496f60368740e3e263fc2110696241568bca816238 Feb 19 15:32:08 crc kubenswrapper[4810]: I0219 15:32:08.998853 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hf7qg"] Feb 19 15:32:09 crc kubenswrapper[4810]: I0219 15:32:09.065462 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hf7qg" event={"ID":"f834f671-3add-4bfc-8152-596d66e90f22","Type":"ContainerStarted","Data":"fed110f1ba52d2ca42d9ab496f60368740e3e263fc2110696241568bca816238"} Feb 19 15:32:09 crc kubenswrapper[4810]: I0219 15:32:09.069495 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7d91a4d-5b61-404e-a58b-cb426722f883","Type":"ContainerStarted","Data":"6620634cbef3df55485b66447dcd8e726560a5938873cf32536f20c8de70ee50"} Feb 19 15:32:09 crc kubenswrapper[4810]: I0219 15:32:09.417554 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" Feb 19 15:32:09 crc kubenswrapper[4810]: I0219 15:32:09.541805 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cc76f8d79-b9r9k"] Feb 19 15:32:09 crc kubenswrapper[4810]: I0219 15:32:09.542494 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" podUID="d13abd15-5b9b-4e00-984f-9dabbe51ddbc" containerName="dnsmasq-dns" containerID="cri-o://968b59a59d30b2d92d2973ce319917f2d174a7454d4003dff6b2c557a24c3a76" gracePeriod=10 Feb 19 15:32:10 crc kubenswrapper[4810]: I0219 15:32:10.086313 4810 generic.go:334] "Generic (PLEG): container finished" podID="d13abd15-5b9b-4e00-984f-9dabbe51ddbc" containerID="968b59a59d30b2d92d2973ce319917f2d174a7454d4003dff6b2c557a24c3a76" exitCode=0 Feb 19 15:32:10 crc kubenswrapper[4810]: I0219 15:32:10.086524 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" event={"ID":"d13abd15-5b9b-4e00-984f-9dabbe51ddbc","Type":"ContainerDied","Data":"968b59a59d30b2d92d2973ce319917f2d174a7454d4003dff6b2c557a24c3a76"} Feb 19 15:32:10 crc kubenswrapper[4810]: I0219 15:32:10.097604 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hf7qg" event={"ID":"f834f671-3add-4bfc-8152-596d66e90f22","Type":"ContainerStarted","Data":"eb023c9cd6a803467c474cff2a48a3d6536859d9cdbc3785ab4eb9814aa6c925"} Feb 19 15:32:10 crc kubenswrapper[4810]: I0219 15:32:10.122519 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-hf7qg" podStartSLOduration=2.122503524 podStartE2EDuration="2.122503524s" podCreationTimestamp="2026-02-19 15:32:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:32:10.121436057 +0000 UTC m=+1359.603466181" watchObservedRunningTime="2026-02-19 15:32:10.122503524 +0000 UTC m=+1359.604533648" 
Feb 19 15:32:10 crc kubenswrapper[4810]: I0219 15:32:10.193858 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" Feb 19 15:32:10 crc kubenswrapper[4810]: I0219 15:32:10.236749 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-config\") pod \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\" (UID: \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\") " Feb 19 15:32:10 crc kubenswrapper[4810]: I0219 15:32:10.236805 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-dns-swift-storage-0\") pod \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\" (UID: \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\") " Feb 19 15:32:10 crc kubenswrapper[4810]: I0219 15:32:10.236997 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-dns-svc\") pod \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\" (UID: \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\") " Feb 19 15:32:10 crc kubenswrapper[4810]: I0219 15:32:10.237051 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-ovsdbserver-sb\") pod \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\" (UID: \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\") " Feb 19 15:32:10 crc kubenswrapper[4810]: I0219 15:32:10.237121 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-ovsdbserver-nb\") pod \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\" (UID: \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\") " Feb 19 15:32:10 crc kubenswrapper[4810]: I0219 15:32:10.237180 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdf9n\" (UniqueName: \"kubernetes.io/projected/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-kube-api-access-fdf9n\") pod \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\" (UID: \"d13abd15-5b9b-4e00-984f-9dabbe51ddbc\") " Feb 19 15:32:10 crc kubenswrapper[4810]: I0219 15:32:10.269550 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-kube-api-access-fdf9n" (OuterVolumeSpecName: "kube-api-access-fdf9n") pod "d13abd15-5b9b-4e00-984f-9dabbe51ddbc" (UID: "d13abd15-5b9b-4e00-984f-9dabbe51ddbc"). InnerVolumeSpecName "kube-api-access-fdf9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:32:10 crc kubenswrapper[4810]: I0219 15:32:10.306569 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d13abd15-5b9b-4e00-984f-9dabbe51ddbc" (UID: "d13abd15-5b9b-4e00-984f-9dabbe51ddbc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:32:10 crc kubenswrapper[4810]: I0219 15:32:10.314160 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d13abd15-5b9b-4e00-984f-9dabbe51ddbc" (UID: "d13abd15-5b9b-4e00-984f-9dabbe51ddbc"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:32:10 crc kubenswrapper[4810]: I0219 15:32:10.320550 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-config" (OuterVolumeSpecName: "config") pod "d13abd15-5b9b-4e00-984f-9dabbe51ddbc" (UID: "d13abd15-5b9b-4e00-984f-9dabbe51ddbc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:32:10 crc kubenswrapper[4810]: I0219 15:32:10.340295 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:10 crc kubenswrapper[4810]: I0219 15:32:10.340406 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:10 crc kubenswrapper[4810]: I0219 15:32:10.340420 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdf9n\" (UniqueName: \"kubernetes.io/projected/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-kube-api-access-fdf9n\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:10 crc kubenswrapper[4810]: I0219 15:32:10.340452 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:10 crc kubenswrapper[4810]: I0219 15:32:10.342959 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d13abd15-5b9b-4e00-984f-9dabbe51ddbc" (UID: "d13abd15-5b9b-4e00-984f-9dabbe51ddbc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:32:10 crc kubenswrapper[4810]: I0219 15:32:10.356987 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d13abd15-5b9b-4e00-984f-9dabbe51ddbc" (UID: "d13abd15-5b9b-4e00-984f-9dabbe51ddbc"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:32:10 crc kubenswrapper[4810]: I0219 15:32:10.442469 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:10 crc kubenswrapper[4810]: I0219 15:32:10.442504 4810 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d13abd15-5b9b-4e00-984f-9dabbe51ddbc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:11 crc kubenswrapper[4810]: I0219 15:32:11.109313 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7d91a4d-5b61-404e-a58b-cb426722f883","Type":"ContainerStarted","Data":"0648552c36d1af1b785b524b2a94ea4f2718e1cc9b77f90f7db7a36e665bc3ad"} Feb 19 15:32:11 crc kubenswrapper[4810]: I0219 15:32:11.109481 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 15:32:11 crc kubenswrapper[4810]: I0219 15:32:11.114376 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" Feb 19 15:32:11 crc kubenswrapper[4810]: I0219 15:32:11.114429 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cc76f8d79-b9r9k" event={"ID":"d13abd15-5b9b-4e00-984f-9dabbe51ddbc","Type":"ContainerDied","Data":"3782f704a1deb319907b16927aac1f196131400ef56fc6f3b4943495c1de3d0b"} Feb 19 15:32:11 crc kubenswrapper[4810]: I0219 15:32:11.114496 4810 scope.go:117] "RemoveContainer" containerID="968b59a59d30b2d92d2973ce319917f2d174a7454d4003dff6b2c557a24c3a76" Feb 19 15:32:11 crc kubenswrapper[4810]: I0219 15:32:11.162841 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.25907054 podStartE2EDuration="5.162811315s" podCreationTimestamp="2026-02-19 15:32:06 +0000 UTC" firstStartedPulling="2026-02-19 15:32:06.953846308 +0000 UTC m=+1356.435876432" lastFinishedPulling="2026-02-19 15:32:09.857587083 +0000 UTC m=+1359.339617207" observedRunningTime="2026-02-19 15:32:11.138485516 +0000 UTC m=+1360.620515650" watchObservedRunningTime="2026-02-19 15:32:11.162811315 +0000 UTC m=+1360.644841439" Feb 19 15:32:11 crc kubenswrapper[4810]: I0219 15:32:11.166487 4810 scope.go:117] "RemoveContainer" containerID="f762a8738d3f47e401158c999773d4f19a8cf6bd6c7936ab82cb0c741248ad3e" Feb 19 15:32:11 crc kubenswrapper[4810]: I0219 15:32:11.177230 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cc76f8d79-b9r9k"] Feb 19 15:32:11 crc kubenswrapper[4810]: I0219 15:32:11.187704 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cc76f8d79-b9r9k"] Feb 19 15:32:11 crc kubenswrapper[4810]: I0219 15:32:11.453277 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d13abd15-5b9b-4e00-984f-9dabbe51ddbc" path="/var/lib/kubelet/pods/d13abd15-5b9b-4e00-984f-9dabbe51ddbc/volumes" Feb 19 15:32:15 crc kubenswrapper[4810]: I0219 15:32:15.159771 4810 generic.go:334] "Generic (PLEG): container finished" podID="f834f671-3add-4bfc-8152-596d66e90f22" containerID="eb023c9cd6a803467c474cff2a48a3d6536859d9cdbc3785ab4eb9814aa6c925" exitCode=0 Feb 19 15:32:15 crc kubenswrapper[4810]: I0219 15:32:15.159891 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hf7qg" 
event={"ID":"f834f671-3add-4bfc-8152-596d66e90f22","Type":"ContainerDied","Data":"eb023c9cd6a803467c474cff2a48a3d6536859d9cdbc3785ab4eb9814aa6c925"} Feb 19 15:32:15 crc kubenswrapper[4810]: I0219 15:32:15.466508 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 15:32:15 crc kubenswrapper[4810]: I0219 15:32:15.466553 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 15:32:16 crc kubenswrapper[4810]: I0219 15:32:16.481642 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1120f3c4-1323-4ffe-8798-b15e58615278" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.228:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 15:32:16 crc kubenswrapper[4810]: I0219 15:32:16.481675 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1120f3c4-1323-4ffe-8798-b15e58615278" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.228:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 15:32:16 crc kubenswrapper[4810]: I0219 15:32:16.570013 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hf7qg" Feb 19 15:32:16 crc kubenswrapper[4810]: I0219 15:32:16.664892 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f834f671-3add-4bfc-8152-596d66e90f22-config-data\") pod \"f834f671-3add-4bfc-8152-596d66e90f22\" (UID: \"f834f671-3add-4bfc-8152-596d66e90f22\") " Feb 19 15:32:16 crc kubenswrapper[4810]: I0219 15:32:16.664971 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f834f671-3add-4bfc-8152-596d66e90f22-scripts\") pod \"f834f671-3add-4bfc-8152-596d66e90f22\" (UID: \"f834f671-3add-4bfc-8152-596d66e90f22\") " Feb 19 15:32:16 crc kubenswrapper[4810]: I0219 15:32:16.665073 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f834f671-3add-4bfc-8152-596d66e90f22-combined-ca-bundle\") pod \"f834f671-3add-4bfc-8152-596d66e90f22\" (UID: \"f834f671-3add-4bfc-8152-596d66e90f22\") " Feb 19 15:32:16 crc kubenswrapper[4810]: I0219 15:32:16.665169 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cftj\" (UniqueName: \"kubernetes.io/projected/f834f671-3add-4bfc-8152-596d66e90f22-kube-api-access-2cftj\") pod \"f834f671-3add-4bfc-8152-596d66e90f22\" (UID: \"f834f671-3add-4bfc-8152-596d66e90f22\") " Feb 19 15:32:16 crc kubenswrapper[4810]: I0219 15:32:16.670526 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f834f671-3add-4bfc-8152-596d66e90f22-kube-api-access-2cftj" (OuterVolumeSpecName: "kube-api-access-2cftj") pod "f834f671-3add-4bfc-8152-596d66e90f22" (UID: "f834f671-3add-4bfc-8152-596d66e90f22"). InnerVolumeSpecName "kube-api-access-2cftj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:32:16 crc kubenswrapper[4810]: I0219 15:32:16.677543 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f834f671-3add-4bfc-8152-596d66e90f22-scripts" (OuterVolumeSpecName: "scripts") pod "f834f671-3add-4bfc-8152-596d66e90f22" (UID: "f834f671-3add-4bfc-8152-596d66e90f22"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:32:16 crc kubenswrapper[4810]: I0219 15:32:16.701439 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f834f671-3add-4bfc-8152-596d66e90f22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f834f671-3add-4bfc-8152-596d66e90f22" (UID: "f834f671-3add-4bfc-8152-596d66e90f22"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:32:16 crc kubenswrapper[4810]: I0219 15:32:16.707519 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f834f671-3add-4bfc-8152-596d66e90f22-config-data" (OuterVolumeSpecName: "config-data") pod "f834f671-3add-4bfc-8152-596d66e90f22" (UID: "f834f671-3add-4bfc-8152-596d66e90f22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:32:16 crc kubenswrapper[4810]: I0219 15:32:16.779871 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f834f671-3add-4bfc-8152-596d66e90f22-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:16 crc kubenswrapper[4810]: I0219 15:32:16.779903 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f834f671-3add-4bfc-8152-596d66e90f22-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:16 crc kubenswrapper[4810]: I0219 15:32:16.779913 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f834f671-3add-4bfc-8152-596d66e90f22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:16 crc kubenswrapper[4810]: I0219 15:32:16.779923 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cftj\" (UniqueName: \"kubernetes.io/projected/f834f671-3add-4bfc-8152-596d66e90f22-kube-api-access-2cftj\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:17 crc kubenswrapper[4810]: I0219 15:32:17.196649 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hf7qg" event={"ID":"f834f671-3add-4bfc-8152-596d66e90f22","Type":"ContainerDied","Data":"fed110f1ba52d2ca42d9ab496f60368740e3e263fc2110696241568bca816238"} Feb 19 15:32:17 crc kubenswrapper[4810]: I0219 15:32:17.197045 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fed110f1ba52d2ca42d9ab496f60368740e3e263fc2110696241568bca816238" Feb 19 15:32:17 crc kubenswrapper[4810]: I0219 15:32:17.196770 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hf7qg" Feb 19 15:32:17 crc kubenswrapper[4810]: I0219 15:32:17.376199 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 15:32:17 crc kubenswrapper[4810]: I0219 15:32:17.376482 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1120f3c4-1323-4ffe-8798-b15e58615278" containerName="nova-api-log" containerID="cri-o://bfdb849b75abf04adb5f644e93c3f65364ac03de4b6998e5eb22cf3fb4118b78" gracePeriod=30 Feb 19 15:32:17 crc kubenswrapper[4810]: I0219 15:32:17.376827 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1120f3c4-1323-4ffe-8798-b15e58615278" containerName="nova-api-api" containerID="cri-o://83dcf28c966753013525c88cc66000052f7461b478620e9cca2052eda9edb662" gracePeriod=30 Feb 19 15:32:17 crc kubenswrapper[4810]: I0219 15:32:17.419592 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 15:32:17 crc kubenswrapper[4810]: I0219 15:32:17.419817 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="9776a876-19db-446c-a7bf-d6fe0111d7b8" containerName="nova-scheduler-scheduler" containerID="cri-o://9050cf4664528bc43d0a9883f3bb240ba2cdcad9a8907febfcebd4f66322d6a0" gracePeriod=30 Feb 19 15:32:17 crc kubenswrapper[4810]: I0219 15:32:17.466967 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 15:32:17 crc kubenswrapper[4810]: I0219 15:32:17.467359 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="deba4978-0921-4c62-9452-9a47fe24feb7" containerName="nova-metadata-metadata" containerID="cri-o://3cd69082a7ac9af779601af9a54ce467c63853d8132eb18f4aa9ff5eb2315f95" gracePeriod=30 Feb 19 15:32:17 crc kubenswrapper[4810]: I0219 15:32:17.467549 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="deba4978-0921-4c62-9452-9a47fe24feb7" containerName="nova-metadata-log" containerID="cri-o://acc612b198e2bb735041a74b3bf438f971ed76411bde0e76a3b0dd4d3a920c17" gracePeriod=30 Feb 19 15:32:18 crc kubenswrapper[4810]: I0219 15:32:18.208866 4810 generic.go:334] "Generic (PLEG): container finished" podID="deba4978-0921-4c62-9452-9a47fe24feb7" containerID="acc612b198e2bb735041a74b3bf438f971ed76411bde0e76a3b0dd4d3a920c17" exitCode=143 Feb 19 15:32:18 crc kubenswrapper[4810]: I0219 15:32:18.208958 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"deba4978-0921-4c62-9452-9a47fe24feb7","Type":"ContainerDied","Data":"acc612b198e2bb735041a74b3bf438f971ed76411bde0e76a3b0dd4d3a920c17"} Feb 19 15:32:18 crc kubenswrapper[4810]: I0219 15:32:18.212075 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1120f3c4-1323-4ffe-8798-b15e58615278","Type":"ContainerDied","Data":"bfdb849b75abf04adb5f644e93c3f65364ac03de4b6998e5eb22cf3fb4118b78"} Feb 19 15:32:18 crc kubenswrapper[4810]: I0219 15:32:18.212098 4810 generic.go:334] "Generic (PLEG): container finished" podID="1120f3c4-1323-4ffe-8798-b15e58615278" containerID="bfdb849b75abf04adb5f644e93c3f65364ac03de4b6998e5eb22cf3fb4118b78" exitCode=143 Feb 19 15:32:18 crc kubenswrapper[4810]: I0219 15:32:18.214417 4810 generic.go:334] "Generic (PLEG): container finished" podID="9776a876-19db-446c-a7bf-d6fe0111d7b8" 
containerID="9050cf4664528bc43d0a9883f3bb240ba2cdcad9a8907febfcebd4f66322d6a0" exitCode=0 Feb 19 15:32:18 crc kubenswrapper[4810]: I0219 15:32:18.214447 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9776a876-19db-446c-a7bf-d6fe0111d7b8","Type":"ContainerDied","Data":"9050cf4664528bc43d0a9883f3bb240ba2cdcad9a8907febfcebd4f66322d6a0"} Feb 19 15:32:18 crc kubenswrapper[4810]: I0219 15:32:18.452027 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 15:32:18 crc kubenswrapper[4810]: I0219 15:32:18.511128 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9776a876-19db-446c-a7bf-d6fe0111d7b8-config-data\") pod \"9776a876-19db-446c-a7bf-d6fe0111d7b8\" (UID: \"9776a876-19db-446c-a7bf-d6fe0111d7b8\") " Feb 19 15:32:18 crc kubenswrapper[4810]: I0219 15:32:18.511198 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9776a876-19db-446c-a7bf-d6fe0111d7b8-combined-ca-bundle\") pod \"9776a876-19db-446c-a7bf-d6fe0111d7b8\" (UID: \"9776a876-19db-446c-a7bf-d6fe0111d7b8\") " Feb 19 15:32:18 crc kubenswrapper[4810]: I0219 15:32:18.511317 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4rpb\" (UniqueName: \"kubernetes.io/projected/9776a876-19db-446c-a7bf-d6fe0111d7b8-kube-api-access-f4rpb\") pod \"9776a876-19db-446c-a7bf-d6fe0111d7b8\" (UID: \"9776a876-19db-446c-a7bf-d6fe0111d7b8\") " Feb 19 15:32:18 crc kubenswrapper[4810]: I0219 15:32:18.519666 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9776a876-19db-446c-a7bf-d6fe0111d7b8-kube-api-access-f4rpb" (OuterVolumeSpecName: "kube-api-access-f4rpb") pod "9776a876-19db-446c-a7bf-d6fe0111d7b8" (UID: "9776a876-19db-446c-a7bf-d6fe0111d7b8"). InnerVolumeSpecName "kube-api-access-f4rpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:32:18 crc kubenswrapper[4810]: I0219 15:32:18.557700 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9776a876-19db-446c-a7bf-d6fe0111d7b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9776a876-19db-446c-a7bf-d6fe0111d7b8" (UID: "9776a876-19db-446c-a7bf-d6fe0111d7b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:32:18 crc kubenswrapper[4810]: I0219 15:32:18.604840 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9776a876-19db-446c-a7bf-d6fe0111d7b8-config-data" (OuterVolumeSpecName: "config-data") pod "9776a876-19db-446c-a7bf-d6fe0111d7b8" (UID: "9776a876-19db-446c-a7bf-d6fe0111d7b8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:32:18 crc kubenswrapper[4810]: I0219 15:32:18.613245 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9776a876-19db-446c-a7bf-d6fe0111d7b8-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:18 crc kubenswrapper[4810]: I0219 15:32:18.613275 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9776a876-19db-446c-a7bf-d6fe0111d7b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:18 crc kubenswrapper[4810]: I0219 15:32:18.613286 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4rpb\" (UniqueName: \"kubernetes.io/projected/9776a876-19db-446c-a7bf-d6fe0111d7b8-kube-api-access-f4rpb\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:18 crc kubenswrapper[4810]: I0219 15:32:18.824035 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 15:32:18 crc kubenswrapper[4810]: I0219 15:32:18.918567 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deba4978-0921-4c62-9452-9a47fe24feb7-combined-ca-bundle\") pod \"deba4978-0921-4c62-9452-9a47fe24feb7\" (UID: \"deba4978-0921-4c62-9452-9a47fe24feb7\") " Feb 19 15:32:18 crc kubenswrapper[4810]: I0219 15:32:18.918625 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/deba4978-0921-4c62-9452-9a47fe24feb7-nova-metadata-tls-certs\") pod \"deba4978-0921-4c62-9452-9a47fe24feb7\" (UID: \"deba4978-0921-4c62-9452-9a47fe24feb7\") " Feb 19 15:32:18 crc kubenswrapper[4810]: I0219 15:32:18.918752 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deba4978-0921-4c62-9452-9a47fe24feb7-config-data\") pod \"deba4978-0921-4c62-9452-9a47fe24feb7\" (UID: \"deba4978-0921-4c62-9452-9a47fe24feb7\") " Feb 19 15:32:18 crc kubenswrapper[4810]: I0219 15:32:18.918790 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/deba4978-0921-4c62-9452-9a47fe24feb7-logs\") pod \"deba4978-0921-4c62-9452-9a47fe24feb7\" (UID: \"deba4978-0921-4c62-9452-9a47fe24feb7\") " Feb 19 15:32:18 crc kubenswrapper[4810]: I0219 15:32:18.918868 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4dgr\" (UniqueName: \"kubernetes.io/projected/deba4978-0921-4c62-9452-9a47fe24feb7-kube-api-access-d4dgr\") pod \"deba4978-0921-4c62-9452-9a47fe24feb7\" (UID: \"deba4978-0921-4c62-9452-9a47fe24feb7\") " Feb 19 15:32:18 crc kubenswrapper[4810]: I0219 15:32:18.919747 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deba4978-0921-4c62-9452-9a47fe24feb7-logs" (OuterVolumeSpecName: "logs") pod "deba4978-0921-4c62-9452-9a47fe24feb7" (UID: "deba4978-0921-4c62-9452-9a47fe24feb7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:32:18 crc kubenswrapper[4810]: I0219 15:32:18.923161 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deba4978-0921-4c62-9452-9a47fe24feb7-kube-api-access-d4dgr" (OuterVolumeSpecName: "kube-api-access-d4dgr") pod "deba4978-0921-4c62-9452-9a47fe24feb7" (UID: "deba4978-0921-4c62-9452-9a47fe24feb7"). InnerVolumeSpecName "kube-api-access-d4dgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:32:18 crc kubenswrapper[4810]: I0219 15:32:18.943574 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deba4978-0921-4c62-9452-9a47fe24feb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "deba4978-0921-4c62-9452-9a47fe24feb7" (UID: "deba4978-0921-4c62-9452-9a47fe24feb7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:32:18 crc kubenswrapper[4810]: I0219 15:32:18.947736 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deba4978-0921-4c62-9452-9a47fe24feb7-config-data" (OuterVolumeSpecName: "config-data") pod "deba4978-0921-4c62-9452-9a47fe24feb7" (UID: "deba4978-0921-4c62-9452-9a47fe24feb7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:32:18 crc kubenswrapper[4810]: I0219 15:32:18.991650 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deba4978-0921-4c62-9452-9a47fe24feb7-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "deba4978-0921-4c62-9452-9a47fe24feb7" (UID: "deba4978-0921-4c62-9452-9a47fe24feb7"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.021592 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deba4978-0921-4c62-9452-9a47fe24feb7-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.021639 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/deba4978-0921-4c62-9452-9a47fe24feb7-logs\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.021650 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4dgr\" (UniqueName: \"kubernetes.io/projected/deba4978-0921-4c62-9452-9a47fe24feb7-kube-api-access-d4dgr\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.021661 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deba4978-0921-4c62-9452-9a47fe24feb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.021672 4810 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/deba4978-0921-4c62-9452-9a47fe24feb7-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.225197 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.225181 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9776a876-19db-446c-a7bf-d6fe0111d7b8","Type":"ContainerDied","Data":"e26af5d090a2b6422428d4303d703f7d22f7771910d627d6bbd17faed1d8906a"} Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.225523 4810 scope.go:117] "RemoveContainer" containerID="9050cf4664528bc43d0a9883f3bb240ba2cdcad9a8907febfcebd4f66322d6a0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.227815 4810 generic.go:334] "Generic (PLEG): container finished" podID="deba4978-0921-4c62-9452-9a47fe24feb7" containerID="3cd69082a7ac9af779601af9a54ce467c63853d8132eb18f4aa9ff5eb2315f95" exitCode=0 Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.227855 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"deba4978-0921-4c62-9452-9a47fe24feb7","Type":"ContainerDied","Data":"3cd69082a7ac9af779601af9a54ce467c63853d8132eb18f4aa9ff5eb2315f95"} Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.227884 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"deba4978-0921-4c62-9452-9a47fe24feb7","Type":"ContainerDied","Data":"9c0cc70373e1f5deebdb6c556b83e6477c32297bae250fc5ee8217259477409e"} Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.227937 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.255071 4810 scope.go:117] "RemoveContainer" containerID="3cd69082a7ac9af779601af9a54ce467c63853d8132eb18f4aa9ff5eb2315f95" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.266137 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.280804 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.288162 4810 scope.go:117] "RemoveContainer" containerID="acc612b198e2bb735041a74b3bf438f971ed76411bde0e76a3b0dd4d3a920c17" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.293312 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.311953 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 15:32:19 crc kubenswrapper[4810]: E0219 15:32:19.312572 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deba4978-0921-4c62-9452-9a47fe24feb7" containerName="nova-metadata-metadata" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.312600 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="deba4978-0921-4c62-9452-9a47fe24feb7" containerName="nova-metadata-metadata" Feb 19 15:32:19 crc kubenswrapper[4810]: E0219 15:32:19.312632 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f834f671-3add-4bfc-8152-596d66e90f22" containerName="nova-manage" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.312641 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f834f671-3add-4bfc-8152-596d66e90f22" containerName="nova-manage" Feb 19 15:32:19 crc kubenswrapper[4810]: E0219 15:32:19.312668 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deba4978-0921-4c62-9452-9a47fe24feb7" containerName="nova-metadata-log" Feb 19 15:32:19 crc 
kubenswrapper[4810]: I0219 15:32:19.312677 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="deba4978-0921-4c62-9452-9a47fe24feb7" containerName="nova-metadata-log" Feb 19 15:32:19 crc kubenswrapper[4810]: E0219 15:32:19.312689 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d13abd15-5b9b-4e00-984f-9dabbe51ddbc" containerName="init" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.312697 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13abd15-5b9b-4e00-984f-9dabbe51ddbc" containerName="init" Feb 19 15:32:19 crc kubenswrapper[4810]: E0219 15:32:19.312716 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9776a876-19db-446c-a7bf-d6fe0111d7b8" containerName="nova-scheduler-scheduler" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.312724 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9776a876-19db-446c-a7bf-d6fe0111d7b8" containerName="nova-scheduler-scheduler" Feb 19 15:32:19 crc kubenswrapper[4810]: E0219 15:32:19.312741 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d13abd15-5b9b-4e00-984f-9dabbe51ddbc" containerName="dnsmasq-dns" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.312749 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13abd15-5b9b-4e00-984f-9dabbe51ddbc" containerName="dnsmasq-dns" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.312983 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="d13abd15-5b9b-4e00-984f-9dabbe51ddbc" containerName="dnsmasq-dns" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.313015 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="9776a876-19db-446c-a7bf-d6fe0111d7b8" containerName="nova-scheduler-scheduler" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.313031 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f834f671-3add-4bfc-8152-596d66e90f22" containerName="nova-manage" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.313044 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="deba4978-0921-4c62-9452-9a47fe24feb7" containerName="nova-metadata-metadata" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.313058 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="deba4978-0921-4c62-9452-9a47fe24feb7" containerName="nova-metadata-log" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.314719 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.319726 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.319973 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.330899 4810 scope.go:117] "RemoveContainer" containerID="3cd69082a7ac9af779601af9a54ce467c63853d8132eb18f4aa9ff5eb2315f95" Feb 19 15:32:19 crc kubenswrapper[4810]: E0219 15:32:19.331440 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cd69082a7ac9af779601af9a54ce467c63853d8132eb18f4aa9ff5eb2315f95\": container with ID starting with 3cd69082a7ac9af779601af9a54ce467c63853d8132eb18f4aa9ff5eb2315f95 not found: ID does not exist" containerID="3cd69082a7ac9af779601af9a54ce467c63853d8132eb18f4aa9ff5eb2315f95" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.331476 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cd69082a7ac9af779601af9a54ce467c63853d8132eb18f4aa9ff5eb2315f95"} err="failed to get container status \"3cd69082a7ac9af779601af9a54ce467c63853d8132eb18f4aa9ff5eb2315f95\": rpc error: code = NotFound desc = could not find container \"3cd69082a7ac9af779601af9a54ce467c63853d8132eb18f4aa9ff5eb2315f95\": container with ID starting with 3cd69082a7ac9af779601af9a54ce467c63853d8132eb18f4aa9ff5eb2315f95 not found: ID does not exist" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.331502 4810 scope.go:117] "RemoveContainer" containerID="acc612b198e2bb735041a74b3bf438f971ed76411bde0e76a3b0dd4d3a920c17" Feb 19 15:32:19 crc kubenswrapper[4810]: E0219 15:32:19.331758 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acc612b198e2bb735041a74b3bf438f971ed76411bde0e76a3b0dd4d3a920c17\": container with ID starting with acc612b198e2bb735041a74b3bf438f971ed76411bde0e76a3b0dd4d3a920c17 not found: ID does not exist" containerID="acc612b198e2bb735041a74b3bf438f971ed76411bde0e76a3b0dd4d3a920c17" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.331784 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acc612b198e2bb735041a74b3bf438f971ed76411bde0e76a3b0dd4d3a920c17"} err="failed to get container status \"acc612b198e2bb735041a74b3bf438f971ed76411bde0e76a3b0dd4d3a920c17\": rpc error: code = NotFound desc = could not find container \"acc612b198e2bb735041a74b3bf438f971ed76411bde0e76a3b0dd4d3a920c17\": container with ID starting with acc612b198e2bb735041a74b3bf438f971ed76411bde0e76a3b0dd4d3a920c17 not found: ID does not exist" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.340361 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.347974 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.365962 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.367247 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.370003 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.375449 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.426839 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70a0a1b-ed2d-46f1-aeb9-a335de9b06d4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d70a0a1b-ed2d-46f1-aeb9-a335de9b06d4\") " pod="openstack/nova-scheduler-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.426910 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpmmr\" (UniqueName: \"kubernetes.io/projected/d70a0a1b-ed2d-46f1-aeb9-a335de9b06d4-kube-api-access-wpmmr\") pod \"nova-scheduler-0\" (UID: \"d70a0a1b-ed2d-46f1-aeb9-a335de9b06d4\") " pod="openstack/nova-scheduler-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.426947 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d70a0a1b-ed2d-46f1-aeb9-a335de9b06d4-config-data\") pod \"nova-scheduler-0\" (UID: \"d70a0a1b-ed2d-46f1-aeb9-a335de9b06d4\") " pod="openstack/nova-scheduler-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.426979 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f36ad344-e946-4221-892d-3ffe8fbdd59b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f36ad344-e946-4221-892d-3ffe8fbdd59b\") " pod="openstack/nova-metadata-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.427003 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f36ad344-e946-4221-892d-3ffe8fbdd59b-logs\") pod \"nova-metadata-0\" (UID: \"f36ad344-e946-4221-892d-3ffe8fbdd59b\") " pod="openstack/nova-metadata-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.427021 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nvfz\" (UniqueName: \"kubernetes.io/projected/f36ad344-e946-4221-892d-3ffe8fbdd59b-kube-api-access-8nvfz\") pod \"nova-metadata-0\" (UID: \"f36ad344-e946-4221-892d-3ffe8fbdd59b\") " pod="openstack/nova-metadata-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.427040 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36ad344-e946-4221-892d-3ffe8fbdd59b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f36ad344-e946-4221-892d-3ffe8fbdd59b\") " pod="openstack/nova-metadata-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.427270 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f36ad344-e946-4221-892d-3ffe8fbdd59b-config-data\") pod \"nova-metadata-0\" (UID: \"f36ad344-e946-4221-892d-3ffe8fbdd59b\") " pod="openstack/nova-metadata-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.462644 4810 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="9776a876-19db-446c-a7bf-d6fe0111d7b8" path="/var/lib/kubelet/pods/9776a876-19db-446c-a7bf-d6fe0111d7b8/volumes" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.464716 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deba4978-0921-4c62-9452-9a47fe24feb7" path="/var/lib/kubelet/pods/deba4978-0921-4c62-9452-9a47fe24feb7/volumes" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.530730 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70a0a1b-ed2d-46f1-aeb9-a335de9b06d4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d70a0a1b-ed2d-46f1-aeb9-a335de9b06d4\") " pod="openstack/nova-scheduler-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.530838 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpmmr\" (UniqueName: \"kubernetes.io/projected/d70a0a1b-ed2d-46f1-aeb9-a335de9b06d4-kube-api-access-wpmmr\") pod \"nova-scheduler-0\" (UID: \"d70a0a1b-ed2d-46f1-aeb9-a335de9b06d4\") " pod="openstack/nova-scheduler-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.530887 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d70a0a1b-ed2d-46f1-aeb9-a335de9b06d4-config-data\") pod \"nova-scheduler-0\" (UID: \"d70a0a1b-ed2d-46f1-aeb9-a335de9b06d4\") " pod="openstack/nova-scheduler-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.530932 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f36ad344-e946-4221-892d-3ffe8fbdd59b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f36ad344-e946-4221-892d-3ffe8fbdd59b\") " pod="openstack/nova-metadata-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.530966 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f36ad344-e946-4221-892d-3ffe8fbdd59b-logs\") pod \"nova-metadata-0\" (UID: \"f36ad344-e946-4221-892d-3ffe8fbdd59b\") " pod="openstack/nova-metadata-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.530992 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nvfz\" (UniqueName: \"kubernetes.io/projected/f36ad344-e946-4221-892d-3ffe8fbdd59b-kube-api-access-8nvfz\") pod \"nova-metadata-0\" (UID: \"f36ad344-e946-4221-892d-3ffe8fbdd59b\") " pod="openstack/nova-metadata-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.531035 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36ad344-e946-4221-892d-3ffe8fbdd59b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f36ad344-e946-4221-892d-3ffe8fbdd59b\") " pod="openstack/nova-metadata-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.531080 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f36ad344-e946-4221-892d-3ffe8fbdd59b-config-data\") pod \"nova-metadata-0\" (UID: \"f36ad344-e946-4221-892d-3ffe8fbdd59b\") " pod="openstack/nova-metadata-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.533989 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f36ad344-e946-4221-892d-3ffe8fbdd59b-logs\") pod \"nova-metadata-0\" (UID: \"f36ad344-e946-4221-892d-3ffe8fbdd59b\") " pod="openstack/nova-metadata-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.535961 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d70a0a1b-ed2d-46f1-aeb9-a335de9b06d4-config-data\") pod \"nova-scheduler-0\" (UID: \"d70a0a1b-ed2d-46f1-aeb9-a335de9b06d4\") " pod="openstack/nova-scheduler-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.536061 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36ad344-e946-4221-892d-3ffe8fbdd59b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f36ad344-e946-4221-892d-3ffe8fbdd59b\") " pod="openstack/nova-metadata-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.536527 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f36ad344-e946-4221-892d-3ffe8fbdd59b-config-data\") pod \"nova-metadata-0\" (UID: \"f36ad344-e946-4221-892d-3ffe8fbdd59b\") " pod="openstack/nova-metadata-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.537112 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.537148 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.537258 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70a0a1b-ed2d-46f1-aeb9-a335de9b06d4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d70a0a1b-ed2d-46f1-aeb9-a335de9b06d4\") " pod="openstack/nova-scheduler-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.539706 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f36ad344-e946-4221-892d-3ffe8fbdd59b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f36ad344-e946-4221-892d-3ffe8fbdd59b\") " pod="openstack/nova-metadata-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.550522 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nvfz\" (UniqueName: \"kubernetes.io/projected/f36ad344-e946-4221-892d-3ffe8fbdd59b-kube-api-access-8nvfz\") pod \"nova-metadata-0\" (UID: \"f36ad344-e946-4221-892d-3ffe8fbdd59b\") " pod="openstack/nova-metadata-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.551278 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpmmr\" (UniqueName: \"kubernetes.io/projected/d70a0a1b-ed2d-46f1-aeb9-a335de9b06d4-kube-api-access-wpmmr\") pod \"nova-scheduler-0\" (UID: \"d70a0a1b-ed2d-46f1-aeb9-a335de9b06d4\") " pod="openstack/nova-scheduler-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.645078 4810 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 15:32:19 crc kubenswrapper[4810]: I0219 15:32:19.688723 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 15:32:20 crc kubenswrapper[4810]: W0219 15:32:20.153907 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf36ad344_e946_4221_892d_3ffe8fbdd59b.slice/crio-c6351edcf2679df19703a7ceb7bb534a95f811a03ffe261463e4c0399447273a WatchSource:0}: Error finding container c6351edcf2679df19703a7ceb7bb534a95f811a03ffe261463e4c0399447273a: Status 404 returned error can't find the container with id c6351edcf2679df19703a7ceb7bb534a95f811a03ffe261463e4c0399447273a Feb 19 15:32:20 crc kubenswrapper[4810]: I0219 15:32:20.155083 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 15:32:20 crc kubenswrapper[4810]: I0219 15:32:20.241830 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 15:32:20 crc kubenswrapper[4810]: I0219 15:32:20.266042 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f36ad344-e946-4221-892d-3ffe8fbdd59b","Type":"ContainerStarted","Data":"c6351edcf2679df19703a7ceb7bb534a95f811a03ffe261463e4c0399447273a"} Feb 19 15:32:20 crc kubenswrapper[4810]: W0219 15:32:20.278279 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd70a0a1b_ed2d_46f1_aeb9_a335de9b06d4.slice/crio-5b618feec84668a7233e75d2d116b1fbd6762c44348b4c0d14b88c16ba912ebc WatchSource:0}: Error finding container 5b618feec84668a7233e75d2d116b1fbd6762c44348b4c0d14b88c16ba912ebc: Status 404 returned error can't find the container with id 5b618feec84668a7233e75d2d116b1fbd6762c44348b4c0d14b88c16ba912ebc Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.004784 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.070689 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1120f3c4-1323-4ffe-8798-b15e58615278-combined-ca-bundle\") pod \"1120f3c4-1323-4ffe-8798-b15e58615278\" (UID: \"1120f3c4-1323-4ffe-8798-b15e58615278\") " Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.070885 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfvq8\" (UniqueName: \"kubernetes.io/projected/1120f3c4-1323-4ffe-8798-b15e58615278-kube-api-access-bfvq8\") pod \"1120f3c4-1323-4ffe-8798-b15e58615278\" (UID: \"1120f3c4-1323-4ffe-8798-b15e58615278\") " Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.070982 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1120f3c4-1323-4ffe-8798-b15e58615278-logs\") pod \"1120f3c4-1323-4ffe-8798-b15e58615278\" (UID: \"1120f3c4-1323-4ffe-8798-b15e58615278\") " Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.071857 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1120f3c4-1323-4ffe-8798-b15e58615278-logs" (OuterVolumeSpecName: "logs") pod "1120f3c4-1323-4ffe-8798-b15e58615278" (UID: "1120f3c4-1323-4ffe-8798-b15e58615278"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.072045 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1120f3c4-1323-4ffe-8798-b15e58615278-public-tls-certs\") pod \"1120f3c4-1323-4ffe-8798-b15e58615278\" (UID: \"1120f3c4-1323-4ffe-8798-b15e58615278\") " Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.072130 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1120f3c4-1323-4ffe-8798-b15e58615278-config-data\") pod \"1120f3c4-1323-4ffe-8798-b15e58615278\" (UID: \"1120f3c4-1323-4ffe-8798-b15e58615278\") " Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.072174 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1120f3c4-1323-4ffe-8798-b15e58615278-internal-tls-certs\") pod \"1120f3c4-1323-4ffe-8798-b15e58615278\" (UID: \"1120f3c4-1323-4ffe-8798-b15e58615278\") " Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.072794 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1120f3c4-1323-4ffe-8798-b15e58615278-logs\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.074639 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1120f3c4-1323-4ffe-8798-b15e58615278-kube-api-access-bfvq8" (OuterVolumeSpecName: "kube-api-access-bfvq8") pod "1120f3c4-1323-4ffe-8798-b15e58615278" (UID: "1120f3c4-1323-4ffe-8798-b15e58615278"). InnerVolumeSpecName "kube-api-access-bfvq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.098664 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1120f3c4-1323-4ffe-8798-b15e58615278-config-data" (OuterVolumeSpecName: "config-data") pod "1120f3c4-1323-4ffe-8798-b15e58615278" (UID: "1120f3c4-1323-4ffe-8798-b15e58615278"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.130894 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1120f3c4-1323-4ffe-8798-b15e58615278-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1120f3c4-1323-4ffe-8798-b15e58615278" (UID: "1120f3c4-1323-4ffe-8798-b15e58615278"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.150666 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1120f3c4-1323-4ffe-8798-b15e58615278-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1120f3c4-1323-4ffe-8798-b15e58615278" (UID: "1120f3c4-1323-4ffe-8798-b15e58615278"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.152559 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1120f3c4-1323-4ffe-8798-b15e58615278-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1120f3c4-1323-4ffe-8798-b15e58615278" (UID: "1120f3c4-1323-4ffe-8798-b15e58615278"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.174954 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1120f3c4-1323-4ffe-8798-b15e58615278-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.175009 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfvq8\" (UniqueName: \"kubernetes.io/projected/1120f3c4-1323-4ffe-8798-b15e58615278-kube-api-access-bfvq8\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.175031 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1120f3c4-1323-4ffe-8798-b15e58615278-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.175051 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1120f3c4-1323-4ffe-8798-b15e58615278-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.175068 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1120f3c4-1323-4ffe-8798-b15e58615278-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.286154 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f36ad344-e946-4221-892d-3ffe8fbdd59b","Type":"ContainerStarted","Data":"ad3221d34446778cf5aa06a120ee3623be2cb79ccf71fa0fe0fdc8508b537b39"} Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.286215 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f36ad344-e946-4221-892d-3ffe8fbdd59b","Type":"ContainerStarted","Data":"421f31f709e42d33e806da8cbccdf634f54f5d5f42da772671dff85eade1c72b"} Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.290598 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d70a0a1b-ed2d-46f1-aeb9-a335de9b06d4","Type":"ContainerStarted","Data":"755c6f02f55802e87f157919a987564dd4c473e1c99e2dd09cad9a38d6522eba"} Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.290661 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d70a0a1b-ed2d-46f1-aeb9-a335de9b06d4","Type":"ContainerStarted","Data":"5b618feec84668a7233e75d2d116b1fbd6762c44348b4c0d14b88c16ba912ebc"} Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.294289 4810 generic.go:334] "Generic (PLEG): container finished" podID="1120f3c4-1323-4ffe-8798-b15e58615278" containerID="83dcf28c966753013525c88cc66000052f7461b478620e9cca2052eda9edb662" exitCode=0 Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.294377 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1120f3c4-1323-4ffe-8798-b15e58615278","Type":"ContainerDied","Data":"83dcf28c966753013525c88cc66000052f7461b478620e9cca2052eda9edb662"} Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.294426 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1120f3c4-1323-4ffe-8798-b15e58615278","Type":"ContainerDied","Data":"5c8bcb9cab03480eccfcad397fcab4db26af8c7b42bbdd88a46bf05b9ed6f929"} Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.294458 4810 scope.go:117] 
"RemoveContainer" containerID="83dcf28c966753013525c88cc66000052f7461b478620e9cca2052eda9edb662" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.294490 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.311762 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.311693782 podStartE2EDuration="2.311693782s" podCreationTimestamp="2026-02-19 15:32:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:32:21.30771741 +0000 UTC m=+1370.789747564" watchObservedRunningTime="2026-02-19 15:32:21.311693782 +0000 UTC m=+1370.793723936" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.331615 4810 scope.go:117] "RemoveContainer" containerID="bfdb849b75abf04adb5f644e93c3f65364ac03de4b6998e5eb22cf3fb4118b78" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.338179 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.338133084 podStartE2EDuration="2.338133084s" podCreationTimestamp="2026-02-19 15:32:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:32:21.333042395 +0000 UTC m=+1370.815072539" watchObservedRunningTime="2026-02-19 15:32:21.338133084 +0000 UTC m=+1370.820163208" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.367905 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.374129 4810 scope.go:117] "RemoveContainer" containerID="83dcf28c966753013525c88cc66000052f7461b478620e9cca2052eda9edb662" Feb 19 15:32:21 crc kubenswrapper[4810]: E0219 15:32:21.374775 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83dcf28c966753013525c88cc66000052f7461b478620e9cca2052eda9edb662\": container with ID starting with 83dcf28c966753013525c88cc66000052f7461b478620e9cca2052eda9edb662 not found: ID does not exist" containerID="83dcf28c966753013525c88cc66000052f7461b478620e9cca2052eda9edb662" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.374814 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83dcf28c966753013525c88cc66000052f7461b478620e9cca2052eda9edb662"} err="failed to get container status \"83dcf28c966753013525c88cc66000052f7461b478620e9cca2052eda9edb662\": rpc error: code = NotFound desc = could not find container \"83dcf28c966753013525c88cc66000052f7461b478620e9cca2052eda9edb662\": container with ID starting with 83dcf28c966753013525c88cc66000052f7461b478620e9cca2052eda9edb662 not found: ID does not exist" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.374843 4810 scope.go:117] "RemoveContainer" containerID="bfdb849b75abf04adb5f644e93c3f65364ac03de4b6998e5eb22cf3fb4118b78" Feb 19 15:32:21 crc kubenswrapper[4810]: E0219 15:32:21.375539 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfdb849b75abf04adb5f644e93c3f65364ac03de4b6998e5eb22cf3fb4118b78\": container with ID starting with bfdb849b75abf04adb5f644e93c3f65364ac03de4b6998e5eb22cf3fb4118b78 not found: ID does not exist" 
containerID="bfdb849b75abf04adb5f644e93c3f65364ac03de4b6998e5eb22cf3fb4118b78" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.375593 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfdb849b75abf04adb5f644e93c3f65364ac03de4b6998e5eb22cf3fb4118b78"} err="failed to get container status \"bfdb849b75abf04adb5f644e93c3f65364ac03de4b6998e5eb22cf3fb4118b78\": rpc error: code = NotFound desc = could not find container \"bfdb849b75abf04adb5f644e93c3f65364ac03de4b6998e5eb22cf3fb4118b78\": container with ID starting with bfdb849b75abf04adb5f644e93c3f65364ac03de4b6998e5eb22cf3fb4118b78 not found: ID does not exist" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.385612 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.394230 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 15:32:21 crc kubenswrapper[4810]: E0219 15:32:21.394748 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1120f3c4-1323-4ffe-8798-b15e58615278" containerName="nova-api-api" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.394771 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1120f3c4-1323-4ffe-8798-b15e58615278" containerName="nova-api-api" Feb 19 15:32:21 crc kubenswrapper[4810]: E0219 15:32:21.394793 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1120f3c4-1323-4ffe-8798-b15e58615278" containerName="nova-api-log" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.394802 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1120f3c4-1323-4ffe-8798-b15e58615278" containerName="nova-api-log" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.395061 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="1120f3c4-1323-4ffe-8798-b15e58615278" containerName="nova-api-log" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.395111 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="1120f3c4-1323-4ffe-8798-b15e58615278" containerName="nova-api-api" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.396384 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.406439 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.406533 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.406654 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.409118 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.453931 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1120f3c4-1323-4ffe-8798-b15e58615278" path="/var/lib/kubelet/pods/1120f3c4-1323-4ffe-8798-b15e58615278/volumes" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.480366 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6397af05-d030-46c2-8a0f-a90beb9b2502-logs\") pod \"nova-api-0\" (UID: \"6397af05-d030-46c2-8a0f-a90beb9b2502\") " pod="openstack/nova-api-0" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.480416 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6397af05-d030-46c2-8a0f-a90beb9b2502-public-tls-certs\") pod \"nova-api-0\" (UID: \"6397af05-d030-46c2-8a0f-a90beb9b2502\") " pod="openstack/nova-api-0" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.480507 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6397af05-d030-46c2-8a0f-a90beb9b2502-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6397af05-d030-46c2-8a0f-a90beb9b2502\") " pod="openstack/nova-api-0" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.480597 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mntt5\" (UniqueName: \"kubernetes.io/projected/6397af05-d030-46c2-8a0f-a90beb9b2502-kube-api-access-mntt5\") pod \"nova-api-0\" (UID: \"6397af05-d030-46c2-8a0f-a90beb9b2502\") " pod="openstack/nova-api-0" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.481041 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6397af05-d030-46c2-8a0f-a90beb9b2502-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6397af05-d030-46c2-8a0f-a90beb9b2502\") " pod="openstack/nova-api-0" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.481187 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6397af05-d030-46c2-8a0f-a90beb9b2502-config-data\") pod \"nova-api-0\" (UID: \"6397af05-d030-46c2-8a0f-a90beb9b2502\") " pod="openstack/nova-api-0" Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.583648 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6397af05-d030-46c2-8a0f-a90beb9b2502-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6397af05-d030-46c2-8a0f-a90beb9b2502\") " pod="openstack/nova-api-0" Feb 19 15:32:21 crc 
kubenswrapper[4810]: I0219 15:32:21.583736 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6397af05-d030-46c2-8a0f-a90beb9b2502-config-data\") pod \"nova-api-0\" (UID: \"6397af05-d030-46c2-8a0f-a90beb9b2502\") " pod="openstack/nova-api-0"
Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.583864 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6397af05-d030-46c2-8a0f-a90beb9b2502-logs\") pod \"nova-api-0\" (UID: \"6397af05-d030-46c2-8a0f-a90beb9b2502\") " pod="openstack/nova-api-0"
Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.583904 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6397af05-d030-46c2-8a0f-a90beb9b2502-public-tls-certs\") pod \"nova-api-0\" (UID: \"6397af05-d030-46c2-8a0f-a90beb9b2502\") " pod="openstack/nova-api-0"
Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.583988 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6397af05-d030-46c2-8a0f-a90beb9b2502-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6397af05-d030-46c2-8a0f-a90beb9b2502\") " pod="openstack/nova-api-0"
Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.584049 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mntt5\" (UniqueName: \"kubernetes.io/projected/6397af05-d030-46c2-8a0f-a90beb9b2502-kube-api-access-mntt5\") pod \"nova-api-0\" (UID: \"6397af05-d030-46c2-8a0f-a90beb9b2502\") " pod="openstack/nova-api-0"
Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.584920 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6397af05-d030-46c2-8a0f-a90beb9b2502-logs\") pod \"nova-api-0\" (UID: \"6397af05-d030-46c2-8a0f-a90beb9b2502\") " pod="openstack/nova-api-0"
Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.589209 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6397af05-d030-46c2-8a0f-a90beb9b2502-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6397af05-d030-46c2-8a0f-a90beb9b2502\") " pod="openstack/nova-api-0"
Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.590208 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6397af05-d030-46c2-8a0f-a90beb9b2502-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6397af05-d030-46c2-8a0f-a90beb9b2502\") " pod="openstack/nova-api-0"
Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.590635 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6397af05-d030-46c2-8a0f-a90beb9b2502-config-data\") pod \"nova-api-0\" (UID: \"6397af05-d030-46c2-8a0f-a90beb9b2502\") " pod="openstack/nova-api-0"
Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.590866 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6397af05-d030-46c2-8a0f-a90beb9b2502-public-tls-certs\") pod \"nova-api-0\" (UID: \"6397af05-d030-46c2-8a0f-a90beb9b2502\") " pod="openstack/nova-api-0"
Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.605440 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mntt5\" (UniqueName: \"kubernetes.io/projected/6397af05-d030-46c2-8a0f-a90beb9b2502-kube-api-access-mntt5\") pod \"nova-api-0\" (UID: \"6397af05-d030-46c2-8a0f-a90beb9b2502\") " pod="openstack/nova-api-0"
Feb 19 15:32:21 crc kubenswrapper[4810]: I0219 15:32:21.721798 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 15:32:22 crc kubenswrapper[4810]: W0219 15:32:22.282751 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6397af05_d030_46c2_8a0f_a90beb9b2502.slice/crio-8c2c929b825b8c0b7372b6797ac977dd0218a672d0b6cec53c61516afedfcfd6 WatchSource:0}: Error finding container 8c2c929b825b8c0b7372b6797ac977dd0218a672d0b6cec53c61516afedfcfd6: Status 404 returned error can't find the container with id 8c2c929b825b8c0b7372b6797ac977dd0218a672d0b6cec53c61516afedfcfd6
Feb 19 15:32:22 crc kubenswrapper[4810]: I0219 15:32:22.283015 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 19 15:32:22 crc kubenswrapper[4810]: I0219 15:32:22.304311 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6397af05-d030-46c2-8a0f-a90beb9b2502","Type":"ContainerStarted","Data":"8c2c929b825b8c0b7372b6797ac977dd0218a672d0b6cec53c61516afedfcfd6"}
Feb 19 15:32:23 crc kubenswrapper[4810]: I0219 15:32:23.317194 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6397af05-d030-46c2-8a0f-a90beb9b2502","Type":"ContainerStarted","Data":"d126d183ccc2619d372f4dc3e6c72b3581ce487d207925dee34a82886e30e28f"}
Feb 19 15:32:23 crc kubenswrapper[4810]: I0219 15:32:23.317740 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6397af05-d030-46c2-8a0f-a90beb9b2502","Type":"ContainerStarted","Data":"c90a5d8710584e4feb60b33d824e1b128654f406aa3390d69408f6d9fa4cb765"}
Feb 19 15:32:23 crc kubenswrapper[4810]: I0219 15:32:23.353245 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.353218548 podStartE2EDuration="2.353218548s" podCreationTimestamp="2026-02-19 15:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:32:23.345255015 +0000 UTC m=+1372.827285179" watchObservedRunningTime="2026-02-19 15:32:23.353218548 +0000 UTC m=+1372.835248702"
Feb 19 15:32:24 crc kubenswrapper[4810]: I0219 15:32:24.645483 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 19 15:32:24 crc kubenswrapper[4810]: I0219 15:32:24.645836 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 19 15:32:24 crc kubenswrapper[4810]: I0219 15:32:24.688894 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 19 15:32:29 crc kubenswrapper[4810]: I0219 15:32:29.645565 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 19 15:32:29 crc kubenswrapper[4810]: I0219 15:32:29.646282 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 19 15:32:29 crc kubenswrapper[4810]: I0219 15:32:29.689068 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 19 15:32:29 crc kubenswrapper[4810]: I0219 15:32:29.721796 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 19 15:32:30 crc kubenswrapper[4810]: I0219 15:32:30.455040 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 19 15:32:30 crc kubenswrapper[4810]: I0219 15:32:30.658488 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f36ad344-e946-4221-892d-3ffe8fbdd59b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.231:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 15:32:30 crc kubenswrapper[4810]: I0219 15:32:30.658506 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f36ad344-e946-4221-892d-3ffe8fbdd59b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.231:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 15:32:31 crc kubenswrapper[4810]: I0219 15:32:31.722547 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 19 15:32:31 crc kubenswrapper[4810]: I0219 15:32:31.722938 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 19 15:32:32 crc kubenswrapper[4810]: I0219 15:32:32.736591 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6397af05-d030-46c2-8a0f-a90beb9b2502" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.233:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 15:32:32 crc kubenswrapper[4810]: I0219 15:32:32.736565 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6397af05-d030-46c2-8a0f-a90beb9b2502" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.233:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 15:32:36 crc kubenswrapper[4810]: I0219 15:32:36.466641 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 19 15:32:39 crc kubenswrapper[4810]: I0219 15:32:39.075706 4810 scope.go:117] "RemoveContainer" containerID="1d5f88ecf4c81e410f0df90ab60b4889433c0df9a3a9c3d46a0ad0dad5a5c6f9"
Feb 19 15:32:39 crc kubenswrapper[4810]: I0219 15:32:39.111530 4810 scope.go:117] "RemoveContainer" containerID="5cab9d89ebbb4715343c797d52130d009456c8d3d9eaf887c80836933b581c07"
Feb 19 15:32:39 crc kubenswrapper[4810]: I0219 15:32:39.144657 4810 scope.go:117] "RemoveContainer" containerID="a21841d08c3392c62d7d0123a3228f5e73d4d173558fa34eefb59a0035628f39"
Feb 19 15:32:39 crc kubenswrapper[4810]: I0219 15:32:39.651380 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 19 15:32:39 crc kubenswrapper[4810]: I0219 15:32:39.655442 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 19 15:32:39 crc kubenswrapper[4810]: I0219 15:32:39.658006 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 19 15:32:40 crc kubenswrapper[4810]: I0219 15:32:40.546675 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
pod="openstack/nova-metadata-0" Feb 19 15:32:41 crc kubenswrapper[4810]: I0219 15:32:41.735222 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 15:32:41 crc kubenswrapper[4810]: I0219 15:32:41.736600 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 15:32:41 crc kubenswrapper[4810]: I0219 15:32:41.743660 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 15:32:41 crc kubenswrapper[4810]: I0219 15:32:41.753643 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 15:32:42 crc kubenswrapper[4810]: I0219 15:32:42.566412 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 15:32:42 crc kubenswrapper[4810]: I0219 15:32:42.579103 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 15:32:49 crc kubenswrapper[4810]: I0219 15:32:49.538173 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:32:49 crc kubenswrapper[4810]: I0219 15:32:49.538911 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:32:49 crc kubenswrapper[4810]: I0219 15:32:49.538975 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 15:32:49 crc kubenswrapper[4810]: I0219 15:32:49.539950 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2979159f3325af188cf73d374cfc4f7b1a64cb0be10361454a84d92914ce8075"} pod="openshift-machine-config-operator/machine-config-daemon-t499d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 15:32:49 crc kubenswrapper[4810]: I0219 15:32:49.540047 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" containerID="cri-o://2979159f3325af188cf73d374cfc4f7b1a64cb0be10361454a84d92914ce8075" gracePeriod=600 Feb 19 15:32:50 crc kubenswrapper[4810]: I0219 15:32:50.660373 4810 generic.go:334] "Generic (PLEG): container finished" podID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerID="2979159f3325af188cf73d374cfc4f7b1a64cb0be10361454a84d92914ce8075" exitCode=0 Feb 19 15:32:50 crc kubenswrapper[4810]: I0219 15:32:50.660476 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerDied","Data":"2979159f3325af188cf73d374cfc4f7b1a64cb0be10361454a84d92914ce8075"} Feb 19 15:32:50 crc kubenswrapper[4810]: I0219 15:32:50.660950 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerStarted","Data":"2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a"} Feb 19 15:32:50 crc kubenswrapper[4810]: I0219 15:32:50.660973 4810 scope.go:117] "RemoveContainer" containerID="37fe95e370faa9fca4a69499713730a8ba7e7939f57cd237ea9a505f9b09a6bf" Feb 19 15:32:51 crc kubenswrapper[4810]: I0219 15:32:51.229657 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 15:32:52 crc kubenswrapper[4810]: I0219 15:32:52.354514 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 15:32:52 crc kubenswrapper[4810]: I0219 15:32:52.591402 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-955rr"] Feb 19 15:32:52 crc kubenswrapper[4810]: I0219 15:32:52.593311 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-955rr" Feb 19 15:32:52 crc kubenswrapper[4810]: I0219 15:32:52.616252 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-955rr"] Feb 19 15:32:52 crc kubenswrapper[4810]: I0219 15:32:52.649971 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/783f4f81-b0fb-49c4-9b07-a2715641355a-utilities\") pod \"redhat-operators-955rr\" (UID: \"783f4f81-b0fb-49c4-9b07-a2715641355a\") " pod="openshift-marketplace/redhat-operators-955rr" Feb 19 15:32:52 crc kubenswrapper[4810]: I0219 15:32:52.650024 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/783f4f81-b0fb-49c4-9b07-a2715641355a-catalog-content\") pod \"redhat-operators-955rr\" (UID: \"783f4f81-b0fb-49c4-9b07-a2715641355a\") " pod="openshift-marketplace/redhat-operators-955rr" Feb 19 15:32:52 crc kubenswrapper[4810]: I0219 15:32:52.650115 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrjnp\" (UniqueName: \"kubernetes.io/projected/783f4f81-b0fb-49c4-9b07-a2715641355a-kube-api-access-hrjnp\") pod \"redhat-operators-955rr\" (UID: \"783f4f81-b0fb-49c4-9b07-a2715641355a\") " pod="openshift-marketplace/redhat-operators-955rr" Feb 19 15:32:52 crc kubenswrapper[4810]: I0219 15:32:52.751570 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/783f4f81-b0fb-49c4-9b07-a2715641355a-utilities\") pod \"redhat-operators-955rr\" (UID: \"783f4f81-b0fb-49c4-9b07-a2715641355a\") " pod="openshift-marketplace/redhat-operators-955rr" Feb 19 15:32:52 crc kubenswrapper[4810]: I0219 15:32:52.751611 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/783f4f81-b0fb-49c4-9b07-a2715641355a-catalog-content\") pod \"redhat-operators-955rr\" (UID: \"783f4f81-b0fb-49c4-9b07-a2715641355a\") " pod="openshift-marketplace/redhat-operators-955rr" Feb 19 15:32:52 crc kubenswrapper[4810]: I0219 15:32:52.751643 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrjnp\" (UniqueName: \"kubernetes.io/projected/783f4f81-b0fb-49c4-9b07-a2715641355a-kube-api-access-hrjnp\") pod \"redhat-operators-955rr\" (UID: 
\"783f4f81-b0fb-49c4-9b07-a2715641355a\") " pod="openshift-marketplace/redhat-operators-955rr" Feb 19 15:32:52 crc kubenswrapper[4810]: I0219 15:32:52.752056 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/783f4f81-b0fb-49c4-9b07-a2715641355a-utilities\") pod \"redhat-operators-955rr\" (UID: \"783f4f81-b0fb-49c4-9b07-a2715641355a\") " pod="openshift-marketplace/redhat-operators-955rr" Feb 19 15:32:52 crc kubenswrapper[4810]: I0219 15:32:52.752121 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/783f4f81-b0fb-49c4-9b07-a2715641355a-catalog-content\") pod \"redhat-operators-955rr\" (UID: \"783f4f81-b0fb-49c4-9b07-a2715641355a\") " pod="openshift-marketplace/redhat-operators-955rr" Feb 19 15:32:52 crc kubenswrapper[4810]: I0219 15:32:52.780885 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrjnp\" (UniqueName: \"kubernetes.io/projected/783f4f81-b0fb-49c4-9b07-a2715641355a-kube-api-access-hrjnp\") pod \"redhat-operators-955rr\" (UID: \"783f4f81-b0fb-49c4-9b07-a2715641355a\") " pod="openshift-marketplace/redhat-operators-955rr" Feb 19 15:32:52 crc kubenswrapper[4810]: I0219 15:32:52.921785 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-955rr" Feb 19 15:32:53 crc kubenswrapper[4810]: W0219 15:32:53.324734 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod783f4f81_b0fb_49c4_9b07_a2715641355a.slice/crio-75d6325f144c755c14a8b6930d4d42def83ced3289d5cc9615e327af0d069534 WatchSource:0}: Error finding container 75d6325f144c755c14a8b6930d4d42def83ced3289d5cc9615e327af0d069534: Status 404 returned error can't find the container with id 75d6325f144c755c14a8b6930d4d42def83ced3289d5cc9615e327af0d069534 Feb 19 15:32:53 crc kubenswrapper[4810]: I0219 15:32:53.333948 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-955rr"] Feb 19 15:32:53 crc kubenswrapper[4810]: I0219 15:32:53.700875 4810 generic.go:334] "Generic (PLEG): container finished" podID="783f4f81-b0fb-49c4-9b07-a2715641355a" containerID="761f02a235bc3255020cf4a80c96bbf0e5db299cb80ff40640ab5d09947f186d" exitCode=0 Feb 19 15:32:53 crc kubenswrapper[4810]: I0219 15:32:53.700931 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-955rr" event={"ID":"783f4f81-b0fb-49c4-9b07-a2715641355a","Type":"ContainerDied","Data":"761f02a235bc3255020cf4a80c96bbf0e5db299cb80ff40640ab5d09947f186d"} Feb 19 15:32:53 crc kubenswrapper[4810]: I0219 15:32:53.700956 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-955rr" event={"ID":"783f4f81-b0fb-49c4-9b07-a2715641355a","Type":"ContainerStarted","Data":"75d6325f144c755c14a8b6930d4d42def83ced3289d5cc9615e327af0d069534"} Feb 19 15:32:54 crc kubenswrapper[4810]: I0219 15:32:54.920230 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="00bcfb03-4357-4343-99a5-30dc7f25abe9" containerName="rabbitmq" containerID="cri-o://c5b200cb5f29f71ee660d52a8a35bdae1c6f7de01b68b55aa0a763c8a2cc371f" gracePeriod=604797 Feb 19 15:32:55 crc kubenswrapper[4810]: I0219 15:32:55.743309 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-955rr" event={"ID":"783f4f81-b0fb-49c4-9b07-a2715641355a","Type":"ContainerStarted","Data":"6bc2e1d7197343d97aad8e795e58bf89ffecf5750c4972eb72fee6305a410fbc"} Feb 19 15:32:56 crc kubenswrapper[4810]: I0219 15:32:56.031544 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="2a3676ed-f06f-4dea-82a1-959716331113" containerName="rabbitmq" containerID="cri-o://dbc27193c644946d801635eac3f4ac8fd7e93e4ac9ea0ee7cb027f5cec7cc199" gracePeriod=604797 Feb 19 15:32:56 crc kubenswrapper[4810]: I0219 15:32:56.132894 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="00bcfb03-4357-4343-99a5-30dc7f25abe9" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.106:5671: connect: connection refused" Feb 19 15:32:56 crc kubenswrapper[4810]: I0219 15:32:56.443011 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="2a3676ed-f06f-4dea-82a1-959716331113" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused" Feb 19 15:32:57 crc kubenswrapper[4810]: I0219 15:32:57.766684 4810 generic.go:334] "Generic (PLEG): container finished" podID="783f4f81-b0fb-49c4-9b07-a2715641355a" containerID="6bc2e1d7197343d97aad8e795e58bf89ffecf5750c4972eb72fee6305a410fbc" exitCode=0 Feb 19 15:32:57 crc kubenswrapper[4810]: I0219 15:32:57.766753 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-955rr" event={"ID":"783f4f81-b0fb-49c4-9b07-a2715641355a","Type":"ContainerDied","Data":"6bc2e1d7197343d97aad8e795e58bf89ffecf5750c4972eb72fee6305a410fbc"} Feb 19 15:32:57 crc kubenswrapper[4810]: I0219 15:32:57.768540 4810 generic.go:334] "Generic (PLEG): container finished" podID="00bcfb03-4357-4343-99a5-30dc7f25abe9" containerID="c5b200cb5f29f71ee660d52a8a35bdae1c6f7de01b68b55aa0a763c8a2cc371f" exitCode=0 Feb 19 15:32:57 crc kubenswrapper[4810]: I0219 15:32:57.768572 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"00bcfb03-4357-4343-99a5-30dc7f25abe9","Type":"ContainerDied","Data":"c5b200cb5f29f71ee660d52a8a35bdae1c6f7de01b68b55aa0a763c8a2cc371f"} Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.700190 4810 util.go:48] "No ready sandbox for pod can be found. 
Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.778586 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/00bcfb03-4357-4343-99a5-30dc7f25abe9-rabbitmq-confd\") pod \"00bcfb03-4357-4343-99a5-30dc7f25abe9\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") "
Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.778632 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/00bcfb03-4357-4343-99a5-30dc7f25abe9-erlang-cookie-secret\") pod \"00bcfb03-4357-4343-99a5-30dc7f25abe9\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") "
Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.778692 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/00bcfb03-4357-4343-99a5-30dc7f25abe9-pod-info\") pod \"00bcfb03-4357-4343-99a5-30dc7f25abe9\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") "
Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.778724 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00bcfb03-4357-4343-99a5-30dc7f25abe9-config-data\") pod \"00bcfb03-4357-4343-99a5-30dc7f25abe9\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") "
Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.778836 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"00bcfb03-4357-4343-99a5-30dc7f25abe9\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") "
Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.779517 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/00bcfb03-4357-4343-99a5-30dc7f25abe9-plugins-conf\") pod \"00bcfb03-4357-4343-99a5-30dc7f25abe9\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") "
Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.779567 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/00bcfb03-4357-4343-99a5-30dc7f25abe9-server-conf\") pod \"00bcfb03-4357-4343-99a5-30dc7f25abe9\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") "
Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.779649 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74vq8\" (UniqueName: \"kubernetes.io/projected/00bcfb03-4357-4343-99a5-30dc7f25abe9-kube-api-access-74vq8\") pod \"00bcfb03-4357-4343-99a5-30dc7f25abe9\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") "
Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.779682 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/00bcfb03-4357-4343-99a5-30dc7f25abe9-rabbitmq-tls\") pod \"00bcfb03-4357-4343-99a5-30dc7f25abe9\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") "
Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.779762 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/00bcfb03-4357-4343-99a5-30dc7f25abe9-rabbitmq-plugins\") pod \"00bcfb03-4357-4343-99a5-30dc7f25abe9\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") "
Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.779836 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/00bcfb03-4357-4343-99a5-30dc7f25abe9-rabbitmq-erlang-cookie\") pod \"00bcfb03-4357-4343-99a5-30dc7f25abe9\" (UID: \"00bcfb03-4357-4343-99a5-30dc7f25abe9\") "
Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.782250 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00bcfb03-4357-4343-99a5-30dc7f25abe9-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "00bcfb03-4357-4343-99a5-30dc7f25abe9" (UID: "00bcfb03-4357-4343-99a5-30dc7f25abe9"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.783833 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00bcfb03-4357-4343-99a5-30dc7f25abe9-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "00bcfb03-4357-4343-99a5-30dc7f25abe9" (UID: "00bcfb03-4357-4343-99a5-30dc7f25abe9"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.784168 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/00bcfb03-4357-4343-99a5-30dc7f25abe9-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.784201 4810 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/00bcfb03-4357-4343-99a5-30dc7f25abe9-plugins-conf\") on node \"crc\" DevicePath \"\""
Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.787380 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00bcfb03-4357-4343-99a5-30dc7f25abe9-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "00bcfb03-4357-4343-99a5-30dc7f25abe9" (UID: "00bcfb03-4357-4343-99a5-30dc7f25abe9"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.790309 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "00bcfb03-4357-4343-99a5-30dc7f25abe9" (UID: "00bcfb03-4357-4343-99a5-30dc7f25abe9"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.794278 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/00bcfb03-4357-4343-99a5-30dc7f25abe9-pod-info" (OuterVolumeSpecName: "pod-info") pod "00bcfb03-4357-4343-99a5-30dc7f25abe9" (UID: "00bcfb03-4357-4343-99a5-30dc7f25abe9"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.794796 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-955rr" event={"ID":"783f4f81-b0fb-49c4-9b07-a2715641355a","Type":"ContainerStarted","Data":"1a4461610ed2fb313a8e62f0315dc40b8ce3a3e6a0f3690df259b0b16d63375f"}
Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.814585 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00bcfb03-4357-4343-99a5-30dc7f25abe9-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "00bcfb03-4357-4343-99a5-30dc7f25abe9" (UID: "00bcfb03-4357-4343-99a5-30dc7f25abe9"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.826086 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00bcfb03-4357-4343-99a5-30dc7f25abe9-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "00bcfb03-4357-4343-99a5-30dc7f25abe9" (UID: "00bcfb03-4357-4343-99a5-30dc7f25abe9"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.826464 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"00bcfb03-4357-4343-99a5-30dc7f25abe9","Type":"ContainerDied","Data":"438e5fcabfeda7b104ccc004754827e00367d2ec7bbb19edfefdf5cb049ee1ce"}
Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.826567 4810 scope.go:117] "RemoveContainer" containerID="c5b200cb5f29f71ee660d52a8a35bdae1c6f7de01b68b55aa0a763c8a2cc371f"
Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.826776 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.827598 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-955rr" podStartSLOduration=2.179152827 podStartE2EDuration="6.827583885s" podCreationTimestamp="2026-02-19 15:32:52 +0000 UTC" firstStartedPulling="2026-02-19 15:32:53.703532895 +0000 UTC m=+1403.185563019" lastFinishedPulling="2026-02-19 15:32:58.351963953 +0000 UTC m=+1407.833994077" observedRunningTime="2026-02-19 15:32:58.827157084 +0000 UTC m=+1408.309187218" watchObservedRunningTime="2026-02-19 15:32:58.827583885 +0000 UTC m=+1408.309614009"
Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.843741 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00bcfb03-4357-4343-99a5-30dc7f25abe9-kube-api-access-74vq8" (OuterVolumeSpecName: "kube-api-access-74vq8") pod "00bcfb03-4357-4343-99a5-30dc7f25abe9" (UID: "00bcfb03-4357-4343-99a5-30dc7f25abe9"). InnerVolumeSpecName "kube-api-access-74vq8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.867765 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00bcfb03-4357-4343-99a5-30dc7f25abe9-config-data" (OuterVolumeSpecName: "config-data") pod "00bcfb03-4357-4343-99a5-30dc7f25abe9" (UID: "00bcfb03-4357-4343-99a5-30dc7f25abe9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.886586 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/00bcfb03-4357-4343-99a5-30dc7f25abe9-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.886626 4810 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/00bcfb03-4357-4343-99a5-30dc7f25abe9-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.886639 4810 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/00bcfb03-4357-4343-99a5-30dc7f25abe9-pod-info\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.886651 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00bcfb03-4357-4343-99a5-30dc7f25abe9-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.886676 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.886689 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74vq8\" (UniqueName: \"kubernetes.io/projected/00bcfb03-4357-4343-99a5-30dc7f25abe9-kube-api-access-74vq8\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.886701 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/00bcfb03-4357-4343-99a5-30dc7f25abe9-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.906900 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00bcfb03-4357-4343-99a5-30dc7f25abe9-server-conf" (OuterVolumeSpecName: "server-conf") pod "00bcfb03-4357-4343-99a5-30dc7f25abe9" (UID: "00bcfb03-4357-4343-99a5-30dc7f25abe9"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.933075 4810 scope.go:117] "RemoveContainer" containerID="5f65c0deba7b3077c5501137f00e319288d66ec1245a0e431539e6d1d5d3d67c" Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.945218 4810 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.997839 4810 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:58 crc kubenswrapper[4810]: I0219 15:32:58.997963 4810 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/00bcfb03-4357-4343-99a5-30dc7f25abe9-server-conf\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.010514 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00bcfb03-4357-4343-99a5-30dc7f25abe9-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "00bcfb03-4357-4343-99a5-30dc7f25abe9" (UID: "00bcfb03-4357-4343-99a5-30dc7f25abe9"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.103858 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/00bcfb03-4357-4343-99a5-30dc7f25abe9-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.168563 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.178561 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.204184 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 15:32:59 crc kubenswrapper[4810]: E0219 15:32:59.204848 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00bcfb03-4357-4343-99a5-30dc7f25abe9" containerName="setup-container" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.204866 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="00bcfb03-4357-4343-99a5-30dc7f25abe9" containerName="setup-container" Feb 19 15:32:59 crc kubenswrapper[4810]: E0219 15:32:59.204893 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00bcfb03-4357-4343-99a5-30dc7f25abe9" containerName="rabbitmq" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.204900 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="00bcfb03-4357-4343-99a5-30dc7f25abe9" containerName="rabbitmq" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.205130 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="00bcfb03-4357-4343-99a5-30dc7f25abe9" containerName="rabbitmq" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.206275 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.212688 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.212948 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.213104 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-9r5f7" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.213125 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.213135 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.213193 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.215677 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.240866 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.307078 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b86448c3-669a-4132-b8ab-4db06347fa10-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.307126 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b86448c3-669a-4132-b8ab-4db06347fa10-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.307151 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b86448c3-669a-4132-b8ab-4db06347fa10-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.307193 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b86448c3-669a-4132-b8ab-4db06347fa10-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.307220 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b86448c3-669a-4132-b8ab-4db06347fa10-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.307235 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7ct7\" (UniqueName: 
\"kubernetes.io/projected/b86448c3-669a-4132-b8ab-4db06347fa10-kube-api-access-j7ct7\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.307286 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b86448c3-669a-4132-b8ab-4db06347fa10-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.307334 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b86448c3-669a-4132-b8ab-4db06347fa10-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.307376 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.307390 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b86448c3-669a-4132-b8ab-4db06347fa10-config-data\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.307414 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b86448c3-669a-4132-b8ab-4db06347fa10-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.348362 4810 util.go:48] "No ready sandbox for pod can be found. 
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.409277 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2a3676ed-f06f-4dea-82a1-959716331113-rabbitmq-erlang-cookie\") pod \"2a3676ed-f06f-4dea-82a1-959716331113\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") "
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.409454 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2a3676ed-f06f-4dea-82a1-959716331113-rabbitmq-plugins\") pod \"2a3676ed-f06f-4dea-82a1-959716331113\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") "
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.409501 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtmb8\" (UniqueName: \"kubernetes.io/projected/2a3676ed-f06f-4dea-82a1-959716331113-kube-api-access-wtmb8\") pod \"2a3676ed-f06f-4dea-82a1-959716331113\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") "
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.409569 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a3676ed-f06f-4dea-82a1-959716331113-config-data\") pod \"2a3676ed-f06f-4dea-82a1-959716331113\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") "
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.409618 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2a3676ed-f06f-4dea-82a1-959716331113-server-conf\") pod \"2a3676ed-f06f-4dea-82a1-959716331113\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") "
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.409667 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"2a3676ed-f06f-4dea-82a1-959716331113\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") "
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.409722 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2a3676ed-f06f-4dea-82a1-959716331113-rabbitmq-tls\") pod \"2a3676ed-f06f-4dea-82a1-959716331113\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") "
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.409746 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2a3676ed-f06f-4dea-82a1-959716331113-erlang-cookie-secret\") pod \"2a3676ed-f06f-4dea-82a1-959716331113\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") "
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.409817 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2a3676ed-f06f-4dea-82a1-959716331113-rabbitmq-confd\") pod \"2a3676ed-f06f-4dea-82a1-959716331113\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") "
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.409842 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2a3676ed-f06f-4dea-82a1-959716331113-pod-info\") pod \"2a3676ed-f06f-4dea-82a1-959716331113\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") "
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.409900 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2a3676ed-f06f-4dea-82a1-959716331113-plugins-conf\") pod \"2a3676ed-f06f-4dea-82a1-959716331113\" (UID: \"2a3676ed-f06f-4dea-82a1-959716331113\") "
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.410001 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a3676ed-f06f-4dea-82a1-959716331113-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "2a3676ed-f06f-4dea-82a1-959716331113" (UID: "2a3676ed-f06f-4dea-82a1-959716331113"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.410165 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b86448c3-669a-4132-b8ab-4db06347fa10-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.410206 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b86448c3-669a-4132-b8ab-4db06347fa10-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.410244 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b86448c3-669a-4132-b8ab-4db06347fa10-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.410301 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b86448c3-669a-4132-b8ab-4db06347fa10-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.410364 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b86448c3-669a-4132-b8ab-4db06347fa10-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.410392 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7ct7\" (UniqueName: \"kubernetes.io/projected/b86448c3-669a-4132-b8ab-4db06347fa10-kube-api-access-j7ct7\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.410453 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b86448c3-669a-4132-b8ab-4db06347fa10-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.410493 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b86448c3-669a-4132-b8ab-4db06347fa10-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.410554 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.410575 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b86448c3-669a-4132-b8ab-4db06347fa10-config-data\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.410559 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a3676ed-f06f-4dea-82a1-959716331113-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "2a3676ed-f06f-4dea-82a1-959716331113" (UID: "2a3676ed-f06f-4dea-82a1-959716331113"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.410609 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b86448c3-669a-4132-b8ab-4db06347fa10-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.410876 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2a3676ed-f06f-4dea-82a1-959716331113-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.410890 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2a3676ed-f06f-4dea-82a1-959716331113-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.411701 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-server-0"
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.413231 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b86448c3-669a-4132-b8ab-4db06347fa10-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.414317 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b86448c3-669a-4132-b8ab-4db06347fa10-config-data\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.415488 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a3676ed-f06f-4dea-82a1-959716331113-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "2a3676ed-f06f-4dea-82a1-959716331113" (UID: "2a3676ed-f06f-4dea-82a1-959716331113"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.416105 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b86448c3-669a-4132-b8ab-4db06347fa10-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.416397 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b86448c3-669a-4132-b8ab-4db06347fa10-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.417009 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b86448c3-669a-4132-b8ab-4db06347fa10-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.420347 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a3676ed-f06f-4dea-82a1-959716331113-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "2a3676ed-f06f-4dea-82a1-959716331113" (UID: "2a3676ed-f06f-4dea-82a1-959716331113"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.420715 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/2a3676ed-f06f-4dea-82a1-959716331113-pod-info" (OuterVolumeSpecName: "pod-info") pod "2a3676ed-f06f-4dea-82a1-959716331113" (UID: "2a3676ed-f06f-4dea-82a1-959716331113"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.421747 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b86448c3-669a-4132-b8ab-4db06347fa10-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0"
Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.424868 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a3676ed-f06f-4dea-82a1-959716331113-kube-api-access-wtmb8" (OuterVolumeSpecName: "kube-api-access-wtmb8") pod "2a3676ed-f06f-4dea-82a1-959716331113" (UID: "2a3676ed-f06f-4dea-82a1-959716331113"). InnerVolumeSpecName "kube-api-access-wtmb8". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.425649 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b86448c3-669a-4132-b8ab-4db06347fa10-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.426031 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b86448c3-669a-4132-b8ab-4db06347fa10-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.429994 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b86448c3-669a-4132-b8ab-4db06347fa10-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.435874 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "2a3676ed-f06f-4dea-82a1-959716331113" (UID: "2a3676ed-f06f-4dea-82a1-959716331113"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.450542 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a3676ed-f06f-4dea-82a1-959716331113-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "2a3676ed-f06f-4dea-82a1-959716331113" (UID: "2a3676ed-f06f-4dea-82a1-959716331113"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.453295 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00bcfb03-4357-4343-99a5-30dc7f25abe9" path="/var/lib/kubelet/pods/00bcfb03-4357-4343-99a5-30dc7f25abe9/volumes" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.470800 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7ct7\" (UniqueName: \"kubernetes.io/projected/b86448c3-669a-4132-b8ab-4db06347fa10-kube-api-access-j7ct7\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.512433 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"b86448c3-669a-4132-b8ab-4db06347fa10\") " pod="openstack/rabbitmq-server-0" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.516806 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a3676ed-f06f-4dea-82a1-959716331113-config-data" (OuterVolumeSpecName: "config-data") pod "2a3676ed-f06f-4dea-82a1-959716331113" (UID: "2a3676ed-f06f-4dea-82a1-959716331113"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.517574 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtmb8\" (UniqueName: \"kubernetes.io/projected/2a3676ed-f06f-4dea-82a1-959716331113-kube-api-access-wtmb8\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.518211 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a3676ed-f06f-4dea-82a1-959716331113-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.518236 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.518247 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2a3676ed-f06f-4dea-82a1-959716331113-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.518258 4810 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2a3676ed-f06f-4dea-82a1-959716331113-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.518266 4810 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2a3676ed-f06f-4dea-82a1-959716331113-pod-info\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.518275 4810 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2a3676ed-f06f-4dea-82a1-959716331113-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.542015 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a3676ed-f06f-4dea-82a1-959716331113-server-conf" (OuterVolumeSpecName: "server-conf") pod "2a3676ed-f06f-4dea-82a1-959716331113" (UID: "2a3676ed-f06f-4dea-82a1-959716331113"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.542693 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.551017 4810 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.617291 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a3676ed-f06f-4dea-82a1-959716331113-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "2a3676ed-f06f-4dea-82a1-959716331113" (UID: "2a3676ed-f06f-4dea-82a1-959716331113"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.620155 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2a3676ed-f06f-4dea-82a1-959716331113-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.620254 4810 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2a3676ed-f06f-4dea-82a1-959716331113-server-conf\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.620308 4810 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.839445 4810 generic.go:334] "Generic (PLEG): container finished" podID="2a3676ed-f06f-4dea-82a1-959716331113" containerID="dbc27193c644946d801635eac3f4ac8fd7e93e4ac9ea0ee7cb027f5cec7cc199" exitCode=0 Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.839507 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.839529 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2a3676ed-f06f-4dea-82a1-959716331113","Type":"ContainerDied","Data":"dbc27193c644946d801635eac3f4ac8fd7e93e4ac9ea0ee7cb027f5cec7cc199"} Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.839896 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2a3676ed-f06f-4dea-82a1-959716331113","Type":"ContainerDied","Data":"65b3bafec6c943249f491880afc4a9c1d426f050515e89a7aeb5d3ee771259c3"} Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.839913 4810 scope.go:117] "RemoveContainer" containerID="dbc27193c644946d801635eac3f4ac8fd7e93e4ac9ea0ee7cb027f5cec7cc199" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.881406 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.883564 4810 scope.go:117] "RemoveContainer" containerID="d8e4585cb9bc557ddc8d8dc697ba266361d794bec388e11f6081c9ec077a1726" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.919613 4810 scope.go:117] "RemoveContainer" containerID="dbc27193c644946d801635eac3f4ac8fd7e93e4ac9ea0ee7cb027f5cec7cc199" Feb 19 15:32:59 crc kubenswrapper[4810]: E0219 15:32:59.922957 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbc27193c644946d801635eac3f4ac8fd7e93e4ac9ea0ee7cb027f5cec7cc199\": container with ID starting with dbc27193c644946d801635eac3f4ac8fd7e93e4ac9ea0ee7cb027f5cec7cc199 not found: ID does not exist" containerID="dbc27193c644946d801635eac3f4ac8fd7e93e4ac9ea0ee7cb027f5cec7cc199" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.923090 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbc27193c644946d801635eac3f4ac8fd7e93e4ac9ea0ee7cb027f5cec7cc199"} err="failed to get container status \"dbc27193c644946d801635eac3f4ac8fd7e93e4ac9ea0ee7cb027f5cec7cc199\": rpc error: code = NotFound desc = could not find container \"dbc27193c644946d801635eac3f4ac8fd7e93e4ac9ea0ee7cb027f5cec7cc199\": container with ID starting with 
dbc27193c644946d801635eac3f4ac8fd7e93e4ac9ea0ee7cb027f5cec7cc199 not found: ID does not exist" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.923226 4810 scope.go:117] "RemoveContainer" containerID="d8e4585cb9bc557ddc8d8dc697ba266361d794bec388e11f6081c9ec077a1726" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.923405 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 15:32:59 crc kubenswrapper[4810]: E0219 15:32:59.923932 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8e4585cb9bc557ddc8d8dc697ba266361d794bec388e11f6081c9ec077a1726\": container with ID starting with d8e4585cb9bc557ddc8d8dc697ba266361d794bec388e11f6081c9ec077a1726 not found: ID does not exist" containerID="d8e4585cb9bc557ddc8d8dc697ba266361d794bec388e11f6081c9ec077a1726" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.924139 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8e4585cb9bc557ddc8d8dc697ba266361d794bec388e11f6081c9ec077a1726"} err="failed to get container status \"d8e4585cb9bc557ddc8d8dc697ba266361d794bec388e11f6081c9ec077a1726\": rpc error: code = NotFound desc = could not find container \"d8e4585cb9bc557ddc8d8dc697ba266361d794bec388e11f6081c9ec077a1726\": container with ID starting with d8e4585cb9bc557ddc8d8dc697ba266361d794bec388e11f6081c9ec077a1726 not found: ID does not exist" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.953160 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 15:32:59 crc kubenswrapper[4810]: E0219 15:32:59.964464 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a3676ed-f06f-4dea-82a1-959716331113" containerName="setup-container" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.964495 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a3676ed-f06f-4dea-82a1-959716331113" containerName="setup-container" Feb 19 15:32:59 crc kubenswrapper[4810]: E0219 15:32:59.964512 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a3676ed-f06f-4dea-82a1-959716331113" containerName="rabbitmq" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.964518 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a3676ed-f06f-4dea-82a1-959716331113" containerName="rabbitmq" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.964780 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a3676ed-f06f-4dea-82a1-959716331113" containerName="rabbitmq" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.965851 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.967858 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.968029 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.968183 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.969073 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.969252 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.969376 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.969608 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-qxvfg" Feb 19 15:32:59 crc kubenswrapper[4810]: I0219 15:32:59.990128 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.051776 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.052292 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/03247cdb-4055-4d47-b433-848e363768ab-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.052396 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/03247cdb-4055-4d47-b433-848e363768ab-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.052446 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/03247cdb-4055-4d47-b433-848e363768ab-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.052476 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.052504 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/03247cdb-4055-4d47-b433-848e363768ab-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.052534 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58z2t\" (UniqueName: \"kubernetes.io/projected/03247cdb-4055-4d47-b433-848e363768ab-kube-api-access-58z2t\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.052555 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/03247cdb-4055-4d47-b433-848e363768ab-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.052577 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/03247cdb-4055-4d47-b433-848e363768ab-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.052615 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/03247cdb-4055-4d47-b433-848e363768ab-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.052635 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/03247cdb-4055-4d47-b433-848e363768ab-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.052666 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/03247cdb-4055-4d47-b433-848e363768ab-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: W0219 15:33:00.053792 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb86448c3_669a_4132_b8ab_4db06347fa10.slice/crio-399aaf0177600c18b47a56dbdca5d87c12cbf5e9ea295a08327d01e27b1b0566 WatchSource:0}: Error finding container 399aaf0177600c18b47a56dbdca5d87c12cbf5e9ea295a08327d01e27b1b0566: Status 404 returned error can't find the container with id 399aaf0177600c18b47a56dbdca5d87c12cbf5e9ea295a08327d01e27b1b0566 Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.154706 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/03247cdb-4055-4d47-b433-848e363768ab-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.155079 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/03247cdb-4055-4d47-b433-848e363768ab-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.155123 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/03247cdb-4055-4d47-b433-848e363768ab-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.155240 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/03247cdb-4055-4d47-b433-848e363768ab-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.155276 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/03247cdb-4055-4d47-b433-848e363768ab-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.155338 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/03247cdb-4055-4d47-b433-848e363768ab-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.155375 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.155409 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/03247cdb-4055-4d47-b433-848e363768ab-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.155444 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58z2t\" (UniqueName: \"kubernetes.io/projected/03247cdb-4055-4d47-b433-848e363768ab-kube-api-access-58z2t\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.155466 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/03247cdb-4055-4d47-b433-848e363768ab-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.155490 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/03247cdb-4055-4d47-b433-848e363768ab-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.156052 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.156135 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/03247cdb-4055-4d47-b433-848e363768ab-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.156053 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/03247cdb-4055-4d47-b433-848e363768ab-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.156747 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/03247cdb-4055-4d47-b433-848e363768ab-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.157224 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/03247cdb-4055-4d47-b433-848e363768ab-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.158772 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/03247cdb-4055-4d47-b433-848e363768ab-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.158824 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/03247cdb-4055-4d47-b433-848e363768ab-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.159261 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/03247cdb-4055-4d47-b433-848e363768ab-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.160074 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/03247cdb-4055-4d47-b433-848e363768ab-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.160729 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/03247cdb-4055-4d47-b433-848e363768ab-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.176186 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58z2t\" (UniqueName: \"kubernetes.io/projected/03247cdb-4055-4d47-b433-848e363768ab-kube-api-access-58z2t\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.184404 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"03247cdb-4055-4d47-b433-848e363768ab\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.286608 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.808532 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 15:33:00 crc kubenswrapper[4810]: I0219 15:33:00.875075 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b86448c3-669a-4132-b8ab-4db06347fa10","Type":"ContainerStarted","Data":"399aaf0177600c18b47a56dbdca5d87c12cbf5e9ea295a08327d01e27b1b0566"} Feb 19 15:33:01 crc kubenswrapper[4810]: I0219 15:33:01.449833 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a3676ed-f06f-4dea-82a1-959716331113" path="/var/lib/kubelet/pods/2a3676ed-f06f-4dea-82a1-959716331113/volumes" Feb 19 15:33:01 crc kubenswrapper[4810]: I0219 15:33:01.888210 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"03247cdb-4055-4d47-b433-848e363768ab","Type":"ContainerStarted","Data":"185b4c5740d4df6690def4329f9bd23e57daa2dbba3c22414e45dd1a66711b20"} Feb 19 15:33:01 crc kubenswrapper[4810]: I0219 15:33:01.890109 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b86448c3-669a-4132-b8ab-4db06347fa10","Type":"ContainerStarted","Data":"01f710572014ea30c64f1b2a62a873ecb5702426460d2b063c7df96d0df38fa3"} Feb 19 15:33:02 crc kubenswrapper[4810]: I0219 15:33:02.902592 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"03247cdb-4055-4d47-b433-848e363768ab","Type":"ContainerStarted","Data":"aeb8c5c6a71d86ecc6aba3fc315503cd296a83a4195aac56a6884ee1e7fef305"} Feb 19 15:33:02 crc kubenswrapper[4810]: I0219 15:33:02.921913 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-955rr" Feb 19 15:33:02 crc kubenswrapper[4810]: I0219 15:33:02.921996 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-955rr" Feb 19 15:33:04 crc kubenswrapper[4810]: I0219 15:33:04.000749 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-955rr" podUID="783f4f81-b0fb-49c4-9b07-a2715641355a" containerName="registry-server" probeResult="failure" output=< Feb 19 15:33:04 crc kubenswrapper[4810]: timeout: failed to connect service ":50051" 
within 1s Feb 19 15:33:04 crc kubenswrapper[4810]: > Feb 19 15:33:07 crc kubenswrapper[4810]: I0219 15:33:07.991620 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7754685579-pj5bz"] Feb 19 15:33:07 crc kubenswrapper[4810]: I0219 15:33:07.994667 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7754685579-pj5bz" Feb 19 15:33:07 crc kubenswrapper[4810]: I0219 15:33:07.998776 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 19 15:33:08 crc kubenswrapper[4810]: I0219 15:33:08.009590 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7754685579-pj5bz"] Feb 19 15:33:08 crc kubenswrapper[4810]: I0219 15:33:08.010797 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-dns-swift-storage-0\") pod \"dnsmasq-dns-7754685579-pj5bz\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " pod="openstack/dnsmasq-dns-7754685579-pj5bz" Feb 19 15:33:08 crc kubenswrapper[4810]: I0219 15:33:08.010906 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-openstack-edpm-ipam\") pod \"dnsmasq-dns-7754685579-pj5bz\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " pod="openstack/dnsmasq-dns-7754685579-pj5bz" Feb 19 15:33:08 crc kubenswrapper[4810]: I0219 15:33:08.011090 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgmt7\" (UniqueName: \"kubernetes.io/projected/0c3acd8a-696a-4a86-9052-03ef2cca79c7-kube-api-access-mgmt7\") pod \"dnsmasq-dns-7754685579-pj5bz\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " pod="openstack/dnsmasq-dns-7754685579-pj5bz" Feb 19 15:33:08 crc kubenswrapper[4810]: I0219 15:33:08.011284 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-ovsdbserver-nb\") pod \"dnsmasq-dns-7754685579-pj5bz\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " pod="openstack/dnsmasq-dns-7754685579-pj5bz" Feb 19 15:33:08 crc kubenswrapper[4810]: I0219 15:33:08.011443 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-ovsdbserver-sb\") pod \"dnsmasq-dns-7754685579-pj5bz\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " pod="openstack/dnsmasq-dns-7754685579-pj5bz" Feb 19 15:33:08 crc kubenswrapper[4810]: I0219 15:33:08.011617 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-dns-svc\") pod \"dnsmasq-dns-7754685579-pj5bz\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " pod="openstack/dnsmasq-dns-7754685579-pj5bz" Feb 19 15:33:08 crc kubenswrapper[4810]: I0219 15:33:08.011729 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-config\") pod \"dnsmasq-dns-7754685579-pj5bz\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " 
pod="openstack/dnsmasq-dns-7754685579-pj5bz" Feb 19 15:33:08 crc kubenswrapper[4810]: I0219 15:33:08.113148 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-dns-swift-storage-0\") pod \"dnsmasq-dns-7754685579-pj5bz\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " pod="openstack/dnsmasq-dns-7754685579-pj5bz" Feb 19 15:33:08 crc kubenswrapper[4810]: I0219 15:33:08.113229 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-openstack-edpm-ipam\") pod \"dnsmasq-dns-7754685579-pj5bz\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " pod="openstack/dnsmasq-dns-7754685579-pj5bz" Feb 19 15:33:08 crc kubenswrapper[4810]: I0219 15:33:08.113300 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgmt7\" (UniqueName: \"kubernetes.io/projected/0c3acd8a-696a-4a86-9052-03ef2cca79c7-kube-api-access-mgmt7\") pod \"dnsmasq-dns-7754685579-pj5bz\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " pod="openstack/dnsmasq-dns-7754685579-pj5bz" Feb 19 15:33:08 crc kubenswrapper[4810]: I0219 15:33:08.113367 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-ovsdbserver-nb\") pod \"dnsmasq-dns-7754685579-pj5bz\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " pod="openstack/dnsmasq-dns-7754685579-pj5bz" Feb 19 15:33:08 crc kubenswrapper[4810]: I0219 15:33:08.113413 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-ovsdbserver-sb\") pod \"dnsmasq-dns-7754685579-pj5bz\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " pod="openstack/dnsmasq-dns-7754685579-pj5bz" Feb 19 15:33:08 crc kubenswrapper[4810]: I0219 15:33:08.113473 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-dns-svc\") pod \"dnsmasq-dns-7754685579-pj5bz\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " pod="openstack/dnsmasq-dns-7754685579-pj5bz" Feb 19 15:33:08 crc kubenswrapper[4810]: I0219 15:33:08.113520 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-config\") pod \"dnsmasq-dns-7754685579-pj5bz\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " pod="openstack/dnsmasq-dns-7754685579-pj5bz" Feb 19 15:33:08 crc kubenswrapper[4810]: I0219 15:33:08.114270 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-dns-swift-storage-0\") pod \"dnsmasq-dns-7754685579-pj5bz\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " pod="openstack/dnsmasq-dns-7754685579-pj5bz" Feb 19 15:33:08 crc kubenswrapper[4810]: I0219 15:33:08.114275 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-openstack-edpm-ipam\") pod \"dnsmasq-dns-7754685579-pj5bz\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " pod="openstack/dnsmasq-dns-7754685579-pj5bz" 
Feb 19 15:33:08 crc kubenswrapper[4810]: I0219 15:33:08.114438 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-ovsdbserver-sb\") pod \"dnsmasq-dns-7754685579-pj5bz\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " pod="openstack/dnsmasq-dns-7754685579-pj5bz" Feb 19 15:33:08 crc kubenswrapper[4810]: I0219 15:33:08.114559 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-ovsdbserver-nb\") pod \"dnsmasq-dns-7754685579-pj5bz\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " pod="openstack/dnsmasq-dns-7754685579-pj5bz" Feb 19 15:33:08 crc kubenswrapper[4810]: I0219 15:33:08.114579 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-dns-svc\") pod \"dnsmasq-dns-7754685579-pj5bz\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " pod="openstack/dnsmasq-dns-7754685579-pj5bz" Feb 19 15:33:08 crc kubenswrapper[4810]: I0219 15:33:08.114639 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-config\") pod \"dnsmasq-dns-7754685579-pj5bz\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " pod="openstack/dnsmasq-dns-7754685579-pj5bz" Feb 19 15:33:08 crc kubenswrapper[4810]: I0219 15:33:08.135681 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgmt7\" (UniqueName: \"kubernetes.io/projected/0c3acd8a-696a-4a86-9052-03ef2cca79c7-kube-api-access-mgmt7\") pod \"dnsmasq-dns-7754685579-pj5bz\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " pod="openstack/dnsmasq-dns-7754685579-pj5bz" Feb 19 15:33:08 crc kubenswrapper[4810]: I0219 15:33:08.321444 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7754685579-pj5bz" Feb 19 15:33:08 crc kubenswrapper[4810]: I0219 15:33:08.802388 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7754685579-pj5bz"] Feb 19 15:33:08 crc kubenswrapper[4810]: I0219 15:33:08.967791 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7754685579-pj5bz" event={"ID":"0c3acd8a-696a-4a86-9052-03ef2cca79c7","Type":"ContainerStarted","Data":"e6484f856f1dabe83fb7c154c7f3ffdc7af18bb3f81e4c632668b443bfbf7736"} Feb 19 15:33:09 crc kubenswrapper[4810]: I0219 15:33:09.981349 4810 generic.go:334] "Generic (PLEG): container finished" podID="0c3acd8a-696a-4a86-9052-03ef2cca79c7" containerID="493902d9834ff016c9ae111e5d259b86d62b85d4780006bae12de68a9a933a23" exitCode=0 Feb 19 15:33:09 crc kubenswrapper[4810]: I0219 15:33:09.981402 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7754685579-pj5bz" event={"ID":"0c3acd8a-696a-4a86-9052-03ef2cca79c7","Type":"ContainerDied","Data":"493902d9834ff016c9ae111e5d259b86d62b85d4780006bae12de68a9a933a23"} Feb 19 15:33:10 crc kubenswrapper[4810]: I0219 15:33:10.995913 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7754685579-pj5bz" event={"ID":"0c3acd8a-696a-4a86-9052-03ef2cca79c7","Type":"ContainerStarted","Data":"7c6881386ea9e17edcb266f7d41c368100ad0a747bba3ae5f2c27278cd9f5864"} Feb 19 15:33:10 crc kubenswrapper[4810]: I0219 15:33:10.996177 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7754685579-pj5bz" Feb 19 15:33:11 crc kubenswrapper[4810]: I0219 15:33:11.038403 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7754685579-pj5bz" podStartSLOduration=4.038382157 podStartE2EDuration="4.038382157s" podCreationTimestamp="2026-02-19 15:33:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:33:11.024234528 +0000 UTC m=+1420.506264682" watchObservedRunningTime="2026-02-19 15:33:11.038382157 +0000 UTC m=+1420.520412291" Feb 19 15:33:13 crc kubenswrapper[4810]: I0219 15:33:13.005790 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-955rr" Feb 19 15:33:13 crc kubenswrapper[4810]: I0219 15:33:13.055397 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-955rr" Feb 19 15:33:13 crc kubenswrapper[4810]: I0219 15:33:13.255790 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-955rr"] Feb 19 15:33:14 crc kubenswrapper[4810]: I0219 15:33:14.033512 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-955rr" podUID="783f4f81-b0fb-49c4-9b07-a2715641355a" containerName="registry-server" containerID="cri-o://1a4461610ed2fb313a8e62f0315dc40b8ce3a3e6a0f3690df259b0b16d63375f" gracePeriod=2 Feb 19 15:33:14 crc kubenswrapper[4810]: I0219 15:33:14.562976 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-955rr" Feb 19 15:33:14 crc kubenswrapper[4810]: I0219 15:33:14.756076 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/783f4f81-b0fb-49c4-9b07-a2715641355a-utilities\") pod \"783f4f81-b0fb-49c4-9b07-a2715641355a\" (UID: \"783f4f81-b0fb-49c4-9b07-a2715641355a\") " Feb 19 15:33:14 crc kubenswrapper[4810]: I0219 15:33:14.756176 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrjnp\" (UniqueName: \"kubernetes.io/projected/783f4f81-b0fb-49c4-9b07-a2715641355a-kube-api-access-hrjnp\") pod \"783f4f81-b0fb-49c4-9b07-a2715641355a\" (UID: \"783f4f81-b0fb-49c4-9b07-a2715641355a\") " Feb 19 15:33:14 crc kubenswrapper[4810]: I0219 15:33:14.756591 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/783f4f81-b0fb-49c4-9b07-a2715641355a-catalog-content\") pod \"783f4f81-b0fb-49c4-9b07-a2715641355a\" (UID: \"783f4f81-b0fb-49c4-9b07-a2715641355a\") " Feb 19 15:33:14 crc kubenswrapper[4810]: I0219 15:33:14.757403 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/783f4f81-b0fb-49c4-9b07-a2715641355a-utilities" (OuterVolumeSpecName: "utilities") pod "783f4f81-b0fb-49c4-9b07-a2715641355a" (UID: "783f4f81-b0fb-49c4-9b07-a2715641355a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:33:14 crc kubenswrapper[4810]: I0219 15:33:14.769121 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/783f4f81-b0fb-49c4-9b07-a2715641355a-kube-api-access-hrjnp" (OuterVolumeSpecName: "kube-api-access-hrjnp") pod "783f4f81-b0fb-49c4-9b07-a2715641355a" (UID: "783f4f81-b0fb-49c4-9b07-a2715641355a"). InnerVolumeSpecName "kube-api-access-hrjnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:33:14 crc kubenswrapper[4810]: I0219 15:33:14.859450 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/783f4f81-b0fb-49c4-9b07-a2715641355a-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 15:33:14 crc kubenswrapper[4810]: I0219 15:33:14.859786 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrjnp\" (UniqueName: \"kubernetes.io/projected/783f4f81-b0fb-49c4-9b07-a2715641355a-kube-api-access-hrjnp\") on node \"crc\" DevicePath \"\"" Feb 19 15:33:14 crc kubenswrapper[4810]: I0219 15:33:14.880098 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/783f4f81-b0fb-49c4-9b07-a2715641355a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "783f4f81-b0fb-49c4-9b07-a2715641355a" (UID: "783f4f81-b0fb-49c4-9b07-a2715641355a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:33:14 crc kubenswrapper[4810]: I0219 15:33:14.961116 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/783f4f81-b0fb-49c4-9b07-a2715641355a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 15:33:15 crc kubenswrapper[4810]: I0219 15:33:15.056595 4810 generic.go:334] "Generic (PLEG): container finished" podID="783f4f81-b0fb-49c4-9b07-a2715641355a" containerID="1a4461610ed2fb313a8e62f0315dc40b8ce3a3e6a0f3690df259b0b16d63375f" exitCode=0 Feb 19 15:33:15 crc kubenswrapper[4810]: I0219 15:33:15.056664 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-955rr" event={"ID":"783f4f81-b0fb-49c4-9b07-a2715641355a","Type":"ContainerDied","Data":"1a4461610ed2fb313a8e62f0315dc40b8ce3a3e6a0f3690df259b0b16d63375f"} Feb 19 15:33:15 crc kubenswrapper[4810]: I0219 15:33:15.056723 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-955rr" event={"ID":"783f4f81-b0fb-49c4-9b07-a2715641355a","Type":"ContainerDied","Data":"75d6325f144c755c14a8b6930d4d42def83ced3289d5cc9615e327af0d069534"} Feb 19 15:33:15 crc kubenswrapper[4810]: I0219 15:33:15.056741 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-955rr" Feb 19 15:33:15 crc kubenswrapper[4810]: I0219 15:33:15.056996 4810 scope.go:117] "RemoveContainer" containerID="1a4461610ed2fb313a8e62f0315dc40b8ce3a3e6a0f3690df259b0b16d63375f" Feb 19 15:33:15 crc kubenswrapper[4810]: I0219 15:33:15.104149 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-955rr"] Feb 19 15:33:15 crc kubenswrapper[4810]: I0219 15:33:15.106917 4810 scope.go:117] "RemoveContainer" containerID="6bc2e1d7197343d97aad8e795e58bf89ffecf5750c4972eb72fee6305a410fbc" Feb 19 15:33:15 crc kubenswrapper[4810]: I0219 15:33:15.123943 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-955rr"] Feb 19 15:33:15 crc kubenswrapper[4810]: I0219 15:33:15.136203 4810 scope.go:117] "RemoveContainer" containerID="761f02a235bc3255020cf4a80c96bbf0e5db299cb80ff40640ab5d09947f186d" Feb 19 15:33:15 crc kubenswrapper[4810]: I0219 15:33:15.187969 4810 scope.go:117] "RemoveContainer" containerID="1a4461610ed2fb313a8e62f0315dc40b8ce3a3e6a0f3690df259b0b16d63375f" Feb 19 15:33:15 crc kubenswrapper[4810]: E0219 15:33:15.188466 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a4461610ed2fb313a8e62f0315dc40b8ce3a3e6a0f3690df259b0b16d63375f\": container with ID starting with 1a4461610ed2fb313a8e62f0315dc40b8ce3a3e6a0f3690df259b0b16d63375f not found: ID does not exist" containerID="1a4461610ed2fb313a8e62f0315dc40b8ce3a3e6a0f3690df259b0b16d63375f" Feb 19 15:33:15 crc kubenswrapper[4810]: I0219 15:33:15.188525 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a4461610ed2fb313a8e62f0315dc40b8ce3a3e6a0f3690df259b0b16d63375f"} err="failed to get container status \"1a4461610ed2fb313a8e62f0315dc40b8ce3a3e6a0f3690df259b0b16d63375f\": rpc error: code = NotFound desc = could not find container \"1a4461610ed2fb313a8e62f0315dc40b8ce3a3e6a0f3690df259b0b16d63375f\": container with ID starting with 1a4461610ed2fb313a8e62f0315dc40b8ce3a3e6a0f3690df259b0b16d63375f not found: ID does not exist" Feb 19 15:33:15 crc 
kubenswrapper[4810]: I0219 15:33:15.188563 4810 scope.go:117] "RemoveContainer" containerID="6bc2e1d7197343d97aad8e795e58bf89ffecf5750c4972eb72fee6305a410fbc" Feb 19 15:33:15 crc kubenswrapper[4810]: E0219 15:33:15.188976 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bc2e1d7197343d97aad8e795e58bf89ffecf5750c4972eb72fee6305a410fbc\": container with ID starting with 6bc2e1d7197343d97aad8e795e58bf89ffecf5750c4972eb72fee6305a410fbc not found: ID does not exist" containerID="6bc2e1d7197343d97aad8e795e58bf89ffecf5750c4972eb72fee6305a410fbc" Feb 19 15:33:15 crc kubenswrapper[4810]: I0219 15:33:15.189023 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bc2e1d7197343d97aad8e795e58bf89ffecf5750c4972eb72fee6305a410fbc"} err="failed to get container status \"6bc2e1d7197343d97aad8e795e58bf89ffecf5750c4972eb72fee6305a410fbc\": rpc error: code = NotFound desc = could not find container \"6bc2e1d7197343d97aad8e795e58bf89ffecf5750c4972eb72fee6305a410fbc\": container with ID starting with 6bc2e1d7197343d97aad8e795e58bf89ffecf5750c4972eb72fee6305a410fbc not found: ID does not exist" Feb 19 15:33:15 crc kubenswrapper[4810]: I0219 15:33:15.189050 4810 scope.go:117] "RemoveContainer" containerID="761f02a235bc3255020cf4a80c96bbf0e5db299cb80ff40640ab5d09947f186d" Feb 19 15:33:15 crc kubenswrapper[4810]: E0219 15:33:15.189373 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"761f02a235bc3255020cf4a80c96bbf0e5db299cb80ff40640ab5d09947f186d\": container with ID starting with 761f02a235bc3255020cf4a80c96bbf0e5db299cb80ff40640ab5d09947f186d not found: ID does not exist" containerID="761f02a235bc3255020cf4a80c96bbf0e5db299cb80ff40640ab5d09947f186d" Feb 19 15:33:15 crc kubenswrapper[4810]: I0219 15:33:15.189435 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"761f02a235bc3255020cf4a80c96bbf0e5db299cb80ff40640ab5d09947f186d"} err="failed to get container status \"761f02a235bc3255020cf4a80c96bbf0e5db299cb80ff40640ab5d09947f186d\": rpc error: code = NotFound desc = could not find container \"761f02a235bc3255020cf4a80c96bbf0e5db299cb80ff40640ab5d09947f186d\": container with ID starting with 761f02a235bc3255020cf4a80c96bbf0e5db299cb80ff40640ab5d09947f186d not found: ID does not exist" Feb 19 15:33:15 crc kubenswrapper[4810]: I0219 15:33:15.454311 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="783f4f81-b0fb-49c4-9b07-a2715641355a" path="/var/lib/kubelet/pods/783f4f81-b0fb-49c4-9b07-a2715641355a/volumes" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.323651 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7754685579-pj5bz" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.414725 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7dc9fb8849-t2gx5"] Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.415039 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" podUID="275a98c0-8e6a-4587-8628-54f70b836615" containerName="dnsmasq-dns" containerID="cri-o://ec7d0dde11f4d91dfc1709e9d373983bc5e0931480a8ae6b522e10ab70a1f7fc" gracePeriod=10 Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.591276 4810 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-685d6df875-6hghq"] Feb 19 15:33:18 crc kubenswrapper[4810]: E0219 15:33:18.591843 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783f4f81-b0fb-49c4-9b07-a2715641355a" containerName="registry-server" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.591865 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="783f4f81-b0fb-49c4-9b07-a2715641355a" containerName="registry-server" Feb 19 15:33:18 crc kubenswrapper[4810]: E0219 15:33:18.591886 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783f4f81-b0fb-49c4-9b07-a2715641355a" containerName="extract-utilities" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.591895 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="783f4f81-b0fb-49c4-9b07-a2715641355a" containerName="extract-utilities" Feb 19 15:33:18 crc kubenswrapper[4810]: E0219 15:33:18.591918 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783f4f81-b0fb-49c4-9b07-a2715641355a" containerName="extract-content" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.591925 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="783f4f81-b0fb-49c4-9b07-a2715641355a" containerName="extract-content" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.592173 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="783f4f81-b0fb-49c4-9b07-a2715641355a" containerName="registry-server" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.593694 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685d6df875-6hghq" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.601067 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-685d6df875-6hghq"] Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.749391 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gxvs\" (UniqueName: \"kubernetes.io/projected/7c074feb-2f7c-4f84-9ea8-5a9062e6b10a-kube-api-access-7gxvs\") pod \"dnsmasq-dns-685d6df875-6hghq\" (UID: \"7c074feb-2f7c-4f84-9ea8-5a9062e6b10a\") " pod="openstack/dnsmasq-dns-685d6df875-6hghq" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.749453 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7c074feb-2f7c-4f84-9ea8-5a9062e6b10a-dns-swift-storage-0\") pod \"dnsmasq-dns-685d6df875-6hghq\" (UID: \"7c074feb-2f7c-4f84-9ea8-5a9062e6b10a\") " pod="openstack/dnsmasq-dns-685d6df875-6hghq" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.749961 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c074feb-2f7c-4f84-9ea8-5a9062e6b10a-dns-svc\") pod \"dnsmasq-dns-685d6df875-6hghq\" (UID: \"7c074feb-2f7c-4f84-9ea8-5a9062e6b10a\") " pod="openstack/dnsmasq-dns-685d6df875-6hghq" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.750008 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c074feb-2f7c-4f84-9ea8-5a9062e6b10a-config\") pod \"dnsmasq-dns-685d6df875-6hghq\" (UID: \"7c074feb-2f7c-4f84-9ea8-5a9062e6b10a\") " pod="openstack/dnsmasq-dns-685d6df875-6hghq" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.750110 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7c074feb-2f7c-4f84-9ea8-5a9062e6b10a-openstack-edpm-ipam\") pod \"dnsmasq-dns-685d6df875-6hghq\" (UID: \"7c074feb-2f7c-4f84-9ea8-5a9062e6b10a\") " pod="openstack/dnsmasq-dns-685d6df875-6hghq" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.750161 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c074feb-2f7c-4f84-9ea8-5a9062e6b10a-ovsdbserver-sb\") pod \"dnsmasq-dns-685d6df875-6hghq\" (UID: \"7c074feb-2f7c-4f84-9ea8-5a9062e6b10a\") " pod="openstack/dnsmasq-dns-685d6df875-6hghq" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.750235 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7c074feb-2f7c-4f84-9ea8-5a9062e6b10a-ovsdbserver-nb\") pod \"dnsmasq-dns-685d6df875-6hghq\" (UID: \"7c074feb-2f7c-4f84-9ea8-5a9062e6b10a\") " pod="openstack/dnsmasq-dns-685d6df875-6hghq" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.853542 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7c074feb-2f7c-4f84-9ea8-5a9062e6b10a-ovsdbserver-nb\") pod \"dnsmasq-dns-685d6df875-6hghq\" (UID: \"7c074feb-2f7c-4f84-9ea8-5a9062e6b10a\") " pod="openstack/dnsmasq-dns-685d6df875-6hghq" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.853930 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gxvs\" (UniqueName: \"kubernetes.io/projected/7c074feb-2f7c-4f84-9ea8-5a9062e6b10a-kube-api-access-7gxvs\") pod \"dnsmasq-dns-685d6df875-6hghq\" (UID: \"7c074feb-2f7c-4f84-9ea8-5a9062e6b10a\") " pod="openstack/dnsmasq-dns-685d6df875-6hghq" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.853958 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7c074feb-2f7c-4f84-9ea8-5a9062e6b10a-dns-swift-storage-0\") pod \"dnsmasq-dns-685d6df875-6hghq\" (UID: \"7c074feb-2f7c-4f84-9ea8-5a9062e6b10a\") " pod="openstack/dnsmasq-dns-685d6df875-6hghq" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.853991 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c074feb-2f7c-4f84-9ea8-5a9062e6b10a-dns-svc\") pod \"dnsmasq-dns-685d6df875-6hghq\" (UID: \"7c074feb-2f7c-4f84-9ea8-5a9062e6b10a\") " pod="openstack/dnsmasq-dns-685d6df875-6hghq" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.854020 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c074feb-2f7c-4f84-9ea8-5a9062e6b10a-config\") pod \"dnsmasq-dns-685d6df875-6hghq\" (UID: \"7c074feb-2f7c-4f84-9ea8-5a9062e6b10a\") " pod="openstack/dnsmasq-dns-685d6df875-6hghq" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.854087 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7c074feb-2f7c-4f84-9ea8-5a9062e6b10a-openstack-edpm-ipam\") pod \"dnsmasq-dns-685d6df875-6hghq\" (UID: \"7c074feb-2f7c-4f84-9ea8-5a9062e6b10a\") " pod="openstack/dnsmasq-dns-685d6df875-6hghq" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.854121 4810 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c074feb-2f7c-4f84-9ea8-5a9062e6b10a-ovsdbserver-sb\") pod \"dnsmasq-dns-685d6df875-6hghq\" (UID: \"7c074feb-2f7c-4f84-9ea8-5a9062e6b10a\") " pod="openstack/dnsmasq-dns-685d6df875-6hghq" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.854893 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7c074feb-2f7c-4f84-9ea8-5a9062e6b10a-dns-swift-storage-0\") pod \"dnsmasq-dns-685d6df875-6hghq\" (UID: \"7c074feb-2f7c-4f84-9ea8-5a9062e6b10a\") " pod="openstack/dnsmasq-dns-685d6df875-6hghq" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.855394 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7c074feb-2f7c-4f84-9ea8-5a9062e6b10a-ovsdbserver-nb\") pod \"dnsmasq-dns-685d6df875-6hghq\" (UID: \"7c074feb-2f7c-4f84-9ea8-5a9062e6b10a\") " pod="openstack/dnsmasq-dns-685d6df875-6hghq" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.856166 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c074feb-2f7c-4f84-9ea8-5a9062e6b10a-config\") pod \"dnsmasq-dns-685d6df875-6hghq\" (UID: \"7c074feb-2f7c-4f84-9ea8-5a9062e6b10a\") " pod="openstack/dnsmasq-dns-685d6df875-6hghq" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.856597 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c074feb-2f7c-4f84-9ea8-5a9062e6b10a-dns-svc\") pod \"dnsmasq-dns-685d6df875-6hghq\" (UID: \"7c074feb-2f7c-4f84-9ea8-5a9062e6b10a\") " pod="openstack/dnsmasq-dns-685d6df875-6hghq" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.856664 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7c074feb-2f7c-4f84-9ea8-5a9062e6b10a-openstack-edpm-ipam\") pod \"dnsmasq-dns-685d6df875-6hghq\" (UID: \"7c074feb-2f7c-4f84-9ea8-5a9062e6b10a\") " pod="openstack/dnsmasq-dns-685d6df875-6hghq" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.858816 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c074feb-2f7c-4f84-9ea8-5a9062e6b10a-ovsdbserver-sb\") pod \"dnsmasq-dns-685d6df875-6hghq\" (UID: \"7c074feb-2f7c-4f84-9ea8-5a9062e6b10a\") " pod="openstack/dnsmasq-dns-685d6df875-6hghq" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.885384 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gxvs\" (UniqueName: \"kubernetes.io/projected/7c074feb-2f7c-4f84-9ea8-5a9062e6b10a-kube-api-access-7gxvs\") pod \"dnsmasq-dns-685d6df875-6hghq\" (UID: \"7c074feb-2f7c-4f84-9ea8-5a9062e6b10a\") " pod="openstack/dnsmasq-dns-685d6df875-6hghq" Feb 19 15:33:18 crc kubenswrapper[4810]: I0219 15:33:18.915768 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685d6df875-6hghq" Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.085155 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.137311 4810 generic.go:334] "Generic (PLEG): container finished" podID="275a98c0-8e6a-4587-8628-54f70b836615" containerID="ec7d0dde11f4d91dfc1709e9d373983bc5e0931480a8ae6b522e10ab70a1f7fc" exitCode=0 Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.137573 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" event={"ID":"275a98c0-8e6a-4587-8628-54f70b836615","Type":"ContainerDied","Data":"ec7d0dde11f4d91dfc1709e9d373983bc5e0931480a8ae6b522e10ab70a1f7fc"} Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.137598 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" event={"ID":"275a98c0-8e6a-4587-8628-54f70b836615","Type":"ContainerDied","Data":"8c49db3d1d18d97ef799aa2fe03c13f1f635369a4b0477795ad010b6baa6d941"} Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.137616 4810 scope.go:117] "RemoveContainer" containerID="ec7d0dde11f4d91dfc1709e9d373983bc5e0931480a8ae6b522e10ab70a1f7fc" Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.137719 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7dc9fb8849-t2gx5" Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.161620 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-dns-svc\") pod \"275a98c0-8e6a-4587-8628-54f70b836615\" (UID: \"275a98c0-8e6a-4587-8628-54f70b836615\") " Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.161720 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-config\") pod \"275a98c0-8e6a-4587-8628-54f70b836615\" (UID: \"275a98c0-8e6a-4587-8628-54f70b836615\") " Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.161883 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9xn9\" (UniqueName: \"kubernetes.io/projected/275a98c0-8e6a-4587-8628-54f70b836615-kube-api-access-t9xn9\") pod \"275a98c0-8e6a-4587-8628-54f70b836615\" (UID: \"275a98c0-8e6a-4587-8628-54f70b836615\") " Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.161913 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-dns-swift-storage-0\") pod \"275a98c0-8e6a-4587-8628-54f70b836615\" (UID: \"275a98c0-8e6a-4587-8628-54f70b836615\") " Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.162038 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-ovsdbserver-nb\") pod \"275a98c0-8e6a-4587-8628-54f70b836615\" (UID: \"275a98c0-8e6a-4587-8628-54f70b836615\") " Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.162077 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-ovsdbserver-sb\") pod \"275a98c0-8e6a-4587-8628-54f70b836615\" (UID: \"275a98c0-8e6a-4587-8628-54f70b836615\") " Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.168266 4810 operation_generator.go:803] 
Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.168266 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/275a98c0-8e6a-4587-8628-54f70b836615-kube-api-access-t9xn9" (OuterVolumeSpecName: "kube-api-access-t9xn9") pod "275a98c0-8e6a-4587-8628-54f70b836615" (UID: "275a98c0-8e6a-4587-8628-54f70b836615"). InnerVolumeSpecName "kube-api-access-t9xn9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.188342 4810 scope.go:117] "RemoveContainer" containerID="87b0b7637ffa6ceedf744033bed56aa64044ea8ed1fd841f1129813c92a9043f"
Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.221000 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-config" (OuterVolumeSpecName: "config") pod "275a98c0-8e6a-4587-8628-54f70b836615" (UID: "275a98c0-8e6a-4587-8628-54f70b836615"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.222092 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "275a98c0-8e6a-4587-8628-54f70b836615" (UID: "275a98c0-8e6a-4587-8628-54f70b836615"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.239633 4810 scope.go:117] "RemoveContainer" containerID="ec7d0dde11f4d91dfc1709e9d373983bc5e0931480a8ae6b522e10ab70a1f7fc"
Feb 19 15:33:19 crc kubenswrapper[4810]: E0219 15:33:19.240047 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec7d0dde11f4d91dfc1709e9d373983bc5e0931480a8ae6b522e10ab70a1f7fc\": container with ID starting with ec7d0dde11f4d91dfc1709e9d373983bc5e0931480a8ae6b522e10ab70a1f7fc not found: ID does not exist" containerID="ec7d0dde11f4d91dfc1709e9d373983bc5e0931480a8ae6b522e10ab70a1f7fc"
Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.240090 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec7d0dde11f4d91dfc1709e9d373983bc5e0931480a8ae6b522e10ab70a1f7fc"} err="failed to get container status \"ec7d0dde11f4d91dfc1709e9d373983bc5e0931480a8ae6b522e10ab70a1f7fc\": rpc error: code = NotFound desc = could not find container \"ec7d0dde11f4d91dfc1709e9d373983bc5e0931480a8ae6b522e10ab70a1f7fc\": container with ID starting with ec7d0dde11f4d91dfc1709e9d373983bc5e0931480a8ae6b522e10ab70a1f7fc not found: ID does not exist"
Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.240115 4810 scope.go:117] "RemoveContainer" containerID="87b0b7637ffa6ceedf744033bed56aa64044ea8ed1fd841f1129813c92a9043f"
Feb 19 15:33:19 crc kubenswrapper[4810]: E0219 15:33:19.240644 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87b0b7637ffa6ceedf744033bed56aa64044ea8ed1fd841f1129813c92a9043f\": container with ID starting with 87b0b7637ffa6ceedf744033bed56aa64044ea8ed1fd841f1129813c92a9043f not found: ID does not exist" containerID="87b0b7637ffa6ceedf744033bed56aa64044ea8ed1fd841f1129813c92a9043f"
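The two NotFound errors just above are a benign race rather than a failure: by the time scope.go:117 retries RemoveContainer, CRI-O has already deleted both containers, so the follow-up ContainerStatus lookup fails with rpc code = NotFound and the kubelet simply records the outcome and moves on. A sketch of the idempotent-delete pattern that makes such retries safe, with a stubbed container store standing in for the real CRI runtime (illustrative only, not kubelet's actual code):

package main

import (
	"errors"
	"fmt"
)

var errNotFound = errors.New("NotFound: ID does not exist")

// removeContainer stands in for a CRI runtime call; the real kubelet talks
// to CRI-O over gRPC (ContainerStatus / RemoveContainer).
func removeContainer(id string, store map[string]bool) error {
	if !store[id] {
		return fmt.Errorf("could not find container %q: %w", id, errNotFound)
	}
	delete(store, id)
	return nil
}

// cleanup treats "already gone" as success, which makes retries safe.
func cleanup(id string, store map[string]bool) error {
	err := removeContainer(id, store)
	if errors.Is(err, errNotFound) {
		return nil // removed by an earlier pass or by the runtime's own GC
	}
	return err
}

func main() {
	state := map[string]bool{"ec7d0dde11f4": true}
	fmt.Println(cleanup("ec7d0dde11f4", state)) // <nil>
	fmt.Println(cleanup("ec7d0dde11f4", state)) // <nil> again: second delete is a no-op
}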
\"87b0b7637ffa6ceedf744033bed56aa64044ea8ed1fd841f1129813c92a9043f\": rpc error: code = NotFound desc = could not find container \"87b0b7637ffa6ceedf744033bed56aa64044ea8ed1fd841f1129813c92a9043f\": container with ID starting with 87b0b7637ffa6ceedf744033bed56aa64044ea8ed1fd841f1129813c92a9043f not found: ID does not exist" Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.243976 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "275a98c0-8e6a-4587-8628-54f70b836615" (UID: "275a98c0-8e6a-4587-8628-54f70b836615"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.245438 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "275a98c0-8e6a-4587-8628-54f70b836615" (UID: "275a98c0-8e6a-4587-8628-54f70b836615"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.247209 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "275a98c0-8e6a-4587-8628-54f70b836615" (UID: "275a98c0-8e6a-4587-8628-54f70b836615"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.265419 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9xn9\" (UniqueName: \"kubernetes.io/projected/275a98c0-8e6a-4587-8628-54f70b836615-kube-api-access-t9xn9\") on node \"crc\" DevicePath \"\"" Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.265466 4810 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.265477 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.265485 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.265494 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.265503 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/275a98c0-8e6a-4587-8628-54f70b836615-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.482237 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7dc9fb8849-t2gx5"] Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.496016 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-7dc9fb8849-t2gx5"] Feb 19 15:33:19 crc kubenswrapper[4810]: W0219 15:33:19.522888 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c074feb_2f7c_4f84_9ea8_5a9062e6b10a.slice/crio-6c77a5fd184196ecb07c7282f62afdb7d7e64c0c1fbb7ec04e0521259c9ec772 WatchSource:0}: Error finding container 6c77a5fd184196ecb07c7282f62afdb7d7e64c0c1fbb7ec04e0521259c9ec772: Status 404 returned error can't find the container with id 6c77a5fd184196ecb07c7282f62afdb7d7e64c0c1fbb7ec04e0521259c9ec772 Feb 19 15:33:19 crc kubenswrapper[4810]: I0219 15:33:19.525980 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-685d6df875-6hghq"] Feb 19 15:33:20 crc kubenswrapper[4810]: I0219 15:33:20.151767 4810 generic.go:334] "Generic (PLEG): container finished" podID="7c074feb-2f7c-4f84-9ea8-5a9062e6b10a" containerID="a696b0db87b83e84ced11bfb6a883121309718434a9f476eae5a74820db73b13" exitCode=0 Feb 19 15:33:20 crc kubenswrapper[4810]: I0219 15:33:20.151849 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685d6df875-6hghq" event={"ID":"7c074feb-2f7c-4f84-9ea8-5a9062e6b10a","Type":"ContainerDied","Data":"a696b0db87b83e84ced11bfb6a883121309718434a9f476eae5a74820db73b13"} Feb 19 15:33:20 crc kubenswrapper[4810]: I0219 15:33:20.151899 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685d6df875-6hghq" event={"ID":"7c074feb-2f7c-4f84-9ea8-5a9062e6b10a","Type":"ContainerStarted","Data":"6c77a5fd184196ecb07c7282f62afdb7d7e64c0c1fbb7ec04e0521259c9ec772"} Feb 19 15:33:21 crc kubenswrapper[4810]: I0219 15:33:21.170304 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685d6df875-6hghq" event={"ID":"7c074feb-2f7c-4f84-9ea8-5a9062e6b10a","Type":"ContainerStarted","Data":"00c970a164c833e0d9c62aa4f69220908502eb8fc2336adb5ca376b8c779d390"} Feb 19 15:33:21 crc kubenswrapper[4810]: I0219 15:33:21.170823 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-685d6df875-6hghq" Feb 19 15:33:21 crc kubenswrapper[4810]: I0219 15:33:21.225251 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-685d6df875-6hghq" podStartSLOduration=3.225229719 podStartE2EDuration="3.225229719s" podCreationTimestamp="2026-02-19 15:33:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:33:21.204912756 +0000 UTC m=+1430.686942900" watchObservedRunningTime="2026-02-19 15:33:21.225229719 +0000 UTC m=+1430.707259853" Feb 19 15:33:21 crc kubenswrapper[4810]: I0219 15:33:21.454874 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="275a98c0-8e6a-4587-8628-54f70b836615" path="/var/lib/kubelet/pods/275a98c0-8e6a-4587-8628-54f70b836615/volumes" Feb 19 15:33:28 crc kubenswrapper[4810]: I0219 15:33:28.918587 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-685d6df875-6hghq" Feb 19 15:33:29 crc kubenswrapper[4810]: I0219 15:33:29.006059 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7754685579-pj5bz"] Feb 19 15:33:29 crc kubenswrapper[4810]: I0219 15:33:29.006559 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7754685579-pj5bz" podUID="0c3acd8a-696a-4a86-9052-03ef2cca79c7" containerName="dnsmasq-dns" 
containerID="cri-o://7c6881386ea9e17edcb266f7d41c368100ad0a747bba3ae5f2c27278cd9f5864" gracePeriod=10 Feb 19 15:33:29 crc kubenswrapper[4810]: I0219 15:33:29.283032 4810 generic.go:334] "Generic (PLEG): container finished" podID="0c3acd8a-696a-4a86-9052-03ef2cca79c7" containerID="7c6881386ea9e17edcb266f7d41c368100ad0a747bba3ae5f2c27278cd9f5864" exitCode=0 Feb 19 15:33:29 crc kubenswrapper[4810]: I0219 15:33:29.283303 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7754685579-pj5bz" event={"ID":"0c3acd8a-696a-4a86-9052-03ef2cca79c7","Type":"ContainerDied","Data":"7c6881386ea9e17edcb266f7d41c368100ad0a747bba3ae5f2c27278cd9f5864"} Feb 19 15:33:29 crc kubenswrapper[4810]: I0219 15:33:29.563592 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7754685579-pj5bz" Feb 19 15:33:29 crc kubenswrapper[4810]: I0219 15:33:29.695198 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-config\") pod \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " Feb 19 15:33:29 crc kubenswrapper[4810]: I0219 15:33:29.695305 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-dns-svc\") pod \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " Feb 19 15:33:29 crc kubenswrapper[4810]: I0219 15:33:29.695382 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-ovsdbserver-nb\") pod \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " Feb 19 15:33:29 crc kubenswrapper[4810]: I0219 15:33:29.695448 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-openstack-edpm-ipam\") pod \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " Feb 19 15:33:29 crc kubenswrapper[4810]: I0219 15:33:29.695474 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-ovsdbserver-sb\") pod \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " Feb 19 15:33:29 crc kubenswrapper[4810]: I0219 15:33:29.695510 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgmt7\" (UniqueName: \"kubernetes.io/projected/0c3acd8a-696a-4a86-9052-03ef2cca79c7-kube-api-access-mgmt7\") pod \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " Feb 19 15:33:29 crc kubenswrapper[4810]: I0219 15:33:29.695587 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-dns-swift-storage-0\") pod \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\" (UID: \"0c3acd8a-696a-4a86-9052-03ef2cca79c7\") " Feb 19 15:33:29 crc kubenswrapper[4810]: I0219 15:33:29.707002 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0c3acd8a-696a-4a86-9052-03ef2cca79c7-kube-api-access-mgmt7" (OuterVolumeSpecName: "kube-api-access-mgmt7") pod "0c3acd8a-696a-4a86-9052-03ef2cca79c7" (UID: "0c3acd8a-696a-4a86-9052-03ef2cca79c7"). InnerVolumeSpecName "kube-api-access-mgmt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:33:29 crc kubenswrapper[4810]: I0219 15:33:29.751187 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0c3acd8a-696a-4a86-9052-03ef2cca79c7" (UID: "0c3acd8a-696a-4a86-9052-03ef2cca79c7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:33:29 crc kubenswrapper[4810]: I0219 15:33:29.757531 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0c3acd8a-696a-4a86-9052-03ef2cca79c7" (UID: "0c3acd8a-696a-4a86-9052-03ef2cca79c7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:33:29 crc kubenswrapper[4810]: I0219 15:33:29.761446 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "0c3acd8a-696a-4a86-9052-03ef2cca79c7" (UID: "0c3acd8a-696a-4a86-9052-03ef2cca79c7"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:33:29 crc kubenswrapper[4810]: I0219 15:33:29.762301 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-config" (OuterVolumeSpecName: "config") pod "0c3acd8a-696a-4a86-9052-03ef2cca79c7" (UID: "0c3acd8a-696a-4a86-9052-03ef2cca79c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:33:29 crc kubenswrapper[4810]: I0219 15:33:29.774199 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0c3acd8a-696a-4a86-9052-03ef2cca79c7" (UID: "0c3acd8a-696a-4a86-9052-03ef2cca79c7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:33:29 crc kubenswrapper[4810]: I0219 15:33:29.783802 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0c3acd8a-696a-4a86-9052-03ef2cca79c7" (UID: "0c3acd8a-696a-4a86-9052-03ef2cca79c7"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:33:29 crc kubenswrapper[4810]: I0219 15:33:29.797784 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:33:29 crc kubenswrapper[4810]: I0219 15:33:29.797827 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 15:33:29 crc kubenswrapper[4810]: I0219 15:33:29.797841 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 15:33:29 crc kubenswrapper[4810]: I0219 15:33:29.797855 4810 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 15:33:29 crc kubenswrapper[4810]: I0219 15:33:29.797866 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 15:33:29 crc kubenswrapper[4810]: I0219 15:33:29.797879 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgmt7\" (UniqueName: \"kubernetes.io/projected/0c3acd8a-696a-4a86-9052-03ef2cca79c7-kube-api-access-mgmt7\") on node \"crc\" DevicePath \"\"" Feb 19 15:33:29 crc kubenswrapper[4810]: I0219 15:33:29.797890 4810 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0c3acd8a-696a-4a86-9052-03ef2cca79c7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 15:33:30 crc kubenswrapper[4810]: I0219 15:33:30.295299 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7754685579-pj5bz" event={"ID":"0c3acd8a-696a-4a86-9052-03ef2cca79c7","Type":"ContainerDied","Data":"e6484f856f1dabe83fb7c154c7f3ffdc7af18bb3f81e4c632668b443bfbf7736"} Feb 19 15:33:30 crc kubenswrapper[4810]: I0219 15:33:30.295375 4810 scope.go:117] "RemoveContainer" containerID="7c6881386ea9e17edcb266f7d41c368100ad0a747bba3ae5f2c27278cd9f5864" Feb 19 15:33:30 crc kubenswrapper[4810]: I0219 15:33:30.295442 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7754685579-pj5bz" Feb 19 15:33:30 crc kubenswrapper[4810]: I0219 15:33:30.404380 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7754685579-pj5bz"] Feb 19 15:33:30 crc kubenswrapper[4810]: I0219 15:33:30.432611 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7754685579-pj5bz"] Feb 19 15:33:30 crc kubenswrapper[4810]: I0219 15:33:30.489527 4810 scope.go:117] "RemoveContainer" containerID="493902d9834ff016c9ae111e5d259b86d62b85d4780006bae12de68a9a933a23" Feb 19 15:33:31 crc kubenswrapper[4810]: I0219 15:33:31.464677 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c3acd8a-696a-4a86-9052-03ef2cca79c7" path="/var/lib/kubelet/pods/0c3acd8a-696a-4a86-9052-03ef2cca79c7/volumes" Feb 19 15:33:34 crc kubenswrapper[4810]: I0219 15:33:34.343758 4810 generic.go:334] "Generic (PLEG): container finished" podID="b86448c3-669a-4132-b8ab-4db06347fa10" containerID="01f710572014ea30c64f1b2a62a873ecb5702426460d2b063c7df96d0df38fa3" exitCode=0 Feb 19 15:33:34 crc kubenswrapper[4810]: I0219 15:33:34.343961 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b86448c3-669a-4132-b8ab-4db06347fa10","Type":"ContainerDied","Data":"01f710572014ea30c64f1b2a62a873ecb5702426460d2b063c7df96d0df38fa3"} Feb 19 15:33:35 crc kubenswrapper[4810]: I0219 15:33:35.378301 4810 generic.go:334] "Generic (PLEG): container finished" podID="03247cdb-4055-4d47-b433-848e363768ab" containerID="aeb8c5c6a71d86ecc6aba3fc315503cd296a83a4195aac56a6884ee1e7fef305" exitCode=0 Feb 19 15:33:35 crc kubenswrapper[4810]: I0219 15:33:35.378369 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"03247cdb-4055-4d47-b433-848e363768ab","Type":"ContainerDied","Data":"aeb8c5c6a71d86ecc6aba3fc315503cd296a83a4195aac56a6884ee1e7fef305"} Feb 19 15:33:35 crc kubenswrapper[4810]: I0219 15:33:35.405807 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b86448c3-669a-4132-b8ab-4db06347fa10","Type":"ContainerStarted","Data":"dd506e1ca426d2c85d2a66a3c84ad0452ba42d9045328da00c06a257b5a9e728"} Feb 19 15:33:35 crc kubenswrapper[4810]: I0219 15:33:35.406783 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 19 15:33:35 crc kubenswrapper[4810]: I0219 15:33:35.487365 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.487344181 podStartE2EDuration="36.487344181s" podCreationTimestamp="2026-02-19 15:32:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:33:35.477738138 +0000 UTC m=+1444.959768262" watchObservedRunningTime="2026-02-19 15:33:35.487344181 +0000 UTC m=+1444.969374305" Feb 19 15:33:36 crc kubenswrapper[4810]: I0219 15:33:36.417952 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"03247cdb-4055-4d47-b433-848e363768ab","Type":"ContainerStarted","Data":"8748ae10090a4ae5b6bfbf65ab86cae3b394f0c14ed3d986cfc40122c35f530a"} Feb 19 15:33:36 crc kubenswrapper[4810]: I0219 15:33:36.418634 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:36 crc kubenswrapper[4810]: I0219 15:33:36.447908 4810 
Feb 19 15:33:36 crc kubenswrapper[4810]: I0219 15:33:36.447908 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.447889262 podStartE2EDuration="37.447889262s" podCreationTimestamp="2026-02-19 15:32:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 15:33:36.443259265 +0000 UTC m=+1445.925289389" watchObservedRunningTime="2026-02-19 15:33:36.447889262 +0000 UTC m=+1445.929919386"
Feb 19 15:33:39 crc kubenswrapper[4810]: I0219 15:33:39.331642 4810 scope.go:117] "RemoveContainer" containerID="1d10f5e352636a23ee8d873911fea2dc7821da75356614e2daef6d4813ea231e"
Feb 19 15:33:39 crc kubenswrapper[4810]: I0219 15:33:39.362607 4810 scope.go:117] "RemoveContainer" containerID="bcd810514d656b151586b085915c58b159ffd3f83f716b74f5c945866a1aa802"
Feb 19 15:33:42 crc kubenswrapper[4810]: I0219 15:33:42.744249 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-484qb"]
Feb 19 15:33:42 crc kubenswrapper[4810]: E0219 15:33:42.745771 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c3acd8a-696a-4a86-9052-03ef2cca79c7" containerName="dnsmasq-dns"
Feb 19 15:33:42 crc kubenswrapper[4810]: I0219 15:33:42.745794 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c3acd8a-696a-4a86-9052-03ef2cca79c7" containerName="dnsmasq-dns"
Feb 19 15:33:42 crc kubenswrapper[4810]: E0219 15:33:42.745807 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c3acd8a-696a-4a86-9052-03ef2cca79c7" containerName="init"
Feb 19 15:33:42 crc kubenswrapper[4810]: I0219 15:33:42.745814 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c3acd8a-696a-4a86-9052-03ef2cca79c7" containerName="init"
Feb 19 15:33:42 crc kubenswrapper[4810]: E0219 15:33:42.745829 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="275a98c0-8e6a-4587-8628-54f70b836615" containerName="init"
Feb 19 15:33:42 crc kubenswrapper[4810]: I0219 15:33:42.745837 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="275a98c0-8e6a-4587-8628-54f70b836615" containerName="init"
Feb 19 15:33:42 crc kubenswrapper[4810]: E0219 15:33:42.745878 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="275a98c0-8e6a-4587-8628-54f70b836615" containerName="dnsmasq-dns"
Feb 19 15:33:42 crc kubenswrapper[4810]: I0219 15:33:42.745884 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="275a98c0-8e6a-4587-8628-54f70b836615" containerName="dnsmasq-dns"
Feb 19 15:33:42 crc kubenswrapper[4810]: I0219 15:33:42.746107 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="275a98c0-8e6a-4587-8628-54f70b836615" containerName="dnsmasq-dns"
Feb 19 15:33:42 crc kubenswrapper[4810]: I0219 15:33:42.746133 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c3acd8a-696a-4a86-9052-03ef2cca79c7" containerName="dnsmasq-dns"
Feb 19 15:33:42 crc kubenswrapper[4810]: I0219 15:33:42.747146 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-484qb" Feb 19 15:33:42 crc kubenswrapper[4810]: I0219 15:33:42.749925 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 15:33:42 crc kubenswrapper[4810]: I0219 15:33:42.750150 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-djc4q" Feb 19 15:33:42 crc kubenswrapper[4810]: I0219 15:33:42.753208 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 15:33:42 crc kubenswrapper[4810]: I0219 15:33:42.753295 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 15:33:42 crc kubenswrapper[4810]: I0219 15:33:42.771509 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-484qb"] Feb 19 15:33:42 crc kubenswrapper[4810]: I0219 15:33:42.797947 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c05e8c7-82f6-4ef1-a576-3c84e70dc570-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-484qb\" (UID: \"8c05e8c7-82f6-4ef1-a576-3c84e70dc570\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-484qb" Feb 19 15:33:42 crc kubenswrapper[4810]: I0219 15:33:42.798199 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c05e8c7-82f6-4ef1-a576-3c84e70dc570-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-484qb\" (UID: \"8c05e8c7-82f6-4ef1-a576-3c84e70dc570\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-484qb" Feb 19 15:33:42 crc kubenswrapper[4810]: I0219 15:33:42.798256 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb8t5\" (UniqueName: \"kubernetes.io/projected/8c05e8c7-82f6-4ef1-a576-3c84e70dc570-kube-api-access-zb8t5\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-484qb\" (UID: \"8c05e8c7-82f6-4ef1-a576-3c84e70dc570\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-484qb" Feb 19 15:33:42 crc kubenswrapper[4810]: I0219 15:33:42.798679 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c05e8c7-82f6-4ef1-a576-3c84e70dc570-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-484qb\" (UID: \"8c05e8c7-82f6-4ef1-a576-3c84e70dc570\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-484qb" Feb 19 15:33:42 crc kubenswrapper[4810]: I0219 15:33:42.900479 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c05e8c7-82f6-4ef1-a576-3c84e70dc570-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-484qb\" (UID: \"8c05e8c7-82f6-4ef1-a576-3c84e70dc570\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-484qb" Feb 19 15:33:42 crc kubenswrapper[4810]: I0219 15:33:42.900523 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb8t5\" (UniqueName: 
\"kubernetes.io/projected/8c05e8c7-82f6-4ef1-a576-3c84e70dc570-kube-api-access-zb8t5\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-484qb\" (UID: \"8c05e8c7-82f6-4ef1-a576-3c84e70dc570\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-484qb" Feb 19 15:33:42 crc kubenswrapper[4810]: I0219 15:33:42.900596 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c05e8c7-82f6-4ef1-a576-3c84e70dc570-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-484qb\" (UID: \"8c05e8c7-82f6-4ef1-a576-3c84e70dc570\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-484qb" Feb 19 15:33:42 crc kubenswrapper[4810]: I0219 15:33:42.900704 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c05e8c7-82f6-4ef1-a576-3c84e70dc570-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-484qb\" (UID: \"8c05e8c7-82f6-4ef1-a576-3c84e70dc570\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-484qb" Feb 19 15:33:42 crc kubenswrapper[4810]: I0219 15:33:42.906361 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c05e8c7-82f6-4ef1-a576-3c84e70dc570-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-484qb\" (UID: \"8c05e8c7-82f6-4ef1-a576-3c84e70dc570\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-484qb" Feb 19 15:33:42 crc kubenswrapper[4810]: I0219 15:33:42.906472 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c05e8c7-82f6-4ef1-a576-3c84e70dc570-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-484qb\" (UID: \"8c05e8c7-82f6-4ef1-a576-3c84e70dc570\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-484qb" Feb 19 15:33:42 crc kubenswrapper[4810]: I0219 15:33:42.907552 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c05e8c7-82f6-4ef1-a576-3c84e70dc570-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-484qb\" (UID: \"8c05e8c7-82f6-4ef1-a576-3c84e70dc570\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-484qb" Feb 19 15:33:42 crc kubenswrapper[4810]: I0219 15:33:42.917506 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb8t5\" (UniqueName: \"kubernetes.io/projected/8c05e8c7-82f6-4ef1-a576-3c84e70dc570-kube-api-access-zb8t5\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-484qb\" (UID: \"8c05e8c7-82f6-4ef1-a576-3c84e70dc570\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-484qb" Feb 19 15:33:43 crc kubenswrapper[4810]: I0219 15:33:43.077241 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-484qb" Feb 19 15:33:43 crc kubenswrapper[4810]: I0219 15:33:43.889413 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-484qb"] Feb 19 15:33:44 crc kubenswrapper[4810]: I0219 15:33:44.508504 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-484qb" event={"ID":"8c05e8c7-82f6-4ef1-a576-3c84e70dc570","Type":"ContainerStarted","Data":"086440bfa5f26ba1363eecaa5c50943356fef7aea25c05779628eb95ef3c57b2"} Feb 19 15:33:49 crc kubenswrapper[4810]: I0219 15:33:49.546601 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 19 15:33:50 crc kubenswrapper[4810]: I0219 15:33:50.291523 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 19 15:33:55 crc kubenswrapper[4810]: I0219 15:33:55.613320 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-484qb" event={"ID":"8c05e8c7-82f6-4ef1-a576-3c84e70dc570","Type":"ContainerStarted","Data":"fa5950c0010d836c770fe60039a24c6338efe9c7a11d0ea864b9fcf4b45f1ebc"} Feb 19 15:33:55 crc kubenswrapper[4810]: I0219 15:33:55.639583 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-484qb" podStartSLOduration=3.153378544 podStartE2EDuration="13.639568329s" podCreationTimestamp="2026-02-19 15:33:42 +0000 UTC" firstStartedPulling="2026-02-19 15:33:43.888166836 +0000 UTC m=+1453.370196960" lastFinishedPulling="2026-02-19 15:33:54.374356581 +0000 UTC m=+1463.856386745" observedRunningTime="2026-02-19 15:33:55.636813869 +0000 UTC m=+1465.118843993" watchObservedRunningTime="2026-02-19 15:33:55.639568329 +0000 UTC m=+1465.121598453" Feb 19 15:34:05 crc kubenswrapper[4810]: I0219 15:34:05.742026 4810 generic.go:334] "Generic (PLEG): container finished" podID="8c05e8c7-82f6-4ef1-a576-3c84e70dc570" containerID="fa5950c0010d836c770fe60039a24c6338efe9c7a11d0ea864b9fcf4b45f1ebc" exitCode=0 Feb 19 15:34:05 crc kubenswrapper[4810]: I0219 15:34:05.742085 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-484qb" event={"ID":"8c05e8c7-82f6-4ef1-a576-3c84e70dc570","Type":"ContainerDied","Data":"fa5950c0010d836c770fe60039a24c6338efe9c7a11d0ea864b9fcf4b45f1ebc"} Feb 19 15:34:07 crc kubenswrapper[4810]: I0219 15:34:07.260748 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-484qb" Feb 19 15:34:07 crc kubenswrapper[4810]: I0219 15:34:07.333005 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb8t5\" (UniqueName: \"kubernetes.io/projected/8c05e8c7-82f6-4ef1-a576-3c84e70dc570-kube-api-access-zb8t5\") pod \"8c05e8c7-82f6-4ef1-a576-3c84e70dc570\" (UID: \"8c05e8c7-82f6-4ef1-a576-3c84e70dc570\") " Feb 19 15:34:07 crc kubenswrapper[4810]: I0219 15:34:07.333484 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c05e8c7-82f6-4ef1-a576-3c84e70dc570-inventory\") pod \"8c05e8c7-82f6-4ef1-a576-3c84e70dc570\" (UID: \"8c05e8c7-82f6-4ef1-a576-3c84e70dc570\") " Feb 19 15:34:07 crc kubenswrapper[4810]: I0219 15:34:07.333706 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c05e8c7-82f6-4ef1-a576-3c84e70dc570-ssh-key-openstack-edpm-ipam\") pod \"8c05e8c7-82f6-4ef1-a576-3c84e70dc570\" (UID: \"8c05e8c7-82f6-4ef1-a576-3c84e70dc570\") " Feb 19 15:34:07 crc kubenswrapper[4810]: I0219 15:34:07.333955 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c05e8c7-82f6-4ef1-a576-3c84e70dc570-repo-setup-combined-ca-bundle\") pod \"8c05e8c7-82f6-4ef1-a576-3c84e70dc570\" (UID: \"8c05e8c7-82f6-4ef1-a576-3c84e70dc570\") " Feb 19 15:34:07 crc kubenswrapper[4810]: I0219 15:34:07.339054 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c05e8c7-82f6-4ef1-a576-3c84e70dc570-kube-api-access-zb8t5" (OuterVolumeSpecName: "kube-api-access-zb8t5") pod "8c05e8c7-82f6-4ef1-a576-3c84e70dc570" (UID: "8c05e8c7-82f6-4ef1-a576-3c84e70dc570"). InnerVolumeSpecName "kube-api-access-zb8t5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:34:07 crc kubenswrapper[4810]: I0219 15:34:07.340586 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c05e8c7-82f6-4ef1-a576-3c84e70dc570-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "8c05e8c7-82f6-4ef1-a576-3c84e70dc570" (UID: "8c05e8c7-82f6-4ef1-a576-3c84e70dc570"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:34:07 crc kubenswrapper[4810]: I0219 15:34:07.361461 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c05e8c7-82f6-4ef1-a576-3c84e70dc570-inventory" (OuterVolumeSpecName: "inventory") pod "8c05e8c7-82f6-4ef1-a576-3c84e70dc570" (UID: "8c05e8c7-82f6-4ef1-a576-3c84e70dc570"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:34:07 crc kubenswrapper[4810]: I0219 15:34:07.365460 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c05e8c7-82f6-4ef1-a576-3c84e70dc570-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8c05e8c7-82f6-4ef1-a576-3c84e70dc570" (UID: "8c05e8c7-82f6-4ef1-a576-3c84e70dc570"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:34:07 crc kubenswrapper[4810]: I0219 15:34:07.436608 4810 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c05e8c7-82f6-4ef1-a576-3c84e70dc570-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 15:34:07 crc kubenswrapper[4810]: I0219 15:34:07.436642 4810 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c05e8c7-82f6-4ef1-a576-3c84e70dc570-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:34:07 crc kubenswrapper[4810]: I0219 15:34:07.436653 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb8t5\" (UniqueName: \"kubernetes.io/projected/8c05e8c7-82f6-4ef1-a576-3c84e70dc570-kube-api-access-zb8t5\") on node \"crc\" DevicePath \"\"" Feb 19 15:34:07 crc kubenswrapper[4810]: I0219 15:34:07.436667 4810 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c05e8c7-82f6-4ef1-a576-3c84e70dc570-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 15:34:07 crc kubenswrapper[4810]: I0219 15:34:07.760544 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-484qb" event={"ID":"8c05e8c7-82f6-4ef1-a576-3c84e70dc570","Type":"ContainerDied","Data":"086440bfa5f26ba1363eecaa5c50943356fef7aea25c05779628eb95ef3c57b2"} Feb 19 15:34:07 crc kubenswrapper[4810]: I0219 15:34:07.760886 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="086440bfa5f26ba1363eecaa5c50943356fef7aea25c05779628eb95ef3c57b2" Feb 19 15:34:07 crc kubenswrapper[4810]: I0219 15:34:07.760575 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-484qb" Feb 19 15:34:07 crc kubenswrapper[4810]: I0219 15:34:07.883754 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-x748x"] Feb 19 15:34:07 crc kubenswrapper[4810]: E0219 15:34:07.884610 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c05e8c7-82f6-4ef1-a576-3c84e70dc570" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 19 15:34:07 crc kubenswrapper[4810]: I0219 15:34:07.884648 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c05e8c7-82f6-4ef1-a576-3c84e70dc570" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 19 15:34:07 crc kubenswrapper[4810]: I0219 15:34:07.885168 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c05e8c7-82f6-4ef1-a576-3c84e70dc570" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 19 15:34:07 crc kubenswrapper[4810]: I0219 15:34:07.886671 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x748x" Feb 19 15:34:07 crc kubenswrapper[4810]: I0219 15:34:07.896686 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-x748x"] Feb 19 15:34:07 crc kubenswrapper[4810]: I0219 15:34:07.934613 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 15:34:07 crc kubenswrapper[4810]: I0219 15:34:07.935151 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 15:34:07 crc kubenswrapper[4810]: I0219 15:34:07.935679 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 15:34:07 crc kubenswrapper[4810]: I0219 15:34:07.936019 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-djc4q" Feb 19 15:34:08 crc kubenswrapper[4810]: I0219 15:34:08.058708 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/32dc9563-791b-421e-a807-41cc1e775b3a-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-x748x\" (UID: \"32dc9563-791b-421e-a807-41cc1e775b3a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x748x" Feb 19 15:34:08 crc kubenswrapper[4810]: I0219 15:34:08.058789 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32dc9563-791b-421e-a807-41cc1e775b3a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-x748x\" (UID: \"32dc9563-791b-421e-a807-41cc1e775b3a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x748x" Feb 19 15:34:08 crc kubenswrapper[4810]: I0219 15:34:08.059046 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fskpf\" (UniqueName: \"kubernetes.io/projected/32dc9563-791b-421e-a807-41cc1e775b3a-kube-api-access-fskpf\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-x748x\" (UID: \"32dc9563-791b-421e-a807-41cc1e775b3a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x748x" Feb 19 15:34:08 crc kubenswrapper[4810]: I0219 15:34:08.161615 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32dc9563-791b-421e-a807-41cc1e775b3a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-x748x\" (UID: \"32dc9563-791b-421e-a807-41cc1e775b3a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x748x" Feb 19 15:34:08 crc kubenswrapper[4810]: I0219 15:34:08.161765 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fskpf\" (UniqueName: \"kubernetes.io/projected/32dc9563-791b-421e-a807-41cc1e775b3a-kube-api-access-fskpf\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-x748x\" (UID: \"32dc9563-791b-421e-a807-41cc1e775b3a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x748x" Feb 19 15:34:08 crc kubenswrapper[4810]: I0219 15:34:08.162006 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/32dc9563-791b-421e-a807-41cc1e775b3a-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-x748x\" (UID: \"32dc9563-791b-421e-a807-41cc1e775b3a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x748x" Feb 19 15:34:08 crc kubenswrapper[4810]: I0219 15:34:08.170361 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/32dc9563-791b-421e-a807-41cc1e775b3a-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-x748x\" (UID: \"32dc9563-791b-421e-a807-41cc1e775b3a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x748x" Feb 19 15:34:08 crc kubenswrapper[4810]: I0219 15:34:08.170527 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32dc9563-791b-421e-a807-41cc1e775b3a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-x748x\" (UID: \"32dc9563-791b-421e-a807-41cc1e775b3a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x748x" Feb 19 15:34:08 crc kubenswrapper[4810]: I0219 15:34:08.192145 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fskpf\" (UniqueName: \"kubernetes.io/projected/32dc9563-791b-421e-a807-41cc1e775b3a-kube-api-access-fskpf\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-x748x\" (UID: \"32dc9563-791b-421e-a807-41cc1e775b3a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x748x" Feb 19 15:34:08 crc kubenswrapper[4810]: I0219 15:34:08.266137 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x748x" Feb 19 15:34:08 crc kubenswrapper[4810]: I0219 15:34:08.832891 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-x748x"] Feb 19 15:34:09 crc kubenswrapper[4810]: I0219 15:34:09.782441 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x748x" event={"ID":"32dc9563-791b-421e-a807-41cc1e775b3a","Type":"ContainerStarted","Data":"0218f16e05c77fd4aff0c773c6f6f5f459d5e7eb7df2e492a757da51fe6de6ea"} Feb 19 15:34:09 crc kubenswrapper[4810]: I0219 15:34:09.782750 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x748x" event={"ID":"32dc9563-791b-421e-a807-41cc1e775b3a","Type":"ContainerStarted","Data":"269910cad55a32d5b2042c87d986c67c8c7621ea6b376e19dc5e49dd23fda6f7"} Feb 19 15:34:09 crc kubenswrapper[4810]: I0219 15:34:09.818908 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x748x" podStartSLOduration=2.369954075 podStartE2EDuration="2.818886909s" podCreationTimestamp="2026-02-19 15:34:07 +0000 UTC" firstStartedPulling="2026-02-19 15:34:08.832372043 +0000 UTC m=+1478.314402167" lastFinishedPulling="2026-02-19 15:34:09.281304867 +0000 UTC m=+1478.763335001" observedRunningTime="2026-02-19 15:34:09.80626955 +0000 UTC m=+1479.288299674" watchObservedRunningTime="2026-02-19 15:34:09.818886909 +0000 UTC m=+1479.300917043" Feb 19 15:34:12 crc kubenswrapper[4810]: I0219 15:34:12.818731 4810 generic.go:334] "Generic (PLEG): container finished" podID="32dc9563-791b-421e-a807-41cc1e775b3a" containerID="0218f16e05c77fd4aff0c773c6f6f5f459d5e7eb7df2e492a757da51fe6de6ea" exitCode=0 Feb 19 15:34:12 crc kubenswrapper[4810]: I0219 15:34:12.818795 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x748x" event={"ID":"32dc9563-791b-421e-a807-41cc1e775b3a","Type":"ContainerDied","Data":"0218f16e05c77fd4aff0c773c6f6f5f459d5e7eb7df2e492a757da51fe6de6ea"} Feb 19 15:34:14 crc kubenswrapper[4810]: I0219 15:34:14.263405 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x748x" Feb 19 15:34:14 crc kubenswrapper[4810]: I0219 15:34:14.409268 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32dc9563-791b-421e-a807-41cc1e775b3a-inventory\") pod \"32dc9563-791b-421e-a807-41cc1e775b3a\" (UID: \"32dc9563-791b-421e-a807-41cc1e775b3a\") " Feb 19 15:34:14 crc kubenswrapper[4810]: I0219 15:34:14.409498 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/32dc9563-791b-421e-a807-41cc1e775b3a-ssh-key-openstack-edpm-ipam\") pod \"32dc9563-791b-421e-a807-41cc1e775b3a\" (UID: \"32dc9563-791b-421e-a807-41cc1e775b3a\") " Feb 19 15:34:14 crc kubenswrapper[4810]: I0219 15:34:14.409661 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fskpf\" (UniqueName: \"kubernetes.io/projected/32dc9563-791b-421e-a807-41cc1e775b3a-kube-api-access-fskpf\") pod \"32dc9563-791b-421e-a807-41cc1e775b3a\" (UID: \"32dc9563-791b-421e-a807-41cc1e775b3a\") " Feb 19 15:34:14 crc kubenswrapper[4810]: I0219 15:34:14.417059 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32dc9563-791b-421e-a807-41cc1e775b3a-kube-api-access-fskpf" (OuterVolumeSpecName: "kube-api-access-fskpf") pod "32dc9563-791b-421e-a807-41cc1e775b3a" (UID: "32dc9563-791b-421e-a807-41cc1e775b3a"). InnerVolumeSpecName "kube-api-access-fskpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:34:14 crc kubenswrapper[4810]: I0219 15:34:14.445256 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32dc9563-791b-421e-a807-41cc1e775b3a-inventory" (OuterVolumeSpecName: "inventory") pod "32dc9563-791b-421e-a807-41cc1e775b3a" (UID: "32dc9563-791b-421e-a807-41cc1e775b3a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:34:14 crc kubenswrapper[4810]: I0219 15:34:14.451619 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32dc9563-791b-421e-a807-41cc1e775b3a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "32dc9563-791b-421e-a807-41cc1e775b3a" (UID: "32dc9563-791b-421e-a807-41cc1e775b3a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:34:14 crc kubenswrapper[4810]: I0219 15:34:14.512882 4810 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32dc9563-791b-421e-a807-41cc1e775b3a-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 15:34:14 crc kubenswrapper[4810]: I0219 15:34:14.516129 4810 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/32dc9563-791b-421e-a807-41cc1e775b3a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 15:34:14 crc kubenswrapper[4810]: I0219 15:34:14.516618 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fskpf\" (UniqueName: \"kubernetes.io/projected/32dc9563-791b-421e-a807-41cc1e775b3a-kube-api-access-fskpf\") on node \"crc\" DevicePath \"\"" Feb 19 15:34:14 crc kubenswrapper[4810]: I0219 15:34:14.848300 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x748x" event={"ID":"32dc9563-791b-421e-a807-41cc1e775b3a","Type":"ContainerDied","Data":"269910cad55a32d5b2042c87d986c67c8c7621ea6b376e19dc5e49dd23fda6f7"} Feb 19 15:34:14 crc kubenswrapper[4810]: I0219 15:34:14.848358 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="269910cad55a32d5b2042c87d986c67c8c7621ea6b376e19dc5e49dd23fda6f7" Feb 19 15:34:14 crc kubenswrapper[4810]: I0219 15:34:14.848398 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x748x" Feb 19 15:34:15 crc kubenswrapper[4810]: I0219 15:34:15.050252 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx"] Feb 19 15:34:15 crc kubenswrapper[4810]: E0219 15:34:15.050761 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32dc9563-791b-421e-a807-41cc1e775b3a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 19 15:34:15 crc kubenswrapper[4810]: I0219 15:34:15.050790 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="32dc9563-791b-421e-a807-41cc1e775b3a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 19 15:34:15 crc kubenswrapper[4810]: I0219 15:34:15.051055 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="32dc9563-791b-421e-a807-41cc1e775b3a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 19 15:34:15 crc kubenswrapper[4810]: I0219 15:34:15.051806 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx" Feb 19 15:34:15 crc kubenswrapper[4810]: I0219 15:34:15.061752 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 15:34:15 crc kubenswrapper[4810]: I0219 15:34:15.061918 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-djc4q" Feb 19 15:34:15 crc kubenswrapper[4810]: I0219 15:34:15.062028 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 15:34:15 crc kubenswrapper[4810]: I0219 15:34:15.062192 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 15:34:15 crc kubenswrapper[4810]: I0219 15:34:15.078029 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx"] Feb 19 15:34:15 crc kubenswrapper[4810]: I0219 15:34:15.128695 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4a9ca21-e1c7-490d-8078-14407b530301-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx\" (UID: \"c4a9ca21-e1c7-490d-8078-14407b530301\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx" Feb 19 15:34:15 crc kubenswrapper[4810]: I0219 15:34:15.128787 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4a9ca21-e1c7-490d-8078-14407b530301-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx\" (UID: \"c4a9ca21-e1c7-490d-8078-14407b530301\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx" Feb 19 15:34:15 crc kubenswrapper[4810]: I0219 15:34:15.128841 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwflq\" (UniqueName: \"kubernetes.io/projected/c4a9ca21-e1c7-490d-8078-14407b530301-kube-api-access-rwflq\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx\" (UID: \"c4a9ca21-e1c7-490d-8078-14407b530301\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx" Feb 19 15:34:15 crc kubenswrapper[4810]: I0219 15:34:15.128877 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c4a9ca21-e1c7-490d-8078-14407b530301-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx\" (UID: \"c4a9ca21-e1c7-490d-8078-14407b530301\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx" Feb 19 15:34:15 crc kubenswrapper[4810]: I0219 15:34:15.230617 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4a9ca21-e1c7-490d-8078-14407b530301-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx\" (UID: \"c4a9ca21-e1c7-490d-8078-14407b530301\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx" Feb 19 15:34:15 crc kubenswrapper[4810]: I0219 15:34:15.230699 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwflq\" (UniqueName: 
\"kubernetes.io/projected/c4a9ca21-e1c7-490d-8078-14407b530301-kube-api-access-rwflq\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx\" (UID: \"c4a9ca21-e1c7-490d-8078-14407b530301\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx" Feb 19 15:34:15 crc kubenswrapper[4810]: I0219 15:34:15.230738 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c4a9ca21-e1c7-490d-8078-14407b530301-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx\" (UID: \"c4a9ca21-e1c7-490d-8078-14407b530301\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx" Feb 19 15:34:15 crc kubenswrapper[4810]: I0219 15:34:15.230895 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4a9ca21-e1c7-490d-8078-14407b530301-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx\" (UID: \"c4a9ca21-e1c7-490d-8078-14407b530301\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx" Feb 19 15:34:15 crc kubenswrapper[4810]: I0219 15:34:15.235808 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c4a9ca21-e1c7-490d-8078-14407b530301-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx\" (UID: \"c4a9ca21-e1c7-490d-8078-14407b530301\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx" Feb 19 15:34:15 crc kubenswrapper[4810]: I0219 15:34:15.235885 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4a9ca21-e1c7-490d-8078-14407b530301-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx\" (UID: \"c4a9ca21-e1c7-490d-8078-14407b530301\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx" Feb 19 15:34:15 crc kubenswrapper[4810]: I0219 15:34:15.237010 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4a9ca21-e1c7-490d-8078-14407b530301-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx\" (UID: \"c4a9ca21-e1c7-490d-8078-14407b530301\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx" Feb 19 15:34:15 crc kubenswrapper[4810]: I0219 15:34:15.252756 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwflq\" (UniqueName: \"kubernetes.io/projected/c4a9ca21-e1c7-490d-8078-14407b530301-kube-api-access-rwflq\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx\" (UID: \"c4a9ca21-e1c7-490d-8078-14407b530301\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx" Feb 19 15:34:15 crc kubenswrapper[4810]: I0219 15:34:15.381780 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx" Feb 19 15:34:15 crc kubenswrapper[4810]: I0219 15:34:15.950016 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx"] Feb 19 15:34:16 crc kubenswrapper[4810]: I0219 15:34:16.870587 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx" event={"ID":"c4a9ca21-e1c7-490d-8078-14407b530301","Type":"ContainerStarted","Data":"aea7e9dbe67f144df792c12f2c4b232cda1fd424cee70ce7f1f3b4844bd41e5c"} Feb 19 15:34:16 crc kubenswrapper[4810]: I0219 15:34:16.870864 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx" event={"ID":"c4a9ca21-e1c7-490d-8078-14407b530301","Type":"ContainerStarted","Data":"aa1bc9f36577e42d99564c9f4bae7a23f2588d18bc7dc16da1d066acf3ad1da1"} Feb 19 15:34:16 crc kubenswrapper[4810]: I0219 15:34:16.903571 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx" podStartSLOduration=1.418523432 podStartE2EDuration="1.903544378s" podCreationTimestamp="2026-02-19 15:34:15 +0000 UTC" firstStartedPulling="2026-02-19 15:34:15.948595809 +0000 UTC m=+1485.430625933" lastFinishedPulling="2026-02-19 15:34:16.433616745 +0000 UTC m=+1485.915646879" observedRunningTime="2026-02-19 15:34:16.894015198 +0000 UTC m=+1486.376045322" watchObservedRunningTime="2026-02-19 15:34:16.903544378 +0000 UTC m=+1486.385574522" Feb 19 15:34:39 crc kubenswrapper[4810]: I0219 15:34:39.561727 4810 scope.go:117] "RemoveContainer" containerID="c96da79c9ab27a3dc86e77ea8607bc39b965b2f04ca64ded9b1c4a74386d352e" Feb 19 15:34:39 crc kubenswrapper[4810]: I0219 15:34:39.603881 4810 scope.go:117] "RemoveContainer" containerID="99c27801bb39f1082a20de11443ab5b4c03a227dc67e8dce6456d77eb0a7c2db" Feb 19 15:34:39 crc kubenswrapper[4810]: I0219 15:34:39.689463 4810 scope.go:117] "RemoveContainer" containerID="930ffde39b4c3d7913e11cb429594ee8cd480971fa345e4d0d06f707706f3472" Feb 19 15:34:49 crc kubenswrapper[4810]: I0219 15:34:49.537816 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:34:49 crc kubenswrapper[4810]: I0219 15:34:49.538571 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:35:19 crc kubenswrapper[4810]: I0219 15:35:19.537545 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:35:19 crc kubenswrapper[4810]: I0219 15:35:19.538151 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:35:33 crc kubenswrapper[4810]: I0219 15:35:33.766123 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v5gm5"] Feb 19 15:35:33 crc kubenswrapper[4810]: I0219 15:35:33.769980 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v5gm5"] Feb 19 15:35:33 crc kubenswrapper[4810]: I0219 15:35:33.770122 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v5gm5" Feb 19 15:35:33 crc kubenswrapper[4810]: I0219 15:35:33.850242 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9786edb-c71c-4700-824b-2e348c15b77f-utilities\") pod \"community-operators-v5gm5\" (UID: \"c9786edb-c71c-4700-824b-2e348c15b77f\") " pod="openshift-marketplace/community-operators-v5gm5" Feb 19 15:35:33 crc kubenswrapper[4810]: I0219 15:35:33.850309 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4wvh\" (UniqueName: \"kubernetes.io/projected/c9786edb-c71c-4700-824b-2e348c15b77f-kube-api-access-q4wvh\") pod \"community-operators-v5gm5\" (UID: \"c9786edb-c71c-4700-824b-2e348c15b77f\") " pod="openshift-marketplace/community-operators-v5gm5" Feb 19 15:35:33 crc kubenswrapper[4810]: I0219 15:35:33.850433 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9786edb-c71c-4700-824b-2e348c15b77f-catalog-content\") pod \"community-operators-v5gm5\" (UID: \"c9786edb-c71c-4700-824b-2e348c15b77f\") " pod="openshift-marketplace/community-operators-v5gm5" Feb 19 15:35:33 crc kubenswrapper[4810]: I0219 15:35:33.952520 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9786edb-c71c-4700-824b-2e348c15b77f-utilities\") pod \"community-operators-v5gm5\" (UID: \"c9786edb-c71c-4700-824b-2e348c15b77f\") " pod="openshift-marketplace/community-operators-v5gm5" Feb 19 15:35:33 crc kubenswrapper[4810]: I0219 15:35:33.952591 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4wvh\" (UniqueName: \"kubernetes.io/projected/c9786edb-c71c-4700-824b-2e348c15b77f-kube-api-access-q4wvh\") pod \"community-operators-v5gm5\" (UID: \"c9786edb-c71c-4700-824b-2e348c15b77f\") " pod="openshift-marketplace/community-operators-v5gm5" Feb 19 15:35:33 crc kubenswrapper[4810]: I0219 15:35:33.952620 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9786edb-c71c-4700-824b-2e348c15b77f-catalog-content\") pod \"community-operators-v5gm5\" (UID: \"c9786edb-c71c-4700-824b-2e348c15b77f\") " pod="openshift-marketplace/community-operators-v5gm5" Feb 19 15:35:33 crc kubenswrapper[4810]: I0219 15:35:33.953182 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9786edb-c71c-4700-824b-2e348c15b77f-utilities\") pod \"community-operators-v5gm5\" (UID: \"c9786edb-c71c-4700-824b-2e348c15b77f\") " pod="openshift-marketplace/community-operators-v5gm5" Feb 19 15:35:33 crc kubenswrapper[4810]: I0219 15:35:33.953212 4810 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9786edb-c71c-4700-824b-2e348c15b77f-catalog-content\") pod \"community-operators-v5gm5\" (UID: \"c9786edb-c71c-4700-824b-2e348c15b77f\") " pod="openshift-marketplace/community-operators-v5gm5" Feb 19 15:35:33 crc kubenswrapper[4810]: I0219 15:35:33.974542 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4wvh\" (UniqueName: \"kubernetes.io/projected/c9786edb-c71c-4700-824b-2e348c15b77f-kube-api-access-q4wvh\") pod \"community-operators-v5gm5\" (UID: \"c9786edb-c71c-4700-824b-2e348c15b77f\") " pod="openshift-marketplace/community-operators-v5gm5" Feb 19 15:35:34 crc kubenswrapper[4810]: I0219 15:35:34.114817 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v5gm5" Feb 19 15:35:34 crc kubenswrapper[4810]: I0219 15:35:34.639575 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v5gm5"] Feb 19 15:35:34 crc kubenswrapper[4810]: W0219 15:35:34.642695 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9786edb_c71c_4700_824b_2e348c15b77f.slice/crio-5e16d58b749a7926e83d0cd1c29f80582242aa5e44d6f95001df47fc5609615f WatchSource:0}: Error finding container 5e16d58b749a7926e83d0cd1c29f80582242aa5e44d6f95001df47fc5609615f: Status 404 returned error can't find the container with id 5e16d58b749a7926e83d0cd1c29f80582242aa5e44d6f95001df47fc5609615f Feb 19 15:35:34 crc kubenswrapper[4810]: I0219 15:35:34.729921 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5gm5" event={"ID":"c9786edb-c71c-4700-824b-2e348c15b77f","Type":"ContainerStarted","Data":"5e16d58b749a7926e83d0cd1c29f80582242aa5e44d6f95001df47fc5609615f"} Feb 19 15:35:35 crc kubenswrapper[4810]: I0219 15:35:35.739085 4810 generic.go:334] "Generic (PLEG): container finished" podID="c9786edb-c71c-4700-824b-2e348c15b77f" containerID="f42d7f9a1230992f8bd9c5e66cc486a27519984e9110e6a5a352dd034567457d" exitCode=0 Feb 19 15:35:35 crc kubenswrapper[4810]: I0219 15:35:35.739150 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5gm5" event={"ID":"c9786edb-c71c-4700-824b-2e348c15b77f","Type":"ContainerDied","Data":"f42d7f9a1230992f8bd9c5e66cc486a27519984e9110e6a5a352dd034567457d"} Feb 19 15:35:36 crc kubenswrapper[4810]: I0219 15:35:36.750712 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5gm5" event={"ID":"c9786edb-c71c-4700-824b-2e348c15b77f","Type":"ContainerStarted","Data":"8cb0bdc04704794d8377332b63549fd9711385b8e8633586279d263c62b2b764"} Feb 19 15:35:37 crc kubenswrapper[4810]: I0219 15:35:37.761067 4810 generic.go:334] "Generic (PLEG): container finished" podID="c9786edb-c71c-4700-824b-2e348c15b77f" containerID="8cb0bdc04704794d8377332b63549fd9711385b8e8633586279d263c62b2b764" exitCode=0 Feb 19 15:35:37 crc kubenswrapper[4810]: I0219 15:35:37.761163 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5gm5" event={"ID":"c9786edb-c71c-4700-824b-2e348c15b77f","Type":"ContainerDied","Data":"8cb0bdc04704794d8377332b63549fd9711385b8e8633586279d263c62b2b764"} Feb 19 15:35:38 crc kubenswrapper[4810]: I0219 15:35:38.776527 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5gm5" 
event={"ID":"c9786edb-c71c-4700-824b-2e348c15b77f","Type":"ContainerStarted","Data":"2336427e8f03e8aaf227aa7c6fe8af824454db5666683d65d337e84a41da9dd8"} Feb 19 15:35:38 crc kubenswrapper[4810]: I0219 15:35:38.813982 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v5gm5" podStartSLOduration=3.371785586 podStartE2EDuration="5.813948995s" podCreationTimestamp="2026-02-19 15:35:33 +0000 UTC" firstStartedPulling="2026-02-19 15:35:35.740893595 +0000 UTC m=+1565.222923709" lastFinishedPulling="2026-02-19 15:35:38.183056994 +0000 UTC m=+1567.665087118" observedRunningTime="2026-02-19 15:35:38.794795807 +0000 UTC m=+1568.276825981" watchObservedRunningTime="2026-02-19 15:35:38.813948995 +0000 UTC m=+1568.295979159" Feb 19 15:35:39 crc kubenswrapper[4810]: I0219 15:35:39.830228 4810 scope.go:117] "RemoveContainer" containerID="5e6853b5e9878d2a03e52d891d8a223a4096ce551b9935e29c1d5c5f37ac41cb" Feb 19 15:35:39 crc kubenswrapper[4810]: I0219 15:35:39.867889 4810 scope.go:117] "RemoveContainer" containerID="c6f5d9b5a6b15c45dc5d760616281a4656e6d239a89c530245a55061c13bc709" Feb 19 15:35:39 crc kubenswrapper[4810]: I0219 15:35:39.891126 4810 scope.go:117] "RemoveContainer" containerID="e36236dacc44e9719f0a5616b325da89fd715c826d97ed6b2c660301840187d2" Feb 19 15:35:39 crc kubenswrapper[4810]: I0219 15:35:39.937089 4810 scope.go:117] "RemoveContainer" containerID="d88c68d698d3972f0825f45fc0f2b6d882a24f69749acf74ad5f3d90f016e7f9" Feb 19 15:35:44 crc kubenswrapper[4810]: I0219 15:35:44.115950 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v5gm5" Feb 19 15:35:44 crc kubenswrapper[4810]: I0219 15:35:44.116511 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v5gm5" Feb 19 15:35:44 crc kubenswrapper[4810]: I0219 15:35:44.186254 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v5gm5" Feb 19 15:35:44 crc kubenswrapper[4810]: I0219 15:35:44.944181 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v5gm5" Feb 19 15:35:45 crc kubenswrapper[4810]: I0219 15:35:45.053352 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v5gm5"] Feb 19 15:35:46 crc kubenswrapper[4810]: I0219 15:35:46.864982 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v5gm5" podUID="c9786edb-c71c-4700-824b-2e348c15b77f" containerName="registry-server" containerID="cri-o://2336427e8f03e8aaf227aa7c6fe8af824454db5666683d65d337e84a41da9dd8" gracePeriod=2 Feb 19 15:35:47 crc kubenswrapper[4810]: I0219 15:35:47.376044 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v5gm5" Feb 19 15:35:47 crc kubenswrapper[4810]: I0219 15:35:47.432477 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4wvh\" (UniqueName: \"kubernetes.io/projected/c9786edb-c71c-4700-824b-2e348c15b77f-kube-api-access-q4wvh\") pod \"c9786edb-c71c-4700-824b-2e348c15b77f\" (UID: \"c9786edb-c71c-4700-824b-2e348c15b77f\") " Feb 19 15:35:47 crc kubenswrapper[4810]: I0219 15:35:47.432724 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9786edb-c71c-4700-824b-2e348c15b77f-utilities\") pod \"c9786edb-c71c-4700-824b-2e348c15b77f\" (UID: \"c9786edb-c71c-4700-824b-2e348c15b77f\") " Feb 19 15:35:47 crc kubenswrapper[4810]: I0219 15:35:47.432915 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9786edb-c71c-4700-824b-2e348c15b77f-catalog-content\") pod \"c9786edb-c71c-4700-824b-2e348c15b77f\" (UID: \"c9786edb-c71c-4700-824b-2e348c15b77f\") " Feb 19 15:35:47 crc kubenswrapper[4810]: I0219 15:35:47.433528 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9786edb-c71c-4700-824b-2e348c15b77f-utilities" (OuterVolumeSpecName: "utilities") pod "c9786edb-c71c-4700-824b-2e348c15b77f" (UID: "c9786edb-c71c-4700-824b-2e348c15b77f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:35:47 crc kubenswrapper[4810]: I0219 15:35:47.455519 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9786edb-c71c-4700-824b-2e348c15b77f-kube-api-access-q4wvh" (OuterVolumeSpecName: "kube-api-access-q4wvh") pod "c9786edb-c71c-4700-824b-2e348c15b77f" (UID: "c9786edb-c71c-4700-824b-2e348c15b77f"). InnerVolumeSpecName "kube-api-access-q4wvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:35:47 crc kubenswrapper[4810]: I0219 15:35:47.481553 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9786edb-c71c-4700-824b-2e348c15b77f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9786edb-c71c-4700-824b-2e348c15b77f" (UID: "c9786edb-c71c-4700-824b-2e348c15b77f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:35:47 crc kubenswrapper[4810]: I0219 15:35:47.535490 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9786edb-c71c-4700-824b-2e348c15b77f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 15:35:47 crc kubenswrapper[4810]: I0219 15:35:47.535520 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4wvh\" (UniqueName: \"kubernetes.io/projected/c9786edb-c71c-4700-824b-2e348c15b77f-kube-api-access-q4wvh\") on node \"crc\" DevicePath \"\"" Feb 19 15:35:47 crc kubenswrapper[4810]: I0219 15:35:47.535533 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9786edb-c71c-4700-824b-2e348c15b77f-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 15:35:47 crc kubenswrapper[4810]: I0219 15:35:47.876889 4810 generic.go:334] "Generic (PLEG): container finished" podID="c9786edb-c71c-4700-824b-2e348c15b77f" containerID="2336427e8f03e8aaf227aa7c6fe8af824454db5666683d65d337e84a41da9dd8" exitCode=0 Feb 19 15:35:47 crc kubenswrapper[4810]: I0219 15:35:47.876917 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5gm5" event={"ID":"c9786edb-c71c-4700-824b-2e348c15b77f","Type":"ContainerDied","Data":"2336427e8f03e8aaf227aa7c6fe8af824454db5666683d65d337e84a41da9dd8"} Feb 19 15:35:47 crc kubenswrapper[4810]: I0219 15:35:47.877179 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5gm5" event={"ID":"c9786edb-c71c-4700-824b-2e348c15b77f","Type":"ContainerDied","Data":"5e16d58b749a7926e83d0cd1c29f80582242aa5e44d6f95001df47fc5609615f"} Feb 19 15:35:47 crc kubenswrapper[4810]: I0219 15:35:47.877197 4810 scope.go:117] "RemoveContainer" containerID="2336427e8f03e8aaf227aa7c6fe8af824454db5666683d65d337e84a41da9dd8" Feb 19 15:35:47 crc kubenswrapper[4810]: I0219 15:35:47.877031 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v5gm5" Feb 19 15:35:47 crc kubenswrapper[4810]: I0219 15:35:47.918310 4810 scope.go:117] "RemoveContainer" containerID="8cb0bdc04704794d8377332b63549fd9711385b8e8633586279d263c62b2b764" Feb 19 15:35:47 crc kubenswrapper[4810]: I0219 15:35:47.936366 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v5gm5"] Feb 19 15:35:47 crc kubenswrapper[4810]: I0219 15:35:47.951570 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v5gm5"] Feb 19 15:35:47 crc kubenswrapper[4810]: I0219 15:35:47.955314 4810 scope.go:117] "RemoveContainer" containerID="f42d7f9a1230992f8bd9c5e66cc486a27519984e9110e6a5a352dd034567457d" Feb 19 15:35:48 crc kubenswrapper[4810]: I0219 15:35:48.003590 4810 scope.go:117] "RemoveContainer" containerID="2336427e8f03e8aaf227aa7c6fe8af824454db5666683d65d337e84a41da9dd8" Feb 19 15:35:48 crc kubenswrapper[4810]: E0219 15:35:48.004316 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2336427e8f03e8aaf227aa7c6fe8af824454db5666683d65d337e84a41da9dd8\": container with ID starting with 2336427e8f03e8aaf227aa7c6fe8af824454db5666683d65d337e84a41da9dd8 not found: ID does not exist" containerID="2336427e8f03e8aaf227aa7c6fe8af824454db5666683d65d337e84a41da9dd8" Feb 19 15:35:48 crc kubenswrapper[4810]: I0219 15:35:48.004411 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2336427e8f03e8aaf227aa7c6fe8af824454db5666683d65d337e84a41da9dd8"} err="failed to get container status \"2336427e8f03e8aaf227aa7c6fe8af824454db5666683d65d337e84a41da9dd8\": rpc error: code = NotFound desc = could not find container \"2336427e8f03e8aaf227aa7c6fe8af824454db5666683d65d337e84a41da9dd8\": container with ID starting with 2336427e8f03e8aaf227aa7c6fe8af824454db5666683d65d337e84a41da9dd8 not found: ID does not exist" Feb 19 15:35:48 crc kubenswrapper[4810]: I0219 15:35:48.004460 4810 scope.go:117] "RemoveContainer" containerID="8cb0bdc04704794d8377332b63549fd9711385b8e8633586279d263c62b2b764" Feb 19 15:35:48 crc kubenswrapper[4810]: E0219 15:35:48.005151 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cb0bdc04704794d8377332b63549fd9711385b8e8633586279d263c62b2b764\": container with ID starting with 8cb0bdc04704794d8377332b63549fd9711385b8e8633586279d263c62b2b764 not found: ID does not exist" containerID="8cb0bdc04704794d8377332b63549fd9711385b8e8633586279d263c62b2b764" Feb 19 15:35:48 crc kubenswrapper[4810]: I0219 15:35:48.005361 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cb0bdc04704794d8377332b63549fd9711385b8e8633586279d263c62b2b764"} err="failed to get container status \"8cb0bdc04704794d8377332b63549fd9711385b8e8633586279d263c62b2b764\": rpc error: code = NotFound desc = could not find container \"8cb0bdc04704794d8377332b63549fd9711385b8e8633586279d263c62b2b764\": container with ID starting with 8cb0bdc04704794d8377332b63549fd9711385b8e8633586279d263c62b2b764 not found: ID does not exist" Feb 19 15:35:48 crc kubenswrapper[4810]: I0219 15:35:48.005525 4810 scope.go:117] "RemoveContainer" containerID="f42d7f9a1230992f8bd9c5e66cc486a27519984e9110e6a5a352dd034567457d" Feb 19 15:35:48 crc kubenswrapper[4810]: E0219 15:35:48.006161 4810 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f42d7f9a1230992f8bd9c5e66cc486a27519984e9110e6a5a352dd034567457d\": container with ID starting with f42d7f9a1230992f8bd9c5e66cc486a27519984e9110e6a5a352dd034567457d not found: ID does not exist" containerID="f42d7f9a1230992f8bd9c5e66cc486a27519984e9110e6a5a352dd034567457d" Feb 19 15:35:48 crc kubenswrapper[4810]: I0219 15:35:48.006199 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f42d7f9a1230992f8bd9c5e66cc486a27519984e9110e6a5a352dd034567457d"} err="failed to get container status \"f42d7f9a1230992f8bd9c5e66cc486a27519984e9110e6a5a352dd034567457d\": rpc error: code = NotFound desc = could not find container \"f42d7f9a1230992f8bd9c5e66cc486a27519984e9110e6a5a352dd034567457d\": container with ID starting with f42d7f9a1230992f8bd9c5e66cc486a27519984e9110e6a5a352dd034567457d not found: ID does not exist" Feb 19 15:35:49 crc kubenswrapper[4810]: I0219 15:35:49.450475 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9786edb-c71c-4700-824b-2e348c15b77f" path="/var/lib/kubelet/pods/c9786edb-c71c-4700-824b-2e348c15b77f/volumes" Feb 19 15:35:49 crc kubenswrapper[4810]: I0219 15:35:49.537190 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:35:49 crc kubenswrapper[4810]: I0219 15:35:49.537232 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:35:49 crc kubenswrapper[4810]: I0219 15:35:49.537269 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 15:35:49 crc kubenswrapper[4810]: I0219 15:35:49.537983 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a"} pod="openshift-machine-config-operator/machine-config-daemon-t499d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 15:35:49 crc kubenswrapper[4810]: I0219 15:35:49.538035 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" containerID="cri-o://2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a" gracePeriod=600 Feb 19 15:35:49 crc kubenswrapper[4810]: E0219 15:35:49.661593 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:35:49 crc kubenswrapper[4810]: I0219 15:35:49.906254 4810 generic.go:334] 
"Generic (PLEG): container finished" podID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerID="2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a" exitCode=0 Feb 19 15:35:49 crc kubenswrapper[4810]: I0219 15:35:49.906299 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerDied","Data":"2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a"} Feb 19 15:35:49 crc kubenswrapper[4810]: I0219 15:35:49.906433 4810 scope.go:117] "RemoveContainer" containerID="2979159f3325af188cf73d374cfc4f7b1a64cb0be10361454a84d92914ce8075" Feb 19 15:35:49 crc kubenswrapper[4810]: I0219 15:35:49.907296 4810 scope.go:117] "RemoveContainer" containerID="2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a" Feb 19 15:35:49 crc kubenswrapper[4810]: E0219 15:35:49.907664 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:35:51 crc kubenswrapper[4810]: I0219 15:35:51.267928 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hdrj4"] Feb 19 15:35:51 crc kubenswrapper[4810]: E0219 15:35:51.269263 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9786edb-c71c-4700-824b-2e348c15b77f" containerName="extract-utilities" Feb 19 15:35:51 crc kubenswrapper[4810]: I0219 15:35:51.269292 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9786edb-c71c-4700-824b-2e348c15b77f" containerName="extract-utilities" Feb 19 15:35:51 crc kubenswrapper[4810]: E0219 15:35:51.269428 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9786edb-c71c-4700-824b-2e348c15b77f" containerName="extract-content" Feb 19 15:35:51 crc kubenswrapper[4810]: I0219 15:35:51.269444 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9786edb-c71c-4700-824b-2e348c15b77f" containerName="extract-content" Feb 19 15:35:51 crc kubenswrapper[4810]: E0219 15:35:51.270105 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9786edb-c71c-4700-824b-2e348c15b77f" containerName="registry-server" Feb 19 15:35:51 crc kubenswrapper[4810]: I0219 15:35:51.270132 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9786edb-c71c-4700-824b-2e348c15b77f" containerName="registry-server" Feb 19 15:35:51 crc kubenswrapper[4810]: I0219 15:35:51.270580 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9786edb-c71c-4700-824b-2e348c15b77f" containerName="registry-server" Feb 19 15:35:51 crc kubenswrapper[4810]: I0219 15:35:51.273642 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hdrj4" Feb 19 15:35:51 crc kubenswrapper[4810]: I0219 15:35:51.291213 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hdrj4"] Feb 19 15:35:51 crc kubenswrapper[4810]: I0219 15:35:51.315009 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eda6aa7-94b5-4d12-95cb-76730652c627-catalog-content\") pod \"certified-operators-hdrj4\" (UID: \"6eda6aa7-94b5-4d12-95cb-76730652c627\") " pod="openshift-marketplace/certified-operators-hdrj4" Feb 19 15:35:51 crc kubenswrapper[4810]: I0219 15:35:51.315070 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9qwd\" (UniqueName: \"kubernetes.io/projected/6eda6aa7-94b5-4d12-95cb-76730652c627-kube-api-access-c9qwd\") pod \"certified-operators-hdrj4\" (UID: \"6eda6aa7-94b5-4d12-95cb-76730652c627\") " pod="openshift-marketplace/certified-operators-hdrj4" Feb 19 15:35:51 crc kubenswrapper[4810]: I0219 15:35:51.315098 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eda6aa7-94b5-4d12-95cb-76730652c627-utilities\") pod \"certified-operators-hdrj4\" (UID: \"6eda6aa7-94b5-4d12-95cb-76730652c627\") " pod="openshift-marketplace/certified-operators-hdrj4" Feb 19 15:35:51 crc kubenswrapper[4810]: I0219 15:35:51.416976 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eda6aa7-94b5-4d12-95cb-76730652c627-catalog-content\") pod \"certified-operators-hdrj4\" (UID: \"6eda6aa7-94b5-4d12-95cb-76730652c627\") " pod="openshift-marketplace/certified-operators-hdrj4" Feb 19 15:35:51 crc kubenswrapper[4810]: I0219 15:35:51.417028 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9qwd\" (UniqueName: \"kubernetes.io/projected/6eda6aa7-94b5-4d12-95cb-76730652c627-kube-api-access-c9qwd\") pod \"certified-operators-hdrj4\" (UID: \"6eda6aa7-94b5-4d12-95cb-76730652c627\") " pod="openshift-marketplace/certified-operators-hdrj4" Feb 19 15:35:51 crc kubenswrapper[4810]: I0219 15:35:51.417047 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eda6aa7-94b5-4d12-95cb-76730652c627-utilities\") pod \"certified-operators-hdrj4\" (UID: \"6eda6aa7-94b5-4d12-95cb-76730652c627\") " pod="openshift-marketplace/certified-operators-hdrj4" Feb 19 15:35:51 crc kubenswrapper[4810]: I0219 15:35:51.417559 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eda6aa7-94b5-4d12-95cb-76730652c627-catalog-content\") pod \"certified-operators-hdrj4\" (UID: \"6eda6aa7-94b5-4d12-95cb-76730652c627\") " pod="openshift-marketplace/certified-operators-hdrj4" Feb 19 15:35:51 crc kubenswrapper[4810]: I0219 15:35:51.417632 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eda6aa7-94b5-4d12-95cb-76730652c627-utilities\") pod \"certified-operators-hdrj4\" (UID: \"6eda6aa7-94b5-4d12-95cb-76730652c627\") " pod="openshift-marketplace/certified-operators-hdrj4" Feb 19 15:35:51 crc kubenswrapper[4810]: I0219 15:35:51.441408 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-c9qwd\" (UniqueName: \"kubernetes.io/projected/6eda6aa7-94b5-4d12-95cb-76730652c627-kube-api-access-c9qwd\") pod \"certified-operators-hdrj4\" (UID: \"6eda6aa7-94b5-4d12-95cb-76730652c627\") " pod="openshift-marketplace/certified-operators-hdrj4" Feb 19 15:35:51 crc kubenswrapper[4810]: I0219 15:35:51.623657 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hdrj4" Feb 19 15:35:52 crc kubenswrapper[4810]: I0219 15:35:52.177300 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hdrj4"] Feb 19 15:35:52 crc kubenswrapper[4810]: I0219 15:35:52.943918 4810 generic.go:334] "Generic (PLEG): container finished" podID="6eda6aa7-94b5-4d12-95cb-76730652c627" containerID="ef601d7cf00e17850747c711a62d9d231bcc24ddc7e66dc470bc4ddf81533ce1" exitCode=0 Feb 19 15:35:52 crc kubenswrapper[4810]: I0219 15:35:52.944170 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdrj4" event={"ID":"6eda6aa7-94b5-4d12-95cb-76730652c627","Type":"ContainerDied","Data":"ef601d7cf00e17850747c711a62d9d231bcc24ddc7e66dc470bc4ddf81533ce1"} Feb 19 15:35:52 crc kubenswrapper[4810]: I0219 15:35:52.944197 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdrj4" event={"ID":"6eda6aa7-94b5-4d12-95cb-76730652c627","Type":"ContainerStarted","Data":"c86e35abda59680ef98b6f72e106a7dadaf55f94a3193db1535501490c5c5b34"} Feb 19 15:35:54 crc kubenswrapper[4810]: I0219 15:35:54.967734 4810 generic.go:334] "Generic (PLEG): container finished" podID="6eda6aa7-94b5-4d12-95cb-76730652c627" containerID="d18557e50f207d86079b71b1c0dd4c5bae5091bff4ef8afa200c09cfca4e54b0" exitCode=0 Feb 19 15:35:54 crc kubenswrapper[4810]: I0219 15:35:54.967801 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdrj4" event={"ID":"6eda6aa7-94b5-4d12-95cb-76730652c627","Type":"ContainerDied","Data":"d18557e50f207d86079b71b1c0dd4c5bae5091bff4ef8afa200c09cfca4e54b0"} Feb 19 15:35:55 crc kubenswrapper[4810]: I0219 15:35:55.978538 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdrj4" event={"ID":"6eda6aa7-94b5-4d12-95cb-76730652c627","Type":"ContainerStarted","Data":"814f21a94bed7e5fadc18fb49db4f445e001e72a3d8768460c85ea077a8946fd"} Feb 19 15:35:56 crc kubenswrapper[4810]: I0219 15:35:56.005314 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hdrj4" podStartSLOduration=2.556993798 podStartE2EDuration="5.00529805s" podCreationTimestamp="2026-02-19 15:35:51 +0000 UTC" firstStartedPulling="2026-02-19 15:35:52.947076571 +0000 UTC m=+1582.429106695" lastFinishedPulling="2026-02-19 15:35:55.395380823 +0000 UTC m=+1584.877410947" observedRunningTime="2026-02-19 15:35:56.000864579 +0000 UTC m=+1585.482894723" watchObservedRunningTime="2026-02-19 15:35:56.00529805 +0000 UTC m=+1585.487328174" Feb 19 15:36:01 crc kubenswrapper[4810]: I0219 15:36:01.624268 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hdrj4" Feb 19 15:36:01 crc kubenswrapper[4810]: I0219 15:36:01.625147 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hdrj4" Feb 19 15:36:01 crc kubenswrapper[4810]: I0219 15:36:01.688835 4810 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hdrj4" Feb 19 15:36:02 crc kubenswrapper[4810]: I0219 15:36:02.113959 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hdrj4" Feb 19 15:36:02 crc kubenswrapper[4810]: I0219 15:36:02.178750 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hdrj4"] Feb 19 15:36:04 crc kubenswrapper[4810]: I0219 15:36:04.072826 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hdrj4" podUID="6eda6aa7-94b5-4d12-95cb-76730652c627" containerName="registry-server" containerID="cri-o://814f21a94bed7e5fadc18fb49db4f445e001e72a3d8768460c85ea077a8946fd" gracePeriod=2 Feb 19 15:36:04 crc kubenswrapper[4810]: I0219 15:36:04.439568 4810 scope.go:117] "RemoveContainer" containerID="2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a" Feb 19 15:36:04 crc kubenswrapper[4810]: E0219 15:36:04.439885 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:36:04 crc kubenswrapper[4810]: I0219 15:36:04.713963 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hdrj4" Feb 19 15:36:04 crc kubenswrapper[4810]: I0219 15:36:04.796530 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9qwd\" (UniqueName: \"kubernetes.io/projected/6eda6aa7-94b5-4d12-95cb-76730652c627-kube-api-access-c9qwd\") pod \"6eda6aa7-94b5-4d12-95cb-76730652c627\" (UID: \"6eda6aa7-94b5-4d12-95cb-76730652c627\") " Feb 19 15:36:04 crc kubenswrapper[4810]: I0219 15:36:04.796587 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eda6aa7-94b5-4d12-95cb-76730652c627-catalog-content\") pod \"6eda6aa7-94b5-4d12-95cb-76730652c627\" (UID: \"6eda6aa7-94b5-4d12-95cb-76730652c627\") " Feb 19 15:36:04 crc kubenswrapper[4810]: I0219 15:36:04.796692 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eda6aa7-94b5-4d12-95cb-76730652c627-utilities\") pod \"6eda6aa7-94b5-4d12-95cb-76730652c627\" (UID: \"6eda6aa7-94b5-4d12-95cb-76730652c627\") " Feb 19 15:36:04 crc kubenswrapper[4810]: I0219 15:36:04.797642 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6eda6aa7-94b5-4d12-95cb-76730652c627-utilities" (OuterVolumeSpecName: "utilities") pod "6eda6aa7-94b5-4d12-95cb-76730652c627" (UID: "6eda6aa7-94b5-4d12-95cb-76730652c627"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:36:04 crc kubenswrapper[4810]: I0219 15:36:04.809547 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eda6aa7-94b5-4d12-95cb-76730652c627-kube-api-access-c9qwd" (OuterVolumeSpecName: "kube-api-access-c9qwd") pod "6eda6aa7-94b5-4d12-95cb-76730652c627" (UID: "6eda6aa7-94b5-4d12-95cb-76730652c627"). InnerVolumeSpecName "kube-api-access-c9qwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:36:04 crc kubenswrapper[4810]: I0219 15:36:04.899562 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9qwd\" (UniqueName: \"kubernetes.io/projected/6eda6aa7-94b5-4d12-95cb-76730652c627-kube-api-access-c9qwd\") on node \"crc\" DevicePath \"\"" Feb 19 15:36:04 crc kubenswrapper[4810]: I0219 15:36:04.899618 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eda6aa7-94b5-4d12-95cb-76730652c627-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 15:36:05 crc kubenswrapper[4810]: I0219 15:36:05.090477 4810 generic.go:334] "Generic (PLEG): container finished" podID="6eda6aa7-94b5-4d12-95cb-76730652c627" containerID="814f21a94bed7e5fadc18fb49db4f445e001e72a3d8768460c85ea077a8946fd" exitCode=0 Feb 19 15:36:05 crc kubenswrapper[4810]: I0219 15:36:05.090671 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdrj4" event={"ID":"6eda6aa7-94b5-4d12-95cb-76730652c627","Type":"ContainerDied","Data":"814f21a94bed7e5fadc18fb49db4f445e001e72a3d8768460c85ea077a8946fd"} Feb 19 15:36:05 crc kubenswrapper[4810]: I0219 15:36:05.090785 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdrj4" event={"ID":"6eda6aa7-94b5-4d12-95cb-76730652c627","Type":"ContainerDied","Data":"c86e35abda59680ef98b6f72e106a7dadaf55f94a3193db1535501490c5c5b34"} Feb 19 15:36:05 crc kubenswrapper[4810]: I0219 15:36:05.090715 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hdrj4" Feb 19 15:36:05 crc kubenswrapper[4810]: I0219 15:36:05.090957 4810 scope.go:117] "RemoveContainer" containerID="814f21a94bed7e5fadc18fb49db4f445e001e72a3d8768460c85ea077a8946fd" Feb 19 15:36:05 crc kubenswrapper[4810]: I0219 15:36:05.119599 4810 scope.go:117] "RemoveContainer" containerID="d18557e50f207d86079b71b1c0dd4c5bae5091bff4ef8afa200c09cfca4e54b0" Feb 19 15:36:05 crc kubenswrapper[4810]: I0219 15:36:05.149957 4810 scope.go:117] "RemoveContainer" containerID="ef601d7cf00e17850747c711a62d9d231bcc24ddc7e66dc470bc4ddf81533ce1" Feb 19 15:36:05 crc kubenswrapper[4810]: I0219 15:36:05.194768 4810 scope.go:117] "RemoveContainer" containerID="814f21a94bed7e5fadc18fb49db4f445e001e72a3d8768460c85ea077a8946fd" Feb 19 15:36:05 crc kubenswrapper[4810]: E0219 15:36:05.195290 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"814f21a94bed7e5fadc18fb49db4f445e001e72a3d8768460c85ea077a8946fd\": container with ID starting with 814f21a94bed7e5fadc18fb49db4f445e001e72a3d8768460c85ea077a8946fd not found: ID does not exist" containerID="814f21a94bed7e5fadc18fb49db4f445e001e72a3d8768460c85ea077a8946fd" Feb 19 15:36:05 crc kubenswrapper[4810]: I0219 15:36:05.195375 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"814f21a94bed7e5fadc18fb49db4f445e001e72a3d8768460c85ea077a8946fd"} err="failed to get container status \"814f21a94bed7e5fadc18fb49db4f445e001e72a3d8768460c85ea077a8946fd\": rpc error: code = NotFound desc = could not find container \"814f21a94bed7e5fadc18fb49db4f445e001e72a3d8768460c85ea077a8946fd\": container with ID starting with 814f21a94bed7e5fadc18fb49db4f445e001e72a3d8768460c85ea077a8946fd not found: ID does not exist" Feb 19 15:36:05 crc kubenswrapper[4810]: I0219 15:36:05.195415 4810 scope.go:117] "RemoveContainer" containerID="d18557e50f207d86079b71b1c0dd4c5bae5091bff4ef8afa200c09cfca4e54b0" Feb 19 15:36:05 crc kubenswrapper[4810]: E0219 15:36:05.196144 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d18557e50f207d86079b71b1c0dd4c5bae5091bff4ef8afa200c09cfca4e54b0\": container with ID starting with d18557e50f207d86079b71b1c0dd4c5bae5091bff4ef8afa200c09cfca4e54b0 not found: ID does not exist" containerID="d18557e50f207d86079b71b1c0dd4c5bae5091bff4ef8afa200c09cfca4e54b0" Feb 19 15:36:05 crc kubenswrapper[4810]: I0219 15:36:05.196195 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d18557e50f207d86079b71b1c0dd4c5bae5091bff4ef8afa200c09cfca4e54b0"} err="failed to get container status \"d18557e50f207d86079b71b1c0dd4c5bae5091bff4ef8afa200c09cfca4e54b0\": rpc error: code = NotFound desc = could not find container \"d18557e50f207d86079b71b1c0dd4c5bae5091bff4ef8afa200c09cfca4e54b0\": container with ID starting with d18557e50f207d86079b71b1c0dd4c5bae5091bff4ef8afa200c09cfca4e54b0 not found: ID does not exist" Feb 19 15:36:05 crc kubenswrapper[4810]: I0219 15:36:05.196227 4810 scope.go:117] "RemoveContainer" containerID="ef601d7cf00e17850747c711a62d9d231bcc24ddc7e66dc470bc4ddf81533ce1" Feb 19 15:36:05 crc kubenswrapper[4810]: E0219 15:36:05.197077 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef601d7cf00e17850747c711a62d9d231bcc24ddc7e66dc470bc4ddf81533ce1\": container with ID starting 
with ef601d7cf00e17850747c711a62d9d231bcc24ddc7e66dc470bc4ddf81533ce1 not found: ID does not exist" containerID="ef601d7cf00e17850747c711a62d9d231bcc24ddc7e66dc470bc4ddf81533ce1" Feb 19 15:36:05 crc kubenswrapper[4810]: I0219 15:36:05.197119 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef601d7cf00e17850747c711a62d9d231bcc24ddc7e66dc470bc4ddf81533ce1"} err="failed to get container status \"ef601d7cf00e17850747c711a62d9d231bcc24ddc7e66dc470bc4ddf81533ce1\": rpc error: code = NotFound desc = could not find container \"ef601d7cf00e17850747c711a62d9d231bcc24ddc7e66dc470bc4ddf81533ce1\": container with ID starting with ef601d7cf00e17850747c711a62d9d231bcc24ddc7e66dc470bc4ddf81533ce1 not found: ID does not exist" Feb 19 15:36:05 crc kubenswrapper[4810]: I0219 15:36:05.953244 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6eda6aa7-94b5-4d12-95cb-76730652c627-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6eda6aa7-94b5-4d12-95cb-76730652c627" (UID: "6eda6aa7-94b5-4d12-95cb-76730652c627"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:36:06 crc kubenswrapper[4810]: I0219 15:36:06.021963 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eda6aa7-94b5-4d12-95cb-76730652c627-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 15:36:06 crc kubenswrapper[4810]: I0219 15:36:06.028005 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hdrj4"] Feb 19 15:36:06 crc kubenswrapper[4810]: I0219 15:36:06.038545 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hdrj4"] Feb 19 15:36:07 crc kubenswrapper[4810]: I0219 15:36:07.453258 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eda6aa7-94b5-4d12-95cb-76730652c627" path="/var/lib/kubelet/pods/6eda6aa7-94b5-4d12-95cb-76730652c627/volumes" Feb 19 15:36:15 crc kubenswrapper[4810]: I0219 15:36:15.441491 4810 scope.go:117] "RemoveContainer" containerID="2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a" Feb 19 15:36:15 crc kubenswrapper[4810]: E0219 15:36:15.444158 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:36:27 crc kubenswrapper[4810]: I0219 15:36:27.439665 4810 scope.go:117] "RemoveContainer" containerID="2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a" Feb 19 15:36:27 crc kubenswrapper[4810]: E0219 15:36:27.440990 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:36:40 crc kubenswrapper[4810]: I0219 15:36:40.030426 4810 scope.go:117] "RemoveContainer" 
containerID="a0f55d5dfd4c1951d245770b89cf22415d60ecb97cf8c05e857fc4583af61f68" Feb 19 15:36:40 crc kubenswrapper[4810]: I0219 15:36:40.055071 4810 scope.go:117] "RemoveContainer" containerID="c249e977cefa0135aa004a3d9624e2b5787cc21f20239233e79602a670cf0acb" Feb 19 15:36:41 crc kubenswrapper[4810]: I0219 15:36:41.465575 4810 scope.go:117] "RemoveContainer" containerID="2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a" Feb 19 15:36:41 crc kubenswrapper[4810]: E0219 15:36:41.466266 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:36:56 crc kubenswrapper[4810]: I0219 15:36:56.439773 4810 scope.go:117] "RemoveContainer" containerID="2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a" Feb 19 15:36:56 crc kubenswrapper[4810]: E0219 15:36:56.440826 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:37:10 crc kubenswrapper[4810]: I0219 15:37:10.439935 4810 scope.go:117] "RemoveContainer" containerID="2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a" Feb 19 15:37:10 crc kubenswrapper[4810]: E0219 15:37:10.441281 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:37:22 crc kubenswrapper[4810]: I0219 15:37:22.045273 4810 generic.go:334] "Generic (PLEG): container finished" podID="c4a9ca21-e1c7-490d-8078-14407b530301" containerID="aea7e9dbe67f144df792c12f2c4b232cda1fd424cee70ce7f1f3b4844bd41e5c" exitCode=0 Feb 19 15:37:22 crc kubenswrapper[4810]: I0219 15:37:22.045450 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx" event={"ID":"c4a9ca21-e1c7-490d-8078-14407b530301","Type":"ContainerDied","Data":"aea7e9dbe67f144df792c12f2c4b232cda1fd424cee70ce7f1f3b4844bd41e5c"} Feb 19 15:37:23 crc kubenswrapper[4810]: I0219 15:37:23.442204 4810 scope.go:117] "RemoveContainer" containerID="2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a" Feb 19 15:37:23 crc kubenswrapper[4810]: E0219 15:37:23.442784 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:37:23 
crc kubenswrapper[4810]: I0219 15:37:23.494745 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx" Feb 19 15:37:23 crc kubenswrapper[4810]: I0219 15:37:23.526291 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwflq\" (UniqueName: \"kubernetes.io/projected/c4a9ca21-e1c7-490d-8078-14407b530301-kube-api-access-rwflq\") pod \"c4a9ca21-e1c7-490d-8078-14407b530301\" (UID: \"c4a9ca21-e1c7-490d-8078-14407b530301\") " Feb 19 15:37:23 crc kubenswrapper[4810]: I0219 15:37:23.526436 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4a9ca21-e1c7-490d-8078-14407b530301-inventory\") pod \"c4a9ca21-e1c7-490d-8078-14407b530301\" (UID: \"c4a9ca21-e1c7-490d-8078-14407b530301\") " Feb 19 15:37:23 crc kubenswrapper[4810]: I0219 15:37:23.526580 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4a9ca21-e1c7-490d-8078-14407b530301-bootstrap-combined-ca-bundle\") pod \"c4a9ca21-e1c7-490d-8078-14407b530301\" (UID: \"c4a9ca21-e1c7-490d-8078-14407b530301\") " Feb 19 15:37:23 crc kubenswrapper[4810]: I0219 15:37:23.526615 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c4a9ca21-e1c7-490d-8078-14407b530301-ssh-key-openstack-edpm-ipam\") pod \"c4a9ca21-e1c7-490d-8078-14407b530301\" (UID: \"c4a9ca21-e1c7-490d-8078-14407b530301\") " Feb 19 15:37:23 crc kubenswrapper[4810]: I0219 15:37:23.533374 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4a9ca21-e1c7-490d-8078-14407b530301-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "c4a9ca21-e1c7-490d-8078-14407b530301" (UID: "c4a9ca21-e1c7-490d-8078-14407b530301"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:37:23 crc kubenswrapper[4810]: I0219 15:37:23.533616 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4a9ca21-e1c7-490d-8078-14407b530301-kube-api-access-rwflq" (OuterVolumeSpecName: "kube-api-access-rwflq") pod "c4a9ca21-e1c7-490d-8078-14407b530301" (UID: "c4a9ca21-e1c7-490d-8078-14407b530301"). InnerVolumeSpecName "kube-api-access-rwflq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:37:23 crc kubenswrapper[4810]: I0219 15:37:23.563522 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4a9ca21-e1c7-490d-8078-14407b530301-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c4a9ca21-e1c7-490d-8078-14407b530301" (UID: "c4a9ca21-e1c7-490d-8078-14407b530301"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:37:23 crc kubenswrapper[4810]: I0219 15:37:23.572284 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4a9ca21-e1c7-490d-8078-14407b530301-inventory" (OuterVolumeSpecName: "inventory") pod "c4a9ca21-e1c7-490d-8078-14407b530301" (UID: "c4a9ca21-e1c7-490d-8078-14407b530301"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:37:23 crc kubenswrapper[4810]: I0219 15:37:23.629029 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwflq\" (UniqueName: \"kubernetes.io/projected/c4a9ca21-e1c7-490d-8078-14407b530301-kube-api-access-rwflq\") on node \"crc\" DevicePath \"\"" Feb 19 15:37:23 crc kubenswrapper[4810]: I0219 15:37:23.629311 4810 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4a9ca21-e1c7-490d-8078-14407b530301-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 15:37:23 crc kubenswrapper[4810]: I0219 15:37:23.629341 4810 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4a9ca21-e1c7-490d-8078-14407b530301-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:37:23 crc kubenswrapper[4810]: I0219 15:37:23.629358 4810 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c4a9ca21-e1c7-490d-8078-14407b530301-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 15:37:24 crc kubenswrapper[4810]: I0219 15:37:24.071412 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx" event={"ID":"c4a9ca21-e1c7-490d-8078-14407b530301","Type":"ContainerDied","Data":"aa1bc9f36577e42d99564c9f4bae7a23f2588d18bc7dc16da1d066acf3ad1da1"} Feb 19 15:37:24 crc kubenswrapper[4810]: I0219 15:37:24.071459 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx" Feb 19 15:37:24 crc kubenswrapper[4810]: I0219 15:37:24.071471 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa1bc9f36577e42d99564c9f4bae7a23f2588d18bc7dc16da1d066acf3ad1da1" Feb 19 15:37:24 crc kubenswrapper[4810]: I0219 15:37:24.190938 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-flmfl"] Feb 19 15:37:24 crc kubenswrapper[4810]: E0219 15:37:24.191649 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eda6aa7-94b5-4d12-95cb-76730652c627" containerName="extract-content" Feb 19 15:37:24 crc kubenswrapper[4810]: I0219 15:37:24.191677 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eda6aa7-94b5-4d12-95cb-76730652c627" containerName="extract-content" Feb 19 15:37:24 crc kubenswrapper[4810]: E0219 15:37:24.191713 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eda6aa7-94b5-4d12-95cb-76730652c627" containerName="extract-utilities" Feb 19 15:37:24 crc kubenswrapper[4810]: I0219 15:37:24.191727 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eda6aa7-94b5-4d12-95cb-76730652c627" containerName="extract-utilities" Feb 19 15:37:24 crc kubenswrapper[4810]: E0219 15:37:24.191757 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eda6aa7-94b5-4d12-95cb-76730652c627" containerName="registry-server" Feb 19 15:37:24 crc kubenswrapper[4810]: I0219 15:37:24.191771 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eda6aa7-94b5-4d12-95cb-76730652c627" containerName="registry-server" Feb 19 15:37:24 crc kubenswrapper[4810]: E0219 15:37:24.191809 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4a9ca21-e1c7-490d-8078-14407b530301" 
containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 19 15:37:24 crc kubenswrapper[4810]: I0219 15:37:24.191823 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4a9ca21-e1c7-490d-8078-14407b530301" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 19 15:37:24 crc kubenswrapper[4810]: I0219 15:37:24.192217 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4a9ca21-e1c7-490d-8078-14407b530301" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 19 15:37:24 crc kubenswrapper[4810]: I0219 15:37:24.192260 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eda6aa7-94b5-4d12-95cb-76730652c627" containerName="registry-server" Feb 19 15:37:24 crc kubenswrapper[4810]: I0219 15:37:24.193349 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-flmfl" Feb 19 15:37:24 crc kubenswrapper[4810]: I0219 15:37:24.203008 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 15:37:24 crc kubenswrapper[4810]: I0219 15:37:24.203021 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 15:37:24 crc kubenswrapper[4810]: I0219 15:37:24.203296 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 15:37:24 crc kubenswrapper[4810]: I0219 15:37:24.203612 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-djc4q" Feb 19 15:37:24 crc kubenswrapper[4810]: I0219 15:37:24.207374 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-flmfl"] Feb 19 15:37:24 crc kubenswrapper[4810]: I0219 15:37:24.242036 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzg4x\" (UniqueName: \"kubernetes.io/projected/e6255c5c-26d4-421f-9156-1bdd2f5adcc6-kube-api-access-qzg4x\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-flmfl\" (UID: \"e6255c5c-26d4-421f-9156-1bdd2f5adcc6\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-flmfl" Feb 19 15:37:24 crc kubenswrapper[4810]: I0219 15:37:24.242129 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6255c5c-26d4-421f-9156-1bdd2f5adcc6-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-flmfl\" (UID: \"e6255c5c-26d4-421f-9156-1bdd2f5adcc6\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-flmfl" Feb 19 15:37:24 crc kubenswrapper[4810]: I0219 15:37:24.242311 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6255c5c-26d4-421f-9156-1bdd2f5adcc6-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-flmfl\" (UID: \"e6255c5c-26d4-421f-9156-1bdd2f5adcc6\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-flmfl" Feb 19 15:37:24 crc kubenswrapper[4810]: I0219 15:37:24.344406 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzg4x\" (UniqueName: \"kubernetes.io/projected/e6255c5c-26d4-421f-9156-1bdd2f5adcc6-kube-api-access-qzg4x\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-flmfl\" (UID: \"e6255c5c-26d4-421f-9156-1bdd2f5adcc6\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-flmfl" Feb 19 15:37:24 crc kubenswrapper[4810]: I0219 15:37:24.344505 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6255c5c-26d4-421f-9156-1bdd2f5adcc6-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-flmfl\" (UID: \"e6255c5c-26d4-421f-9156-1bdd2f5adcc6\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-flmfl" Feb 19 15:37:24 crc kubenswrapper[4810]: I0219 15:37:24.344606 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6255c5c-26d4-421f-9156-1bdd2f5adcc6-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-flmfl\" (UID: \"e6255c5c-26d4-421f-9156-1bdd2f5adcc6\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-flmfl" Feb 19 15:37:24 crc kubenswrapper[4810]: I0219 15:37:24.349663 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6255c5c-26d4-421f-9156-1bdd2f5adcc6-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-flmfl\" (UID: \"e6255c5c-26d4-421f-9156-1bdd2f5adcc6\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-flmfl" Feb 19 15:37:24 crc kubenswrapper[4810]: I0219 15:37:24.349905 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6255c5c-26d4-421f-9156-1bdd2f5adcc6-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-flmfl\" (UID: \"e6255c5c-26d4-421f-9156-1bdd2f5adcc6\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-flmfl" Feb 19 15:37:24 crc kubenswrapper[4810]: I0219 15:37:24.373646 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzg4x\" (UniqueName: \"kubernetes.io/projected/e6255c5c-26d4-421f-9156-1bdd2f5adcc6-kube-api-access-qzg4x\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-flmfl\" (UID: \"e6255c5c-26d4-421f-9156-1bdd2f5adcc6\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-flmfl" Feb 19 15:37:24 crc kubenswrapper[4810]: I0219 15:37:24.551449 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-flmfl" Feb 19 15:37:25 crc kubenswrapper[4810]: I0219 15:37:25.141264 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-flmfl"] Feb 19 15:37:25 crc kubenswrapper[4810]: W0219 15:37:25.143177 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6255c5c_26d4_421f_9156_1bdd2f5adcc6.slice/crio-df2d8f9f79616c4ab1d340863a47dd0003a7f33fda9bcbbe938ee53e102fa03b WatchSource:0}: Error finding container df2d8f9f79616c4ab1d340863a47dd0003a7f33fda9bcbbe938ee53e102fa03b: Status 404 returned error can't find the container with id df2d8f9f79616c4ab1d340863a47dd0003a7f33fda9bcbbe938ee53e102fa03b Feb 19 15:37:25 crc kubenswrapper[4810]: I0219 15:37:25.145689 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 15:37:26 crc kubenswrapper[4810]: I0219 15:37:26.101573 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-flmfl" event={"ID":"e6255c5c-26d4-421f-9156-1bdd2f5adcc6","Type":"ContainerStarted","Data":"46b7253182b6dc0e64c2e59a4d275ed006838d41d039fd87a4c506ea7296776b"} Feb 19 15:37:26 crc kubenswrapper[4810]: I0219 15:37:26.102628 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-flmfl" event={"ID":"e6255c5c-26d4-421f-9156-1bdd2f5adcc6","Type":"ContainerStarted","Data":"df2d8f9f79616c4ab1d340863a47dd0003a7f33fda9bcbbe938ee53e102fa03b"} Feb 19 15:37:26 crc kubenswrapper[4810]: I0219 15:37:26.131940 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-flmfl" podStartSLOduration=1.661178058 podStartE2EDuration="2.131912687s" podCreationTimestamp="2026-02-19 15:37:24 +0000 UTC" firstStartedPulling="2026-02-19 15:37:25.145380112 +0000 UTC m=+1674.627410236" lastFinishedPulling="2026-02-19 15:37:25.616114741 +0000 UTC m=+1675.098144865" observedRunningTime="2026-02-19 15:37:26.125084206 +0000 UTC m=+1675.607114370" watchObservedRunningTime="2026-02-19 15:37:26.131912687 +0000 UTC m=+1675.613942851" Feb 19 15:37:34 crc kubenswrapper[4810]: I0219 15:37:34.439876 4810 scope.go:117] "RemoveContainer" containerID="2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a" Feb 19 15:37:34 crc kubenswrapper[4810]: E0219 15:37:34.441051 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:37:40 crc kubenswrapper[4810]: I0219 15:37:40.204423 4810 scope.go:117] "RemoveContainer" containerID="67d7b1b2fe05fdf05040100d731bde7bfd4f1ee47f3a5b6f3aa77c49f45ebc0c" Feb 19 15:37:40 crc kubenswrapper[4810]: I0219 15:37:40.234694 4810 scope.go:117] "RemoveContainer" containerID="bf2b72419abc5b9d1dc324265338d9686d9c4dd72a0204301ce8a03dc9ab3fd8" Feb 19 15:37:46 crc kubenswrapper[4810]: I0219 15:37:46.439649 4810 scope.go:117] "RemoveContainer" containerID="2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a" Feb 19 
15:37:46 crc kubenswrapper[4810]: E0219 15:37:46.441136 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:37:59 crc kubenswrapper[4810]: I0219 15:37:59.440164 4810 scope.go:117] "RemoveContainer" containerID="2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a" Feb 19 15:37:59 crc kubenswrapper[4810]: E0219 15:37:59.440983 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:38:03 crc kubenswrapper[4810]: I0219 15:38:03.079303 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-pd6hg"] Feb 19 15:38:03 crc kubenswrapper[4810]: I0219 15:38:03.095475 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-pd6hg"] Feb 19 15:38:03 crc kubenswrapper[4810]: I0219 15:38:03.454054 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2222b6ca-79cd-48d7-b262-87e5cd4db6b1" path="/var/lib/kubelet/pods/2222b6ca-79cd-48d7-b262-87e5cd4db6b1/volumes" Feb 19 15:38:04 crc kubenswrapper[4810]: I0219 15:38:04.034836 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-79a4-account-create-update-mrm9x"] Feb 19 15:38:04 crc kubenswrapper[4810]: I0219 15:38:04.044138 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-79a4-account-create-update-mrm9x"] Feb 19 15:38:05 crc kubenswrapper[4810]: I0219 15:38:05.048522 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-hm7ql"] Feb 19 15:38:05 crc kubenswrapper[4810]: I0219 15:38:05.069671 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-hm7ql"] Feb 19 15:38:05 crc kubenswrapper[4810]: I0219 15:38:05.082864 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-8fq2p"] Feb 19 15:38:05 crc kubenswrapper[4810]: I0219 15:38:05.096581 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-c263-account-create-update-wz7k6"] Feb 19 15:38:05 crc kubenswrapper[4810]: I0219 15:38:05.108362 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-8fq2p"] Feb 19 15:38:05 crc kubenswrapper[4810]: I0219 15:38:05.118845 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-c263-account-create-update-wz7k6"] Feb 19 15:38:05 crc kubenswrapper[4810]: I0219 15:38:05.463461 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b6040a1-1df6-44da-ba23-4c7b1ccf17b1" path="/var/lib/kubelet/pods/8b6040a1-1df6-44da-ba23-4c7b1ccf17b1/volumes" Feb 19 15:38:05 crc kubenswrapper[4810]: I0219 15:38:05.466364 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d8e48ca-d504-48a9-9e92-97651cd15d28" 
path="/var/lib/kubelet/pods/9d8e48ca-d504-48a9-9e92-97651cd15d28/volumes" Feb 19 15:38:05 crc kubenswrapper[4810]: I0219 15:38:05.467603 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4b22749-0497-48c2-b943-2c48aef05707" path="/var/lib/kubelet/pods/b4b22749-0497-48c2-b943-2c48aef05707/volumes" Feb 19 15:38:05 crc kubenswrapper[4810]: I0219 15:38:05.468891 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bca24a94-16a8-4b5b-9d99-bc98919feb21" path="/var/lib/kubelet/pods/bca24a94-16a8-4b5b-9d99-bc98919feb21/volumes" Feb 19 15:38:06 crc kubenswrapper[4810]: I0219 15:38:06.048067 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-002d-account-create-update-6kk29"] Feb 19 15:38:06 crc kubenswrapper[4810]: I0219 15:38:06.062497 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-ffjjc"] Feb 19 15:38:06 crc kubenswrapper[4810]: I0219 15:38:06.073296 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-a42d-account-create-update-l2kgw"] Feb 19 15:38:06 crc kubenswrapper[4810]: I0219 15:38:06.084976 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-002d-account-create-update-6kk29"] Feb 19 15:38:06 crc kubenswrapper[4810]: I0219 15:38:06.096002 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-ffjjc"] Feb 19 15:38:06 crc kubenswrapper[4810]: I0219 15:38:06.107222 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-a42d-account-create-update-l2kgw"] Feb 19 15:38:07 crc kubenswrapper[4810]: I0219 15:38:07.456137 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21eb5702-ca94-449c-839c-e3970593417d" path="/var/lib/kubelet/pods/21eb5702-ca94-449c-839c-e3970593417d/volumes" Feb 19 15:38:07 crc kubenswrapper[4810]: I0219 15:38:07.457020 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f791f64-69f8-448d-8370-aeef0db30071" path="/var/lib/kubelet/pods/7f791f64-69f8-448d-8370-aeef0db30071/volumes" Feb 19 15:38:07 crc kubenswrapper[4810]: I0219 15:38:07.457546 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c141144d-36a6-4c0c-b764-d7453c101ea3" path="/var/lib/kubelet/pods/c141144d-36a6-4c0c-b764-d7453c101ea3/volumes" Feb 19 15:38:12 crc kubenswrapper[4810]: I0219 15:38:12.050669 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-j78h8"] Feb 19 15:38:12 crc kubenswrapper[4810]: I0219 15:38:12.065174 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-j78h8"] Feb 19 15:38:13 crc kubenswrapper[4810]: I0219 15:38:13.456833 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e12a1f8-d78c-41b0-b295-e5e661bf0820" path="/var/lib/kubelet/pods/8e12a1f8-d78c-41b0-b295-e5e661bf0820/volumes" Feb 19 15:38:14 crc kubenswrapper[4810]: I0219 15:38:14.439947 4810 scope.go:117] "RemoveContainer" containerID="2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a" Feb 19 15:38:14 crc kubenswrapper[4810]: E0219 15:38:14.440646 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:38:29 crc kubenswrapper[4810]: I0219 15:38:29.440579 4810 scope.go:117] "RemoveContainer" containerID="2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a" Feb 19 15:38:29 crc kubenswrapper[4810]: E0219 15:38:29.441943 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:38:32 crc kubenswrapper[4810]: I0219 15:38:32.060152 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-9wtnf"] Feb 19 15:38:32 crc kubenswrapper[4810]: I0219 15:38:32.075766 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-9wtnf"] Feb 19 15:38:33 crc kubenswrapper[4810]: I0219 15:38:33.478479 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6217aad-07e6-49b6-8e80-41e75cecaaf5" path="/var/lib/kubelet/pods/a6217aad-07e6-49b6-8e80-41e75cecaaf5/volumes" Feb 19 15:38:35 crc kubenswrapper[4810]: I0219 15:38:35.054484 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-9cp2h"] Feb 19 15:38:35 crc kubenswrapper[4810]: I0219 15:38:35.073739 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-dc04-account-create-update-dmf9z"] Feb 19 15:38:35 crc kubenswrapper[4810]: I0219 15:38:35.092606 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-9cp2h"] Feb 19 15:38:35 crc kubenswrapper[4810]: I0219 15:38:35.105302 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-dc04-account-create-update-dmf9z"] Feb 19 15:38:35 crc kubenswrapper[4810]: I0219 15:38:35.451415 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05b15da5-701a-492a-b986-99b767d2876c" path="/var/lib/kubelet/pods/05b15da5-701a-492a-b986-99b767d2876c/volumes" Feb 19 15:38:35 crc kubenswrapper[4810]: I0219 15:38:35.452529 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="700fd144-e077-4468-80a4-f131fdb9d67e" path="/var/lib/kubelet/pods/700fd144-e077-4468-80a4-f131fdb9d67e/volumes" Feb 19 15:38:38 crc kubenswrapper[4810]: I0219 15:38:38.027564 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-8ce6-account-create-update-ztxw9"] Feb 19 15:38:38 crc kubenswrapper[4810]: I0219 15:38:38.038868 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-8ce6-account-create-update-ztxw9"] Feb 19 15:38:39 crc kubenswrapper[4810]: I0219 15:38:39.034420 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-43b3-account-create-update-6gqq5"] Feb 19 15:38:39 crc kubenswrapper[4810]: I0219 15:38:39.050701 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-kpf4t"] Feb 19 15:38:39 crc kubenswrapper[4810]: I0219 15:38:39.063200 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-ptwwk"] Feb 19 15:38:39 crc kubenswrapper[4810]: I0219 15:38:39.072242 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-kpf4t"] Feb 19 15:38:39 crc 
kubenswrapper[4810]: I0219 15:38:39.079967 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-43b3-account-create-update-6gqq5"] Feb 19 15:38:39 crc kubenswrapper[4810]: I0219 15:38:39.088638 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-ptwwk"] Feb 19 15:38:39 crc kubenswrapper[4810]: I0219 15:38:39.449479 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31093793-65b6-467c-8d5b-218e108fd330" path="/var/lib/kubelet/pods/31093793-65b6-467c-8d5b-218e108fd330/volumes" Feb 19 15:38:39 crc kubenswrapper[4810]: I0219 15:38:39.450176 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5770188c-7480-4529-8450-3d1a44cf50d6" path="/var/lib/kubelet/pods/5770188c-7480-4529-8450-3d1a44cf50d6/volumes" Feb 19 15:38:39 crc kubenswrapper[4810]: I0219 15:38:39.450788 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ac347c6-4f1b-4b05-87a0-9332dec2ba9d" path="/var/lib/kubelet/pods/5ac347c6-4f1b-4b05-87a0-9332dec2ba9d/volumes" Feb 19 15:38:39 crc kubenswrapper[4810]: I0219 15:38:39.451367 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2c7665c-330a-45b8-b461-bd08b069b747" path="/var/lib/kubelet/pods/c2c7665c-330a-45b8-b461-bd08b069b747/volumes" Feb 19 15:38:40 crc kubenswrapper[4810]: I0219 15:38:40.307422 4810 scope.go:117] "RemoveContainer" containerID="e9bec1a534d25c6a14471d437a183afe877fd56061c47642147a9b763a2c6190" Feb 19 15:38:40 crc kubenswrapper[4810]: I0219 15:38:40.361074 4810 scope.go:117] "RemoveContainer" containerID="9704b5e429194139e41b388c5c38c38e001c096f5c05263f386a9f8220160ce9" Feb 19 15:38:40 crc kubenswrapper[4810]: I0219 15:38:40.409578 4810 scope.go:117] "RemoveContainer" containerID="235b187eb3bd473181cf1d8a9d02071a2d445e3841afa055bb60de833ffbcec1" Feb 19 15:38:40 crc kubenswrapper[4810]: I0219 15:38:40.452655 4810 scope.go:117] "RemoveContainer" containerID="e7fde16e8762a1bda91d4aa8dbb2846cb1cc1e9049ca68a05b5713d4c512a4b8" Feb 19 15:38:40 crc kubenswrapper[4810]: I0219 15:38:40.493132 4810 scope.go:117] "RemoveContainer" containerID="ffa1569c8787f552547599568a5b882194ae4226f8c9d82766a9a36d606eb91a" Feb 19 15:38:40 crc kubenswrapper[4810]: I0219 15:38:40.534070 4810 scope.go:117] "RemoveContainer" containerID="f70ca73c865d94282b27c6f5f6e86e7e4679dda8ba68c283e08b2a6314c29261" Feb 19 15:38:40 crc kubenswrapper[4810]: I0219 15:38:40.581561 4810 scope.go:117] "RemoveContainer" containerID="f1368987940841c90c15fd62f94998dda89194fc11c576179de536979c6adc82" Feb 19 15:38:40 crc kubenswrapper[4810]: I0219 15:38:40.609530 4810 scope.go:117] "RemoveContainer" containerID="58aa0a5c9039d1f2e2e2fa3a8464a2460e5ec3924d291301dda66cb2085c37f2" Feb 19 15:38:40 crc kubenswrapper[4810]: I0219 15:38:40.631778 4810 scope.go:117] "RemoveContainer" containerID="7503cb280210459a8b150bda1b5d65c5f4d10619291800104ee52fa8927bdb82" Feb 19 15:38:40 crc kubenswrapper[4810]: I0219 15:38:40.655510 4810 scope.go:117] "RemoveContainer" containerID="78c0fb5a6a2ddab1d7b49b378f905fccf1b07a5af8d34ee0f62b947801682e49" Feb 19 15:38:40 crc kubenswrapper[4810]: I0219 15:38:40.685553 4810 scope.go:117] "RemoveContainer" containerID="9f4b973758b7ab4df4daae42709ff686161b77974aebe961605bcdd7b7ee6895" Feb 19 15:38:40 crc kubenswrapper[4810]: I0219 15:38:40.716694 4810 scope.go:117] "RemoveContainer" containerID="683b765d3388918ee0690173c641c6f414e8fc77c164afb4ab566f37723b326b" Feb 19 15:38:40 crc kubenswrapper[4810]: I0219 15:38:40.742689 4810 scope.go:117] 
"RemoveContainer" containerID="72214a8edd0c54f3823201969c4eb1d1b241f1f9c89ed676fa59ee81e422993e" Feb 19 15:38:40 crc kubenswrapper[4810]: I0219 15:38:40.774904 4810 scope.go:117] "RemoveContainer" containerID="317f50bb910bab31d9c1242a97f9988671bee73e88d7e795833b2626793ec0c6" Feb 19 15:38:40 crc kubenswrapper[4810]: I0219 15:38:40.803494 4810 scope.go:117] "RemoveContainer" containerID="084e9f9fbe2e5f93a513bee51567c16a8dbaf61639fac09083f059ee237ae6a4" Feb 19 15:38:40 crc kubenswrapper[4810]: I0219 15:38:40.839951 4810 scope.go:117] "RemoveContainer" containerID="f13c00b75444d82ae151313db252a559d67eb3a9e93fc91fd59fa886fe8ada73" Feb 19 15:38:42 crc kubenswrapper[4810]: I0219 15:38:42.439103 4810 scope.go:117] "RemoveContainer" containerID="2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a" Feb 19 15:38:42 crc kubenswrapper[4810]: E0219 15:38:42.439745 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:38:47 crc kubenswrapper[4810]: I0219 15:38:47.063169 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-sd4lr"] Feb 19 15:38:47 crc kubenswrapper[4810]: I0219 15:38:47.073398 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-sd4lr"] Feb 19 15:38:47 crc kubenswrapper[4810]: I0219 15:38:47.456919 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63eeb47c-9c4a-4e36-be24-61c126517600" path="/var/lib/kubelet/pods/63eeb47c-9c4a-4e36-be24-61c126517600/volumes" Feb 19 15:38:48 crc kubenswrapper[4810]: I0219 15:38:48.025979 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-k9zsz"] Feb 19 15:38:48 crc kubenswrapper[4810]: I0219 15:38:48.034632 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-k9zsz"] Feb 19 15:38:49 crc kubenswrapper[4810]: I0219 15:38:49.451668 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="082fc735-2850-452d-841a-0af9ed7ed171" path="/var/lib/kubelet/pods/082fc735-2850-452d-841a-0af9ed7ed171/volumes" Feb 19 15:38:56 crc kubenswrapper[4810]: I0219 15:38:56.439521 4810 scope.go:117] "RemoveContainer" containerID="2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a" Feb 19 15:38:56 crc kubenswrapper[4810]: E0219 15:38:56.440599 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:39:06 crc kubenswrapper[4810]: I0219 15:39:06.264286 4810 generic.go:334] "Generic (PLEG): container finished" podID="e6255c5c-26d4-421f-9156-1bdd2f5adcc6" containerID="46b7253182b6dc0e64c2e59a4d275ed006838d41d039fd87a4c506ea7296776b" exitCode=0 Feb 19 15:39:06 crc kubenswrapper[4810]: I0219 15:39:06.264399 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-flmfl" 
event={"ID":"e6255c5c-26d4-421f-9156-1bdd2f5adcc6","Type":"ContainerDied","Data":"46b7253182b6dc0e64c2e59a4d275ed006838d41d039fd87a4c506ea7296776b"} Feb 19 15:39:07 crc kubenswrapper[4810]: I0219 15:39:07.787248 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-flmfl" Feb 19 15:39:07 crc kubenswrapper[4810]: I0219 15:39:07.892734 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6255c5c-26d4-421f-9156-1bdd2f5adcc6-ssh-key-openstack-edpm-ipam\") pod \"e6255c5c-26d4-421f-9156-1bdd2f5adcc6\" (UID: \"e6255c5c-26d4-421f-9156-1bdd2f5adcc6\") " Feb 19 15:39:07 crc kubenswrapper[4810]: I0219 15:39:07.892832 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6255c5c-26d4-421f-9156-1bdd2f5adcc6-inventory\") pod \"e6255c5c-26d4-421f-9156-1bdd2f5adcc6\" (UID: \"e6255c5c-26d4-421f-9156-1bdd2f5adcc6\") " Feb 19 15:39:07 crc kubenswrapper[4810]: I0219 15:39:07.893033 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzg4x\" (UniqueName: \"kubernetes.io/projected/e6255c5c-26d4-421f-9156-1bdd2f5adcc6-kube-api-access-qzg4x\") pod \"e6255c5c-26d4-421f-9156-1bdd2f5adcc6\" (UID: \"e6255c5c-26d4-421f-9156-1bdd2f5adcc6\") " Feb 19 15:39:07 crc kubenswrapper[4810]: I0219 15:39:07.904593 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6255c5c-26d4-421f-9156-1bdd2f5adcc6-kube-api-access-qzg4x" (OuterVolumeSpecName: "kube-api-access-qzg4x") pod "e6255c5c-26d4-421f-9156-1bdd2f5adcc6" (UID: "e6255c5c-26d4-421f-9156-1bdd2f5adcc6"). InnerVolumeSpecName "kube-api-access-qzg4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:39:07 crc kubenswrapper[4810]: I0219 15:39:07.921560 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6255c5c-26d4-421f-9156-1bdd2f5adcc6-inventory" (OuterVolumeSpecName: "inventory") pod "e6255c5c-26d4-421f-9156-1bdd2f5adcc6" (UID: "e6255c5c-26d4-421f-9156-1bdd2f5adcc6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:39:07 crc kubenswrapper[4810]: I0219 15:39:07.923918 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6255c5c-26d4-421f-9156-1bdd2f5adcc6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e6255c5c-26d4-421f-9156-1bdd2f5adcc6" (UID: "e6255c5c-26d4-421f-9156-1bdd2f5adcc6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:39:07 crc kubenswrapper[4810]: I0219 15:39:07.998262 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzg4x\" (UniqueName: \"kubernetes.io/projected/e6255c5c-26d4-421f-9156-1bdd2f5adcc6-kube-api-access-qzg4x\") on node \"crc\" DevicePath \"\"" Feb 19 15:39:07 crc kubenswrapper[4810]: I0219 15:39:07.998317 4810 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6255c5c-26d4-421f-9156-1bdd2f5adcc6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 15:39:07 crc kubenswrapper[4810]: I0219 15:39:07.998418 4810 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6255c5c-26d4-421f-9156-1bdd2f5adcc6-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 15:39:08 crc kubenswrapper[4810]: I0219 15:39:08.288179 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-flmfl" event={"ID":"e6255c5c-26d4-421f-9156-1bdd2f5adcc6","Type":"ContainerDied","Data":"df2d8f9f79616c4ab1d340863a47dd0003a7f33fda9bcbbe938ee53e102fa03b"} Feb 19 15:39:08 crc kubenswrapper[4810]: I0219 15:39:08.288237 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df2d8f9f79616c4ab1d340863a47dd0003a7f33fda9bcbbe938ee53e102fa03b" Feb 19 15:39:08 crc kubenswrapper[4810]: I0219 15:39:08.288663 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-flmfl" Feb 19 15:39:08 crc kubenswrapper[4810]: I0219 15:39:08.400397 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6q498"] Feb 19 15:39:08 crc kubenswrapper[4810]: E0219 15:39:08.401223 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6255c5c-26d4-421f-9156-1bdd2f5adcc6" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 19 15:39:08 crc kubenswrapper[4810]: I0219 15:39:08.401251 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6255c5c-26d4-421f-9156-1bdd2f5adcc6" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 19 15:39:08 crc kubenswrapper[4810]: I0219 15:39:08.401592 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6255c5c-26d4-421f-9156-1bdd2f5adcc6" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 19 15:39:08 crc kubenswrapper[4810]: I0219 15:39:08.402524 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6q498" Feb 19 15:39:08 crc kubenswrapper[4810]: I0219 15:39:08.430082 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 15:39:08 crc kubenswrapper[4810]: I0219 15:39:08.430403 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 15:39:08 crc kubenswrapper[4810]: I0219 15:39:08.430469 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 15:39:08 crc kubenswrapper[4810]: I0219 15:39:08.430614 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-djc4q" Feb 19 15:39:08 crc kubenswrapper[4810]: I0219 15:39:08.448071 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6q498"] Feb 19 15:39:08 crc kubenswrapper[4810]: I0219 15:39:08.510784 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49znl\" (UniqueName: \"kubernetes.io/projected/2cff3a3e-0543-4fec-8f5b-5421be276386-kube-api-access-49znl\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6q498\" (UID: \"2cff3a3e-0543-4fec-8f5b-5421be276386\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6q498" Feb 19 15:39:08 crc kubenswrapper[4810]: I0219 15:39:08.510988 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cff3a3e-0543-4fec-8f5b-5421be276386-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6q498\" (UID: \"2cff3a3e-0543-4fec-8f5b-5421be276386\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6q498" Feb 19 15:39:08 crc kubenswrapper[4810]: I0219 15:39:08.511024 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2cff3a3e-0543-4fec-8f5b-5421be276386-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6q498\" (UID: \"2cff3a3e-0543-4fec-8f5b-5421be276386\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6q498" Feb 19 15:39:08 crc kubenswrapper[4810]: I0219 15:39:08.613359 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49znl\" (UniqueName: \"kubernetes.io/projected/2cff3a3e-0543-4fec-8f5b-5421be276386-kube-api-access-49znl\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6q498\" (UID: \"2cff3a3e-0543-4fec-8f5b-5421be276386\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6q498" Feb 19 15:39:08 crc kubenswrapper[4810]: I0219 15:39:08.613456 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cff3a3e-0543-4fec-8f5b-5421be276386-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6q498\" (UID: \"2cff3a3e-0543-4fec-8f5b-5421be276386\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6q498" Feb 19 15:39:08 crc kubenswrapper[4810]: I0219 15:39:08.613492 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/2cff3a3e-0543-4fec-8f5b-5421be276386-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6q498\" (UID: \"2cff3a3e-0543-4fec-8f5b-5421be276386\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6q498" Feb 19 15:39:08 crc kubenswrapper[4810]: I0219 15:39:08.619478 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2cff3a3e-0543-4fec-8f5b-5421be276386-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6q498\" (UID: \"2cff3a3e-0543-4fec-8f5b-5421be276386\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6q498" Feb 19 15:39:08 crc kubenswrapper[4810]: I0219 15:39:08.629928 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cff3a3e-0543-4fec-8f5b-5421be276386-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6q498\" (UID: \"2cff3a3e-0543-4fec-8f5b-5421be276386\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6q498" Feb 19 15:39:08 crc kubenswrapper[4810]: I0219 15:39:08.632180 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49znl\" (UniqueName: \"kubernetes.io/projected/2cff3a3e-0543-4fec-8f5b-5421be276386-kube-api-access-49znl\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6q498\" (UID: \"2cff3a3e-0543-4fec-8f5b-5421be276386\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6q498" Feb 19 15:39:08 crc kubenswrapper[4810]: I0219 15:39:08.743839 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6q498" Feb 19 15:39:09 crc kubenswrapper[4810]: I0219 15:39:09.387609 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6q498"] Feb 19 15:39:10 crc kubenswrapper[4810]: I0219 15:39:10.308022 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6q498" event={"ID":"2cff3a3e-0543-4fec-8f5b-5421be276386","Type":"ContainerStarted","Data":"b4e86a3f28595dacb736dcb71007d80706dd96818e80cc25e2a888dfcab09e96"} Feb 19 15:39:10 crc kubenswrapper[4810]: I0219 15:39:10.308651 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6q498" event={"ID":"2cff3a3e-0543-4fec-8f5b-5421be276386","Type":"ContainerStarted","Data":"0dba2a07ad454eaa31c1496b450e78caabf96a5e17db799ef132576e5619dad4"} Feb 19 15:39:10 crc kubenswrapper[4810]: I0219 15:39:10.335900 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6q498" podStartSLOduration=1.880465222 podStartE2EDuration="2.335885428s" podCreationTimestamp="2026-02-19 15:39:08 +0000 UTC" firstStartedPulling="2026-02-19 15:39:09.404369269 +0000 UTC m=+1778.886399413" lastFinishedPulling="2026-02-19 15:39:09.859789505 +0000 UTC m=+1779.341819619" observedRunningTime="2026-02-19 15:39:10.32917508 +0000 UTC m=+1779.811205204" watchObservedRunningTime="2026-02-19 15:39:10.335885428 +0000 UTC m=+1779.817915552" Feb 19 15:39:11 crc kubenswrapper[4810]: I0219 15:39:11.453714 4810 scope.go:117] "RemoveContainer" 
containerID="2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a" Feb 19 15:39:11 crc kubenswrapper[4810]: E0219 15:39:11.454167 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:39:25 crc kubenswrapper[4810]: I0219 15:39:25.441517 4810 scope.go:117] "RemoveContainer" containerID="2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a" Feb 19 15:39:25 crc kubenswrapper[4810]: E0219 15:39:25.444599 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:39:33 crc kubenswrapper[4810]: I0219 15:39:33.060475 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-7jdcp"] Feb 19 15:39:33 crc kubenswrapper[4810]: I0219 15:39:33.070776 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-thnc7"] Feb 19 15:39:33 crc kubenswrapper[4810]: I0219 15:39:33.080918 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-j989d"] Feb 19 15:39:33 crc kubenswrapper[4810]: I0219 15:39:33.089884 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-7jdcp"] Feb 19 15:39:33 crc kubenswrapper[4810]: I0219 15:39:33.098918 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-thnc7"] Feb 19 15:39:33 crc kubenswrapper[4810]: I0219 15:39:33.110890 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-j989d"] Feb 19 15:39:33 crc kubenswrapper[4810]: I0219 15:39:33.458951 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36fe6fdb-2970-4773-8184-a2d16b8ca89a" path="/var/lib/kubelet/pods/36fe6fdb-2970-4773-8184-a2d16b8ca89a/volumes" Feb 19 15:39:33 crc kubenswrapper[4810]: I0219 15:39:33.459660 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dd5dede-cf58-43c7-954e-b9b1d33ad8d1" path="/var/lib/kubelet/pods/4dd5dede-cf58-43c7-954e-b9b1d33ad8d1/volumes" Feb 19 15:39:33 crc kubenswrapper[4810]: I0219 15:39:33.460298 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92797675-ddf7-43cf-90af-0248cf097509" path="/var/lib/kubelet/pods/92797675-ddf7-43cf-90af-0248cf097509/volumes" Feb 19 15:39:37 crc kubenswrapper[4810]: I0219 15:39:37.439659 4810 scope.go:117] "RemoveContainer" containerID="2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a" Feb 19 15:39:37 crc kubenswrapper[4810]: E0219 15:39:37.440710 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:39:41 crc kubenswrapper[4810]: I0219 15:39:41.247447 4810 scope.go:117] "RemoveContainer" containerID="22a00ed65eebcb7030f20de212b927e0556118314908589176cea5b5329504cb" Feb 19 15:39:41 crc kubenswrapper[4810]: I0219 15:39:41.289611 4810 scope.go:117] "RemoveContainer" containerID="aa561f23770b052d6b320e47499c0a8789e25a7a2367b69634f88f903c8d780a" Feb 19 15:39:41 crc kubenswrapper[4810]: I0219 15:39:41.357055 4810 scope.go:117] "RemoveContainer" containerID="da258067ce2c7912909dc7c937b6ad45df02bf5c8504937ad6d6f0ea0359724a" Feb 19 15:39:41 crc kubenswrapper[4810]: I0219 15:39:41.403132 4810 scope.go:117] "RemoveContainer" containerID="7e83b0c5177b1183e58ad0498417fc1c3b6e142723e7482bda0235e4615b43f5" Feb 19 15:39:41 crc kubenswrapper[4810]: I0219 15:39:41.467364 4810 scope.go:117] "RemoveContainer" containerID="829a51aca23df8d8763078bcae4b4cba43b6c265996ab11fc55f6d42ce950516" Feb 19 15:39:46 crc kubenswrapper[4810]: I0219 15:39:46.037237 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-hmc6k"] Feb 19 15:39:46 crc kubenswrapper[4810]: I0219 15:39:46.050494 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-hmc6k"] Feb 19 15:39:47 crc kubenswrapper[4810]: I0219 15:39:47.476742 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2024a783-c3f9-4e57-b00f-52bec164e64e" path="/var/lib/kubelet/pods/2024a783-c3f9-4e57-b00f-52bec164e64e/volumes" Feb 19 15:39:49 crc kubenswrapper[4810]: I0219 15:39:49.046831 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-svmgl"] Feb 19 15:39:49 crc kubenswrapper[4810]: I0219 15:39:49.055914 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-svmgl"] Feb 19 15:39:49 crc kubenswrapper[4810]: I0219 15:39:49.438900 4810 scope.go:117] "RemoveContainer" containerID="2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a" Feb 19 15:39:49 crc kubenswrapper[4810]: E0219 15:39:49.439140 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:39:49 crc kubenswrapper[4810]: I0219 15:39:49.448633 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="848dfe9d-05f4-4ba9-919e-23e9a7ae63d5" path="/var/lib/kubelet/pods/848dfe9d-05f4-4ba9-919e-23e9a7ae63d5/volumes" Feb 19 15:40:02 crc kubenswrapper[4810]: I0219 15:40:02.439481 4810 scope.go:117] "RemoveContainer" containerID="2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a" Feb 19 15:40:02 crc kubenswrapper[4810]: E0219 15:40:02.440337 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:40:14 crc kubenswrapper[4810]: I0219 15:40:14.440123 4810 
scope.go:117] "RemoveContainer" containerID="2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a" Feb 19 15:40:14 crc kubenswrapper[4810]: E0219 15:40:14.440815 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:40:25 crc kubenswrapper[4810]: I0219 15:40:25.213590 4810 generic.go:334] "Generic (PLEG): container finished" podID="2cff3a3e-0543-4fec-8f5b-5421be276386" containerID="b4e86a3f28595dacb736dcb71007d80706dd96818e80cc25e2a888dfcab09e96" exitCode=0 Feb 19 15:40:25 crc kubenswrapper[4810]: I0219 15:40:25.213677 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6q498" event={"ID":"2cff3a3e-0543-4fec-8f5b-5421be276386","Type":"ContainerDied","Data":"b4e86a3f28595dacb736dcb71007d80706dd96818e80cc25e2a888dfcab09e96"} Feb 19 15:40:25 crc kubenswrapper[4810]: I0219 15:40:25.441207 4810 scope.go:117] "RemoveContainer" containerID="2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a" Feb 19 15:40:25 crc kubenswrapper[4810]: E0219 15:40:25.442041 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:40:26 crc kubenswrapper[4810]: I0219 15:40:26.686591 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6q498" Feb 19 15:40:26 crc kubenswrapper[4810]: I0219 15:40:26.811541 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cff3a3e-0543-4fec-8f5b-5421be276386-inventory\") pod \"2cff3a3e-0543-4fec-8f5b-5421be276386\" (UID: \"2cff3a3e-0543-4fec-8f5b-5421be276386\") " Feb 19 15:40:26 crc kubenswrapper[4810]: I0219 15:40:26.811621 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49znl\" (UniqueName: \"kubernetes.io/projected/2cff3a3e-0543-4fec-8f5b-5421be276386-kube-api-access-49znl\") pod \"2cff3a3e-0543-4fec-8f5b-5421be276386\" (UID: \"2cff3a3e-0543-4fec-8f5b-5421be276386\") " Feb 19 15:40:26 crc kubenswrapper[4810]: I0219 15:40:26.811826 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2cff3a3e-0543-4fec-8f5b-5421be276386-ssh-key-openstack-edpm-ipam\") pod \"2cff3a3e-0543-4fec-8f5b-5421be276386\" (UID: \"2cff3a3e-0543-4fec-8f5b-5421be276386\") " Feb 19 15:40:26 crc kubenswrapper[4810]: I0219 15:40:26.817706 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cff3a3e-0543-4fec-8f5b-5421be276386-kube-api-access-49znl" (OuterVolumeSpecName: "kube-api-access-49znl") pod "2cff3a3e-0543-4fec-8f5b-5421be276386" (UID: "2cff3a3e-0543-4fec-8f5b-5421be276386"). 
InnerVolumeSpecName "kube-api-access-49znl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:40:26 crc kubenswrapper[4810]: I0219 15:40:26.841481 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cff3a3e-0543-4fec-8f5b-5421be276386-inventory" (OuterVolumeSpecName: "inventory") pod "2cff3a3e-0543-4fec-8f5b-5421be276386" (UID: "2cff3a3e-0543-4fec-8f5b-5421be276386"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:40:26 crc kubenswrapper[4810]: I0219 15:40:26.848081 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cff3a3e-0543-4fec-8f5b-5421be276386-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2cff3a3e-0543-4fec-8f5b-5421be276386" (UID: "2cff3a3e-0543-4fec-8f5b-5421be276386"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:40:26 crc kubenswrapper[4810]: I0219 15:40:26.914217 4810 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cff3a3e-0543-4fec-8f5b-5421be276386-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 15:40:26 crc kubenswrapper[4810]: I0219 15:40:26.914274 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49znl\" (UniqueName: \"kubernetes.io/projected/2cff3a3e-0543-4fec-8f5b-5421be276386-kube-api-access-49znl\") on node \"crc\" DevicePath \"\"" Feb 19 15:40:26 crc kubenswrapper[4810]: I0219 15:40:26.914295 4810 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2cff3a3e-0543-4fec-8f5b-5421be276386-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 15:40:27 crc kubenswrapper[4810]: I0219 15:40:27.239491 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6q498" event={"ID":"2cff3a3e-0543-4fec-8f5b-5421be276386","Type":"ContainerDied","Data":"0dba2a07ad454eaa31c1496b450e78caabf96a5e17db799ef132576e5619dad4"} Feb 19 15:40:27 crc kubenswrapper[4810]: I0219 15:40:27.239551 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dba2a07ad454eaa31c1496b450e78caabf96a5e17db799ef132576e5619dad4" Feb 19 15:40:27 crc kubenswrapper[4810]: I0219 15:40:27.239561 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6q498" Feb 19 15:40:27 crc kubenswrapper[4810]: I0219 15:40:27.340058 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7"] Feb 19 15:40:27 crc kubenswrapper[4810]: E0219 15:40:27.340729 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cff3a3e-0543-4fec-8f5b-5421be276386" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 19 15:40:27 crc kubenswrapper[4810]: I0219 15:40:27.340758 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cff3a3e-0543-4fec-8f5b-5421be276386" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 19 15:40:27 crc kubenswrapper[4810]: I0219 15:40:27.341039 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cff3a3e-0543-4fec-8f5b-5421be276386" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 19 15:40:27 crc kubenswrapper[4810]: I0219 15:40:27.341999 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7" Feb 19 15:40:27 crc kubenswrapper[4810]: I0219 15:40:27.343833 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 15:40:27 crc kubenswrapper[4810]: I0219 15:40:27.343924 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 15:40:27 crc kubenswrapper[4810]: I0219 15:40:27.345129 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 15:40:27 crc kubenswrapper[4810]: I0219 15:40:27.345363 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-djc4q" Feb 19 15:40:27 crc kubenswrapper[4810]: I0219 15:40:27.353737 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7"] Feb 19 15:40:27 crc kubenswrapper[4810]: I0219 15:40:27.424825 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/412dc62a-d25e-4820-947b-582e310ddff1-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7\" (UID: \"412dc62a-d25e-4820-947b-582e310ddff1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7" Feb 19 15:40:27 crc kubenswrapper[4810]: I0219 15:40:27.424893 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s57fs\" (UniqueName: \"kubernetes.io/projected/412dc62a-d25e-4820-947b-582e310ddff1-kube-api-access-s57fs\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7\" (UID: \"412dc62a-d25e-4820-947b-582e310ddff1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7" Feb 19 15:40:27 crc kubenswrapper[4810]: I0219 15:40:27.425061 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/412dc62a-d25e-4820-947b-582e310ddff1-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7\" (UID: \"412dc62a-d25e-4820-947b-582e310ddff1\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7" Feb 19 15:40:27 crc kubenswrapper[4810]: I0219 15:40:27.526554 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/412dc62a-d25e-4820-947b-582e310ddff1-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7\" (UID: \"412dc62a-d25e-4820-947b-582e310ddff1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7" Feb 19 15:40:27 crc kubenswrapper[4810]: I0219 15:40:27.526660 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/412dc62a-d25e-4820-947b-582e310ddff1-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7\" (UID: \"412dc62a-d25e-4820-947b-582e310ddff1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7" Feb 19 15:40:27 crc kubenswrapper[4810]: I0219 15:40:27.526697 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s57fs\" (UniqueName: \"kubernetes.io/projected/412dc62a-d25e-4820-947b-582e310ddff1-kube-api-access-s57fs\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7\" (UID: \"412dc62a-d25e-4820-947b-582e310ddff1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7" Feb 19 15:40:27 crc kubenswrapper[4810]: I0219 15:40:27.531266 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/412dc62a-d25e-4820-947b-582e310ddff1-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7\" (UID: \"412dc62a-d25e-4820-947b-582e310ddff1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7" Feb 19 15:40:27 crc kubenswrapper[4810]: I0219 15:40:27.533386 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/412dc62a-d25e-4820-947b-582e310ddff1-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7\" (UID: \"412dc62a-d25e-4820-947b-582e310ddff1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7" Feb 19 15:40:27 crc kubenswrapper[4810]: I0219 15:40:27.548882 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s57fs\" (UniqueName: \"kubernetes.io/projected/412dc62a-d25e-4820-947b-582e310ddff1-kube-api-access-s57fs\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7\" (UID: \"412dc62a-d25e-4820-947b-582e310ddff1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7" Feb 19 15:40:27 crc kubenswrapper[4810]: I0219 15:40:27.665062 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7" Feb 19 15:40:28 crc kubenswrapper[4810]: I0219 15:40:28.286134 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7"] Feb 19 15:40:29 crc kubenswrapper[4810]: I0219 15:40:29.266946 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7" event={"ID":"412dc62a-d25e-4820-947b-582e310ddff1","Type":"ContainerStarted","Data":"4a7d466ee6eb7612f01efe84e9b3560e7b39b5e659599c6f4786a572b3f2f9b0"} Feb 19 15:40:29 crc kubenswrapper[4810]: I0219 15:40:29.267315 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7" event={"ID":"412dc62a-d25e-4820-947b-582e310ddff1","Type":"ContainerStarted","Data":"4a1d9d8f611ac0fb3da1c822934589aaede54eb9cff7b74b0f86b60eca68033e"} Feb 19 15:40:29 crc kubenswrapper[4810]: I0219 15:40:29.295932 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7" podStartSLOduration=1.828377057 podStartE2EDuration="2.295903356s" podCreationTimestamp="2026-02-19 15:40:27 +0000 UTC" firstStartedPulling="2026-02-19 15:40:28.292012124 +0000 UTC m=+1857.774042288" lastFinishedPulling="2026-02-19 15:40:28.759538463 +0000 UTC m=+1858.241568587" observedRunningTime="2026-02-19 15:40:29.291109076 +0000 UTC m=+1858.773139200" watchObservedRunningTime="2026-02-19 15:40:29.295903356 +0000 UTC m=+1858.777933510" Feb 19 15:40:34 crc kubenswrapper[4810]: I0219 15:40:34.064115 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-w2s7h"] Feb 19 15:40:34 crc kubenswrapper[4810]: I0219 15:40:34.085774 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-nxd5j"] Feb 19 15:40:34 crc kubenswrapper[4810]: I0219 15:40:34.097883 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-8ftxl"] Feb 19 15:40:34 crc kubenswrapper[4810]: I0219 15:40:34.108910 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-w2s7h"] Feb 19 15:40:34 crc kubenswrapper[4810]: I0219 15:40:34.119659 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-8ftxl"] Feb 19 15:40:34 crc kubenswrapper[4810]: I0219 15:40:34.128401 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-nxd5j"] Feb 19 15:40:34 crc kubenswrapper[4810]: I0219 15:40:34.326494 4810 generic.go:334] "Generic (PLEG): container finished" podID="412dc62a-d25e-4820-947b-582e310ddff1" containerID="4a7d466ee6eb7612f01efe84e9b3560e7b39b5e659599c6f4786a572b3f2f9b0" exitCode=0 Feb 19 15:40:34 crc kubenswrapper[4810]: I0219 15:40:34.326537 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7" event={"ID":"412dc62a-d25e-4820-947b-582e310ddff1","Type":"ContainerDied","Data":"4a7d466ee6eb7612f01efe84e9b3560e7b39b5e659599c6f4786a572b3f2f9b0"} Feb 19 15:40:35 crc kubenswrapper[4810]: I0219 15:40:35.112086 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-841f-account-create-update-swd7q"] Feb 19 15:40:35 crc kubenswrapper[4810]: I0219 15:40:35.128857 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-da99-account-create-update-4j7hb"] Feb 19 15:40:35 crc kubenswrapper[4810]: I0219 15:40:35.137142 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-6e67-account-create-update-lk6cv"] Feb 19 15:40:35 crc kubenswrapper[4810]: I0219 15:40:35.147369 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-841f-account-create-update-swd7q"] Feb 19 15:40:35 crc kubenswrapper[4810]: I0219 15:40:35.156645 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-6e67-account-create-update-lk6cv"] Feb 19 15:40:35 crc kubenswrapper[4810]: I0219 15:40:35.164047 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-da99-account-create-update-4j7hb"] Feb 19 15:40:35 crc kubenswrapper[4810]: I0219 15:40:35.470065 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f19eb06-d11c-409b-8b7e-516c9a5db815" path="/var/lib/kubelet/pods/1f19eb06-d11c-409b-8b7e-516c9a5db815/volumes" Feb 19 15:40:35 crc kubenswrapper[4810]: I0219 15:40:35.470975 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48e0d5d9-1d58-41a5-b740-8c8286edec31" path="/var/lib/kubelet/pods/48e0d5d9-1d58-41a5-b740-8c8286edec31/volumes" Feb 19 15:40:35 crc kubenswrapper[4810]: I0219 15:40:35.471486 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="987d17ad-1427-4709-b5db-19fbb00e8a7c" path="/var/lib/kubelet/pods/987d17ad-1427-4709-b5db-19fbb00e8a7c/volumes" Feb 19 15:40:35 crc kubenswrapper[4810]: I0219 15:40:35.472098 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1703853-2754-4348-8c45-dcd98ff5d429" path="/var/lib/kubelet/pods/d1703853-2754-4348-8c45-dcd98ff5d429/volumes" Feb 19 15:40:35 crc kubenswrapper[4810]: I0219 15:40:35.473664 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da5e0166-d811-4dcd-9230-976dd1893c11" path="/var/lib/kubelet/pods/da5e0166-d811-4dcd-9230-976dd1893c11/volumes" Feb 19 15:40:35 crc kubenswrapper[4810]: I0219 15:40:35.474232 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f04e1699-2be0-4dca-8e4a-73035fde359f" path="/var/lib/kubelet/pods/f04e1699-2be0-4dca-8e4a-73035fde359f/volumes" Feb 19 15:40:35 crc kubenswrapper[4810]: I0219 15:40:35.792809 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7" Feb 19 15:40:35 crc kubenswrapper[4810]: I0219 15:40:35.815071 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s57fs\" (UniqueName: \"kubernetes.io/projected/412dc62a-d25e-4820-947b-582e310ddff1-kube-api-access-s57fs\") pod \"412dc62a-d25e-4820-947b-582e310ddff1\" (UID: \"412dc62a-d25e-4820-947b-582e310ddff1\") " Feb 19 15:40:35 crc kubenswrapper[4810]: I0219 15:40:35.815544 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/412dc62a-d25e-4820-947b-582e310ddff1-inventory\") pod \"412dc62a-d25e-4820-947b-582e310ddff1\" (UID: \"412dc62a-d25e-4820-947b-582e310ddff1\") " Feb 19 15:40:35 crc kubenswrapper[4810]: I0219 15:40:35.815847 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/412dc62a-d25e-4820-947b-582e310ddff1-ssh-key-openstack-edpm-ipam\") pod \"412dc62a-d25e-4820-947b-582e310ddff1\" (UID: \"412dc62a-d25e-4820-947b-582e310ddff1\") " Feb 19 15:40:35 crc kubenswrapper[4810]: I0219 15:40:35.821646 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/412dc62a-d25e-4820-947b-582e310ddff1-kube-api-access-s57fs" (OuterVolumeSpecName: "kube-api-access-s57fs") pod "412dc62a-d25e-4820-947b-582e310ddff1" (UID: "412dc62a-d25e-4820-947b-582e310ddff1"). InnerVolumeSpecName "kube-api-access-s57fs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:40:35 crc kubenswrapper[4810]: I0219 15:40:35.850998 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/412dc62a-d25e-4820-947b-582e310ddff1-inventory" (OuterVolumeSpecName: "inventory") pod "412dc62a-d25e-4820-947b-582e310ddff1" (UID: "412dc62a-d25e-4820-947b-582e310ddff1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:40:35 crc kubenswrapper[4810]: I0219 15:40:35.865476 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/412dc62a-d25e-4820-947b-582e310ddff1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "412dc62a-d25e-4820-947b-582e310ddff1" (UID: "412dc62a-d25e-4820-947b-582e310ddff1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:40:35 crc kubenswrapper[4810]: I0219 15:40:35.917924 4810 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/412dc62a-d25e-4820-947b-582e310ddff1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 15:40:35 crc kubenswrapper[4810]: I0219 15:40:35.917973 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s57fs\" (UniqueName: \"kubernetes.io/projected/412dc62a-d25e-4820-947b-582e310ddff1-kube-api-access-s57fs\") on node \"crc\" DevicePath \"\"" Feb 19 15:40:35 crc kubenswrapper[4810]: I0219 15:40:35.917987 4810 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/412dc62a-d25e-4820-947b-582e310ddff1-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 15:40:36 crc kubenswrapper[4810]: I0219 15:40:36.347159 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7" event={"ID":"412dc62a-d25e-4820-947b-582e310ddff1","Type":"ContainerDied","Data":"4a1d9d8f611ac0fb3da1c822934589aaede54eb9cff7b74b0f86b60eca68033e"} Feb 19 15:40:36 crc kubenswrapper[4810]: I0219 15:40:36.347217 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a1d9d8f611ac0fb3da1c822934589aaede54eb9cff7b74b0f86b60eca68033e" Feb 19 15:40:36 crc kubenswrapper[4810]: I0219 15:40:36.348908 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7" Feb 19 15:40:36 crc kubenswrapper[4810]: I0219 15:40:36.455724 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-s8kk5"] Feb 19 15:40:36 crc kubenswrapper[4810]: E0219 15:40:36.456522 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="412dc62a-d25e-4820-947b-582e310ddff1" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 19 15:40:36 crc kubenswrapper[4810]: I0219 15:40:36.456549 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="412dc62a-d25e-4820-947b-582e310ddff1" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 19 15:40:36 crc kubenswrapper[4810]: I0219 15:40:36.457051 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="412dc62a-d25e-4820-947b-582e310ddff1" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 19 15:40:36 crc kubenswrapper[4810]: I0219 15:40:36.458503 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s8kk5" Feb 19 15:40:36 crc kubenswrapper[4810]: I0219 15:40:36.463567 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 15:40:36 crc kubenswrapper[4810]: I0219 15:40:36.464145 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-djc4q" Feb 19 15:40:36 crc kubenswrapper[4810]: I0219 15:40:36.468219 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 15:40:36 crc kubenswrapper[4810]: I0219 15:40:36.468443 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 15:40:36 crc kubenswrapper[4810]: I0219 15:40:36.468870 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-s8kk5"] Feb 19 15:40:36 crc kubenswrapper[4810]: I0219 15:40:36.538816 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-s8kk5\" (UID: \"12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s8kk5" Feb 19 15:40:36 crc kubenswrapper[4810]: I0219 15:40:36.538899 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9scx\" (UniqueName: \"kubernetes.io/projected/12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b-kube-api-access-m9scx\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-s8kk5\" (UID: \"12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s8kk5" Feb 19 15:40:36 crc kubenswrapper[4810]: I0219 15:40:36.539100 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-s8kk5\" (UID: \"12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s8kk5" Feb 19 15:40:36 crc kubenswrapper[4810]: I0219 15:40:36.640761 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-s8kk5\" (UID: \"12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s8kk5" Feb 19 15:40:36 crc kubenswrapper[4810]: I0219 15:40:36.640852 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-s8kk5\" (UID: \"12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s8kk5" Feb 19 15:40:36 crc kubenswrapper[4810]: I0219 15:40:36.640900 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9scx\" (UniqueName: \"kubernetes.io/projected/12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b-kube-api-access-m9scx\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-s8kk5\" (UID: \"12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s8kk5" Feb 19 15:40:36 crc kubenswrapper[4810]: I0219 15:40:36.644257 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-s8kk5\" (UID: \"12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s8kk5" Feb 19 15:40:36 crc kubenswrapper[4810]: I0219 15:40:36.647596 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-s8kk5\" (UID: \"12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s8kk5" Feb 19 15:40:36 crc kubenswrapper[4810]: I0219 15:40:36.656734 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9scx\" (UniqueName: \"kubernetes.io/projected/12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b-kube-api-access-m9scx\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-s8kk5\" (UID: \"12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s8kk5" Feb 19 15:40:36 crc kubenswrapper[4810]: I0219 15:40:36.832685 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s8kk5" Feb 19 15:40:37 crc kubenswrapper[4810]: I0219 15:40:37.403217 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-s8kk5"] Feb 19 15:40:38 crc kubenswrapper[4810]: I0219 15:40:38.370411 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s8kk5" event={"ID":"12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b","Type":"ContainerStarted","Data":"66bd09d023dfa70c25f2018bd9b82f03017c1b6846777cd688b44fdbcc73f0a7"} Feb 19 15:40:38 crc kubenswrapper[4810]: I0219 15:40:38.370736 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s8kk5" event={"ID":"12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b","Type":"ContainerStarted","Data":"f9bc96f54f3b78da1f1da2ab5fb44990eaf75c7f9b3ec0fa37f91c60d09df681"} Feb 19 15:40:40 crc kubenswrapper[4810]: I0219 15:40:40.440632 4810 scope.go:117] "RemoveContainer" containerID="2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a" Feb 19 15:40:40 crc kubenswrapper[4810]: E0219 15:40:40.441140 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:40:41 crc kubenswrapper[4810]: I0219 15:40:41.629291 4810 scope.go:117] "RemoveContainer" containerID="2fa804cbc29144cbaa9d2e4c3f648166e91009da2ed6d113042e7022e9308b2c" Feb 19 15:40:41 crc kubenswrapper[4810]: I0219 15:40:41.652573 4810 scope.go:117] "RemoveContainer" 
containerID="61fee9f5cc97dc9164d9a8b37259645ec27704b544f8031e79cd8630294aa448" Feb 19 15:40:41 crc kubenswrapper[4810]: I0219 15:40:41.715133 4810 scope.go:117] "RemoveContainer" containerID="7cb43c21f053a8d03036f06cd4952d1e70925f82267f48d6f2c4959f93a370e5" Feb 19 15:40:41 crc kubenswrapper[4810]: I0219 15:40:41.735639 4810 scope.go:117] "RemoveContainer" containerID="99cf896833f13eecd3fedefc31f58e2b88d17d37a7cb7ae1aea233b7d9a39af1" Feb 19 15:40:41 crc kubenswrapper[4810]: I0219 15:40:41.780374 4810 scope.go:117] "RemoveContainer" containerID="ead759deef71357ae0d9ddba72b509ea84ac0664aab15baecfde700a4dc84f66" Feb 19 15:40:41 crc kubenswrapper[4810]: I0219 15:40:41.822301 4810 scope.go:117] "RemoveContainer" containerID="d738b05f8038fce0f6f7dca977b306ea2c9695f4bc8b38cb001bb799b15410d3" Feb 19 15:40:41 crc kubenswrapper[4810]: I0219 15:40:41.866852 4810 scope.go:117] "RemoveContainer" containerID="ad28bfaa41efd8e4e6c465f81c081a0451a00386412156a5267acdd97840a40b" Feb 19 15:40:41 crc kubenswrapper[4810]: I0219 15:40:41.890379 4810 scope.go:117] "RemoveContainer" containerID="a519fabbf15898bb4c345dee03c392f33a7ca3106e889528c9a61a815ff5b000" Feb 19 15:40:51 crc kubenswrapper[4810]: I0219 15:40:51.445462 4810 scope.go:117] "RemoveContainer" containerID="2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a" Feb 19 15:40:52 crc kubenswrapper[4810]: I0219 15:40:52.523051 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerStarted","Data":"4e3ef6a26491a979d9b15a4e163fa3567692b3b0eef18273908461c8a7758364"} Feb 19 15:40:52 crc kubenswrapper[4810]: I0219 15:40:52.548602 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s8kk5" podStartSLOduration=15.972690283 podStartE2EDuration="16.548568357s" podCreationTimestamp="2026-02-19 15:40:36 +0000 UTC" firstStartedPulling="2026-02-19 15:40:37.408203722 +0000 UTC m=+1866.890233846" lastFinishedPulling="2026-02-19 15:40:37.984081786 +0000 UTC m=+1867.466111920" observedRunningTime="2026-02-19 15:40:38.392965565 +0000 UTC m=+1867.874995689" watchObservedRunningTime="2026-02-19 15:40:52.548568357 +0000 UTC m=+1882.030598521" Feb 19 15:41:09 crc kubenswrapper[4810]: I0219 15:41:09.056918 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-58xq9"] Feb 19 15:41:09 crc kubenswrapper[4810]: I0219 15:41:09.067724 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-58xq9"] Feb 19 15:41:09 crc kubenswrapper[4810]: I0219 15:41:09.454138 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="972d6f5e-3edf-4b6e-bdde-39c580caea31" path="/var/lib/kubelet/pods/972d6f5e-3edf-4b6e-bdde-39c580caea31/volumes" Feb 19 15:41:17 crc kubenswrapper[4810]: I0219 15:41:17.826233 4810 generic.go:334] "Generic (PLEG): container finished" podID="12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b" containerID="66bd09d023dfa70c25f2018bd9b82f03017c1b6846777cd688b44fdbcc73f0a7" exitCode=0 Feb 19 15:41:17 crc kubenswrapper[4810]: I0219 15:41:17.826382 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s8kk5" event={"ID":"12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b","Type":"ContainerDied","Data":"66bd09d023dfa70c25f2018bd9b82f03017c1b6846777cd688b44fdbcc73f0a7"} Feb 19 15:41:19 crc kubenswrapper[4810]: I0219 
15:41:19.337125 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s8kk5" Feb 19 15:41:19 crc kubenswrapper[4810]: I0219 15:41:19.442924 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b-ssh-key-openstack-edpm-ipam\") pod \"12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b\" (UID: \"12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b\") " Feb 19 15:41:19 crc kubenswrapper[4810]: I0219 15:41:19.443498 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9scx\" (UniqueName: \"kubernetes.io/projected/12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b-kube-api-access-m9scx\") pod \"12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b\" (UID: \"12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b\") " Feb 19 15:41:19 crc kubenswrapper[4810]: I0219 15:41:19.443568 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b-inventory\") pod \"12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b\" (UID: \"12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b\") " Feb 19 15:41:19 crc kubenswrapper[4810]: I0219 15:41:19.451635 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b-kube-api-access-m9scx" (OuterVolumeSpecName: "kube-api-access-m9scx") pod "12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b" (UID: "12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b"). InnerVolumeSpecName "kube-api-access-m9scx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:41:19 crc kubenswrapper[4810]: I0219 15:41:19.471781 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b-inventory" (OuterVolumeSpecName: "inventory") pod "12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b" (UID: "12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:41:19 crc kubenswrapper[4810]: I0219 15:41:19.475386 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b" (UID: "12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:41:19 crc kubenswrapper[4810]: I0219 15:41:19.547172 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9scx\" (UniqueName: \"kubernetes.io/projected/12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b-kube-api-access-m9scx\") on node \"crc\" DevicePath \"\"" Feb 19 15:41:19 crc kubenswrapper[4810]: I0219 15:41:19.547199 4810 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 15:41:19 crc kubenswrapper[4810]: I0219 15:41:19.547210 4810 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 15:41:19 crc kubenswrapper[4810]: I0219 15:41:19.858698 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s8kk5" event={"ID":"12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b","Type":"ContainerDied","Data":"f9bc96f54f3b78da1f1da2ab5fb44990eaf75c7f9b3ec0fa37f91c60d09df681"} Feb 19 15:41:19 crc kubenswrapper[4810]: I0219 15:41:19.858791 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9bc96f54f3b78da1f1da2ab5fb44990eaf75c7f9b3ec0fa37f91c60d09df681" Feb 19 15:41:19 crc kubenswrapper[4810]: I0219 15:41:19.858806 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s8kk5" Feb 19 15:41:19 crc kubenswrapper[4810]: I0219 15:41:19.987694 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cm25d"] Feb 19 15:41:19 crc kubenswrapper[4810]: E0219 15:41:19.988114 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 19 15:41:19 crc kubenswrapper[4810]: I0219 15:41:19.988135 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 19 15:41:19 crc kubenswrapper[4810]: I0219 15:41:19.988389 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 19 15:41:19 crc kubenswrapper[4810]: I0219 15:41:19.989024 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cm25d" Feb 19 15:41:19 crc kubenswrapper[4810]: I0219 15:41:19.994661 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 15:41:19 crc kubenswrapper[4810]: I0219 15:41:19.994893 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 15:41:19 crc kubenswrapper[4810]: I0219 15:41:19.995083 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 15:41:19 crc kubenswrapper[4810]: I0219 15:41:19.998974 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-djc4q" Feb 19 15:41:20 crc kubenswrapper[4810]: I0219 15:41:20.023533 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cm25d"] Feb 19 15:41:20 crc kubenswrapper[4810]: I0219 15:41:20.059213 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e1f4472-242a-40a0-a574-9c3119fdb705-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cm25d\" (UID: \"7e1f4472-242a-40a0-a574-9c3119fdb705\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cm25d" Feb 19 15:41:20 crc kubenswrapper[4810]: I0219 15:41:20.059299 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vsg4\" (UniqueName: \"kubernetes.io/projected/7e1f4472-242a-40a0-a574-9c3119fdb705-kube-api-access-5vsg4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cm25d\" (UID: \"7e1f4472-242a-40a0-a574-9c3119fdb705\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cm25d" Feb 19 15:41:20 crc kubenswrapper[4810]: I0219 15:41:20.060064 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e1f4472-242a-40a0-a574-9c3119fdb705-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cm25d\" (UID: \"7e1f4472-242a-40a0-a574-9c3119fdb705\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cm25d" Feb 19 15:41:20 crc kubenswrapper[4810]: I0219 15:41:20.162474 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e1f4472-242a-40a0-a574-9c3119fdb705-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cm25d\" (UID: \"7e1f4472-242a-40a0-a574-9c3119fdb705\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cm25d" Feb 19 15:41:20 crc kubenswrapper[4810]: I0219 15:41:20.162554 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vsg4\" (UniqueName: \"kubernetes.io/projected/7e1f4472-242a-40a0-a574-9c3119fdb705-kube-api-access-5vsg4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cm25d\" (UID: \"7e1f4472-242a-40a0-a574-9c3119fdb705\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cm25d" Feb 19 15:41:20 crc kubenswrapper[4810]: I0219 15:41:20.162757 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/7e1f4472-242a-40a0-a574-9c3119fdb705-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cm25d\" (UID: \"7e1f4472-242a-40a0-a574-9c3119fdb705\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cm25d" Feb 19 15:41:20 crc kubenswrapper[4810]: I0219 15:41:20.168796 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e1f4472-242a-40a0-a574-9c3119fdb705-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cm25d\" (UID: \"7e1f4472-242a-40a0-a574-9c3119fdb705\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cm25d" Feb 19 15:41:20 crc kubenswrapper[4810]: I0219 15:41:20.169025 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e1f4472-242a-40a0-a574-9c3119fdb705-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cm25d\" (UID: \"7e1f4472-242a-40a0-a574-9c3119fdb705\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cm25d" Feb 19 15:41:20 crc kubenswrapper[4810]: I0219 15:41:20.180851 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vsg4\" (UniqueName: \"kubernetes.io/projected/7e1f4472-242a-40a0-a574-9c3119fdb705-kube-api-access-5vsg4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cm25d\" (UID: \"7e1f4472-242a-40a0-a574-9c3119fdb705\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cm25d" Feb 19 15:41:20 crc kubenswrapper[4810]: I0219 15:41:20.320147 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cm25d" Feb 19 15:41:20 crc kubenswrapper[4810]: I0219 15:41:20.949458 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cm25d"] Feb 19 15:41:21 crc kubenswrapper[4810]: I0219 15:41:21.883064 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cm25d" event={"ID":"7e1f4472-242a-40a0-a574-9c3119fdb705","Type":"ContainerStarted","Data":"b27177a5822d1f5cc2c1b88795d4f16252ecccf22df3b242820cee1469f821d3"} Feb 19 15:41:21 crc kubenswrapper[4810]: I0219 15:41:21.883501 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cm25d" event={"ID":"7e1f4472-242a-40a0-a574-9c3119fdb705","Type":"ContainerStarted","Data":"b3ea6f11973dbfa2762088780fd16a66b97d9eb13ec3b90d8368472e7902ee01"} Feb 19 15:41:21 crc kubenswrapper[4810]: I0219 15:41:21.919176 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cm25d" podStartSLOduration=2.49172541 podStartE2EDuration="2.919150504s" podCreationTimestamp="2026-02-19 15:41:19 +0000 UTC" firstStartedPulling="2026-02-19 15:41:20.961808468 +0000 UTC m=+1910.443838622" lastFinishedPulling="2026-02-19 15:41:21.389233592 +0000 UTC m=+1910.871263716" observedRunningTime="2026-02-19 15:41:21.912832906 +0000 UTC m=+1911.394863060" watchObservedRunningTime="2026-02-19 15:41:21.919150504 +0000 UTC m=+1911.401180668" Feb 19 15:41:33 crc kubenswrapper[4810]: I0219 15:41:33.092889 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-k7bkw"] Feb 19 15:41:33 crc kubenswrapper[4810]: I0219 15:41:33.106362 4810 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-k7bkw"] Feb 19 15:41:33 crc kubenswrapper[4810]: I0219 15:41:33.451969 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="299a53ac-e7e5-47a3-bf65-df5624b77717" path="/var/lib/kubelet/pods/299a53ac-e7e5-47a3-bf65-df5624b77717/volumes" Feb 19 15:41:36 crc kubenswrapper[4810]: I0219 15:41:36.032071 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hk2fs"] Feb 19 15:41:36 crc kubenswrapper[4810]: I0219 15:41:36.040524 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hk2fs"] Feb 19 15:41:37 crc kubenswrapper[4810]: I0219 15:41:37.458222 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f1a5ee7-3792-4f35-a967-80fb96c7df10" path="/var/lib/kubelet/pods/5f1a5ee7-3792-4f35-a967-80fb96c7df10/volumes" Feb 19 15:41:42 crc kubenswrapper[4810]: I0219 15:41:42.043724 4810 scope.go:117] "RemoveContainer" containerID="9c8648b58dedd6b14f6832bd1d2f895ecfd4e781a2433a653d4f48b76efb9fef" Feb 19 15:41:42 crc kubenswrapper[4810]: I0219 15:41:42.121382 4810 scope.go:117] "RemoveContainer" containerID="45859d708bbdd95af868748506ae358c82e96df75fa08cfe41661e0323e54c01" Feb 19 15:41:42 crc kubenswrapper[4810]: I0219 15:41:42.156603 4810 scope.go:117] "RemoveContainer" containerID="19b609d4be47506e6c511dded32be9ffbc5fec785d73d8309ff072ff0f1cf61d" Feb 19 15:42:09 crc kubenswrapper[4810]: I0219 15:42:09.304563 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kq8xx"] Feb 19 15:42:09 crc kubenswrapper[4810]: I0219 15:42:09.308117 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kq8xx" Feb 19 15:42:09 crc kubenswrapper[4810]: I0219 15:42:09.324547 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kq8xx"] Feb 19 15:42:09 crc kubenswrapper[4810]: I0219 15:42:09.408984 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8baaff2f-6fa7-4d65-b161-fc60f06aab23-catalog-content\") pod \"redhat-marketplace-kq8xx\" (UID: \"8baaff2f-6fa7-4d65-b161-fc60f06aab23\") " pod="openshift-marketplace/redhat-marketplace-kq8xx" Feb 19 15:42:09 crc kubenswrapper[4810]: I0219 15:42:09.409202 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpfqz\" (UniqueName: \"kubernetes.io/projected/8baaff2f-6fa7-4d65-b161-fc60f06aab23-kube-api-access-qpfqz\") pod \"redhat-marketplace-kq8xx\" (UID: \"8baaff2f-6fa7-4d65-b161-fc60f06aab23\") " pod="openshift-marketplace/redhat-marketplace-kq8xx" Feb 19 15:42:09 crc kubenswrapper[4810]: I0219 15:42:09.409702 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8baaff2f-6fa7-4d65-b161-fc60f06aab23-utilities\") pod \"redhat-marketplace-kq8xx\" (UID: \"8baaff2f-6fa7-4d65-b161-fc60f06aab23\") " pod="openshift-marketplace/redhat-marketplace-kq8xx" Feb 19 15:42:09 crc kubenswrapper[4810]: I0219 15:42:09.511686 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpfqz\" (UniqueName: \"kubernetes.io/projected/8baaff2f-6fa7-4d65-b161-fc60f06aab23-kube-api-access-qpfqz\") pod \"redhat-marketplace-kq8xx\" 
(UID: \"8baaff2f-6fa7-4d65-b161-fc60f06aab23\") " pod="openshift-marketplace/redhat-marketplace-kq8xx" Feb 19 15:42:09 crc kubenswrapper[4810]: I0219 15:42:09.511822 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8baaff2f-6fa7-4d65-b161-fc60f06aab23-utilities\") pod \"redhat-marketplace-kq8xx\" (UID: \"8baaff2f-6fa7-4d65-b161-fc60f06aab23\") " pod="openshift-marketplace/redhat-marketplace-kq8xx" Feb 19 15:42:09 crc kubenswrapper[4810]: I0219 15:42:09.511877 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8baaff2f-6fa7-4d65-b161-fc60f06aab23-catalog-content\") pod \"redhat-marketplace-kq8xx\" (UID: \"8baaff2f-6fa7-4d65-b161-fc60f06aab23\") " pod="openshift-marketplace/redhat-marketplace-kq8xx" Feb 19 15:42:09 crc kubenswrapper[4810]: I0219 15:42:09.512513 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8baaff2f-6fa7-4d65-b161-fc60f06aab23-catalog-content\") pod \"redhat-marketplace-kq8xx\" (UID: \"8baaff2f-6fa7-4d65-b161-fc60f06aab23\") " pod="openshift-marketplace/redhat-marketplace-kq8xx" Feb 19 15:42:09 crc kubenswrapper[4810]: I0219 15:42:09.512489 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8baaff2f-6fa7-4d65-b161-fc60f06aab23-utilities\") pod \"redhat-marketplace-kq8xx\" (UID: \"8baaff2f-6fa7-4d65-b161-fc60f06aab23\") " pod="openshift-marketplace/redhat-marketplace-kq8xx" Feb 19 15:42:09 crc kubenswrapper[4810]: I0219 15:42:09.544295 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpfqz\" (UniqueName: \"kubernetes.io/projected/8baaff2f-6fa7-4d65-b161-fc60f06aab23-kube-api-access-qpfqz\") pod \"redhat-marketplace-kq8xx\" (UID: \"8baaff2f-6fa7-4d65-b161-fc60f06aab23\") " pod="openshift-marketplace/redhat-marketplace-kq8xx" Feb 19 15:42:09 crc kubenswrapper[4810]: I0219 15:42:09.684314 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kq8xx" Feb 19 15:42:10 crc kubenswrapper[4810]: I0219 15:42:10.182928 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kq8xx"] Feb 19 15:42:10 crc kubenswrapper[4810]: I0219 15:42:10.874463 4810 generic.go:334] "Generic (PLEG): container finished" podID="8baaff2f-6fa7-4d65-b161-fc60f06aab23" containerID="7119d702b43fa67da67f60fa699dc67bd8b6502948dc9469d587dcd095715132" exitCode=0 Feb 19 15:42:10 crc kubenswrapper[4810]: I0219 15:42:10.874518 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kq8xx" event={"ID":"8baaff2f-6fa7-4d65-b161-fc60f06aab23","Type":"ContainerDied","Data":"7119d702b43fa67da67f60fa699dc67bd8b6502948dc9469d587dcd095715132"} Feb 19 15:42:10 crc kubenswrapper[4810]: I0219 15:42:10.875444 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kq8xx" event={"ID":"8baaff2f-6fa7-4d65-b161-fc60f06aab23","Type":"ContainerStarted","Data":"200a3d1607f6afdf8f0e98c2f8769884320bf708b1b5a1c5baf2465219d5ac9f"} Feb 19 15:42:11 crc kubenswrapper[4810]: I0219 15:42:11.892731 4810 generic.go:334] "Generic (PLEG): container finished" podID="8baaff2f-6fa7-4d65-b161-fc60f06aab23" containerID="544d12af023b04fc865881934021ea362dce4ac24f0b5797bf6ae10b51163ac8" exitCode=0 Feb 19 15:42:11 crc kubenswrapper[4810]: I0219 15:42:11.892867 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kq8xx" event={"ID":"8baaff2f-6fa7-4d65-b161-fc60f06aab23","Type":"ContainerDied","Data":"544d12af023b04fc865881934021ea362dce4ac24f0b5797bf6ae10b51163ac8"} Feb 19 15:42:12 crc kubenswrapper[4810]: I0219 15:42:12.904544 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kq8xx" event={"ID":"8baaff2f-6fa7-4d65-b161-fc60f06aab23","Type":"ContainerStarted","Data":"c1460f24dfbc8a18564e64c9e3c106dcd6dc43419b56133dfc09067da642a36b"} Feb 19 15:42:12 crc kubenswrapper[4810]: I0219 15:42:12.929454 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kq8xx" podStartSLOduration=2.506465622 podStartE2EDuration="3.929421869s" podCreationTimestamp="2026-02-19 15:42:09 +0000 UTC" firstStartedPulling="2026-02-19 15:42:10.877043145 +0000 UTC m=+1960.359073279" lastFinishedPulling="2026-02-19 15:42:12.299999372 +0000 UTC m=+1961.782029526" observedRunningTime="2026-02-19 15:42:12.922074147 +0000 UTC m=+1962.404104291" watchObservedRunningTime="2026-02-19 15:42:12.929421869 +0000 UTC m=+1962.411451993" Feb 19 15:42:17 crc kubenswrapper[4810]: I0219 15:42:17.057578 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-hf7qg"] Feb 19 15:42:17 crc kubenswrapper[4810]: I0219 15:42:17.072399 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-hf7qg"] Feb 19 15:42:17 crc kubenswrapper[4810]: I0219 15:42:17.449058 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f834f671-3add-4bfc-8152-596d66e90f22" path="/var/lib/kubelet/pods/f834f671-3add-4bfc-8152-596d66e90f22/volumes" Feb 19 15:42:18 crc kubenswrapper[4810]: I0219 15:42:18.970349 4810 generic.go:334] "Generic (PLEG): container finished" podID="7e1f4472-242a-40a0-a574-9c3119fdb705" containerID="b27177a5822d1f5cc2c1b88795d4f16252ecccf22df3b242820cee1469f821d3" exitCode=0 Feb 19 15:42:18 crc 
kubenswrapper[4810]: I0219 15:42:18.970457 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cm25d" event={"ID":"7e1f4472-242a-40a0-a574-9c3119fdb705","Type":"ContainerDied","Data":"b27177a5822d1f5cc2c1b88795d4f16252ecccf22df3b242820cee1469f821d3"} Feb 19 15:42:19 crc kubenswrapper[4810]: I0219 15:42:19.684535 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kq8xx" Feb 19 15:42:19 crc kubenswrapper[4810]: I0219 15:42:19.684618 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kq8xx" Feb 19 15:42:19 crc kubenswrapper[4810]: I0219 15:42:19.756890 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kq8xx" Feb 19 15:42:20 crc kubenswrapper[4810]: I0219 15:42:20.064241 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kq8xx" Feb 19 15:42:20 crc kubenswrapper[4810]: I0219 15:42:20.114020 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kq8xx"] Feb 19 15:42:20 crc kubenswrapper[4810]: I0219 15:42:20.453439 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cm25d" Feb 19 15:42:20 crc kubenswrapper[4810]: I0219 15:42:20.606151 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e1f4472-242a-40a0-a574-9c3119fdb705-inventory\") pod \"7e1f4472-242a-40a0-a574-9c3119fdb705\" (UID: \"7e1f4472-242a-40a0-a574-9c3119fdb705\") " Feb 19 15:42:20 crc kubenswrapper[4810]: I0219 15:42:20.606591 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e1f4472-242a-40a0-a574-9c3119fdb705-ssh-key-openstack-edpm-ipam\") pod \"7e1f4472-242a-40a0-a574-9c3119fdb705\" (UID: \"7e1f4472-242a-40a0-a574-9c3119fdb705\") " Feb 19 15:42:20 crc kubenswrapper[4810]: I0219 15:42:20.606732 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vsg4\" (UniqueName: \"kubernetes.io/projected/7e1f4472-242a-40a0-a574-9c3119fdb705-kube-api-access-5vsg4\") pod \"7e1f4472-242a-40a0-a574-9c3119fdb705\" (UID: \"7e1f4472-242a-40a0-a574-9c3119fdb705\") " Feb 19 15:42:20 crc kubenswrapper[4810]: I0219 15:42:20.615475 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e1f4472-242a-40a0-a574-9c3119fdb705-kube-api-access-5vsg4" (OuterVolumeSpecName: "kube-api-access-5vsg4") pod "7e1f4472-242a-40a0-a574-9c3119fdb705" (UID: "7e1f4472-242a-40a0-a574-9c3119fdb705"). InnerVolumeSpecName "kube-api-access-5vsg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:42:20 crc kubenswrapper[4810]: I0219 15:42:20.634397 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e1f4472-242a-40a0-a574-9c3119fdb705-inventory" (OuterVolumeSpecName: "inventory") pod "7e1f4472-242a-40a0-a574-9c3119fdb705" (UID: "7e1f4472-242a-40a0-a574-9c3119fdb705"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:42:20 crc kubenswrapper[4810]: I0219 15:42:20.636187 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e1f4472-242a-40a0-a574-9c3119fdb705-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7e1f4472-242a-40a0-a574-9c3119fdb705" (UID: "7e1f4472-242a-40a0-a574-9c3119fdb705"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:42:20 crc kubenswrapper[4810]: I0219 15:42:20.709718 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vsg4\" (UniqueName: \"kubernetes.io/projected/7e1f4472-242a-40a0-a574-9c3119fdb705-kube-api-access-5vsg4\") on node \"crc\" DevicePath \"\"" Feb 19 15:42:20 crc kubenswrapper[4810]: I0219 15:42:20.709786 4810 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e1f4472-242a-40a0-a574-9c3119fdb705-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 15:42:20 crc kubenswrapper[4810]: I0219 15:42:20.709804 4810 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e1f4472-242a-40a0-a574-9c3119fdb705-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 15:42:21 crc kubenswrapper[4810]: I0219 15:42:21.000522 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cm25d" Feb 19 15:42:21 crc kubenswrapper[4810]: I0219 15:42:21.000511 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cm25d" event={"ID":"7e1f4472-242a-40a0-a574-9c3119fdb705","Type":"ContainerDied","Data":"b3ea6f11973dbfa2762088780fd16a66b97d9eb13ec3b90d8368472e7902ee01"} Feb 19 15:42:21 crc kubenswrapper[4810]: I0219 15:42:21.000613 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3ea6f11973dbfa2762088780fd16a66b97d9eb13ec3b90d8368472e7902ee01" Feb 19 15:42:21 crc kubenswrapper[4810]: I0219 15:42:21.102403 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-gw579"] Feb 19 15:42:21 crc kubenswrapper[4810]: E0219 15:42:21.102910 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e1f4472-242a-40a0-a574-9c3119fdb705" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 19 15:42:21 crc kubenswrapper[4810]: I0219 15:42:21.102931 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e1f4472-242a-40a0-a574-9c3119fdb705" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 19 15:42:21 crc kubenswrapper[4810]: I0219 15:42:21.103203 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e1f4472-242a-40a0-a574-9c3119fdb705" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 19 15:42:21 crc kubenswrapper[4810]: I0219 15:42:21.104069 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-gw579" Feb 19 15:42:21 crc kubenswrapper[4810]: I0219 15:42:21.110064 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 15:42:21 crc kubenswrapper[4810]: I0219 15:42:21.110279 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-djc4q" Feb 19 15:42:21 crc kubenswrapper[4810]: I0219 15:42:21.110307 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 15:42:21 crc kubenswrapper[4810]: I0219 15:42:21.111130 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 15:42:21 crc kubenswrapper[4810]: I0219 15:42:21.126035 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-gw579"] Feb 19 15:42:21 crc kubenswrapper[4810]: I0219 15:42:21.223441 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln6mc\" (UniqueName: \"kubernetes.io/projected/e3132ed5-687d-4cd1-a539-35c4766a27c1-kube-api-access-ln6mc\") pod \"ssh-known-hosts-edpm-deployment-gw579\" (UID: \"e3132ed5-687d-4cd1-a539-35c4766a27c1\") " pod="openstack/ssh-known-hosts-edpm-deployment-gw579" Feb 19 15:42:21 crc kubenswrapper[4810]: I0219 15:42:21.223973 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e3132ed5-687d-4cd1-a539-35c4766a27c1-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-gw579\" (UID: \"e3132ed5-687d-4cd1-a539-35c4766a27c1\") " pod="openstack/ssh-known-hosts-edpm-deployment-gw579" Feb 19 15:42:21 crc kubenswrapper[4810]: I0219 15:42:21.224146 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3132ed5-687d-4cd1-a539-35c4766a27c1-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-gw579\" (UID: \"e3132ed5-687d-4cd1-a539-35c4766a27c1\") " pod="openstack/ssh-known-hosts-edpm-deployment-gw579" Feb 19 15:42:21 crc kubenswrapper[4810]: I0219 15:42:21.326791 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln6mc\" (UniqueName: \"kubernetes.io/projected/e3132ed5-687d-4cd1-a539-35c4766a27c1-kube-api-access-ln6mc\") pod \"ssh-known-hosts-edpm-deployment-gw579\" (UID: \"e3132ed5-687d-4cd1-a539-35c4766a27c1\") " pod="openstack/ssh-known-hosts-edpm-deployment-gw579" Feb 19 15:42:21 crc kubenswrapper[4810]: I0219 15:42:21.326940 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e3132ed5-687d-4cd1-a539-35c4766a27c1-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-gw579\" (UID: \"e3132ed5-687d-4cd1-a539-35c4766a27c1\") " pod="openstack/ssh-known-hosts-edpm-deployment-gw579" Feb 19 15:42:21 crc kubenswrapper[4810]: I0219 15:42:21.327128 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3132ed5-687d-4cd1-a539-35c4766a27c1-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-gw579\" (UID: \"e3132ed5-687d-4cd1-a539-35c4766a27c1\") " pod="openstack/ssh-known-hosts-edpm-deployment-gw579" Feb 19 15:42:21 crc 
kubenswrapper[4810]: I0219 15:42:21.330749 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3132ed5-687d-4cd1-a539-35c4766a27c1-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-gw579\" (UID: \"e3132ed5-687d-4cd1-a539-35c4766a27c1\") " pod="openstack/ssh-known-hosts-edpm-deployment-gw579" Feb 19 15:42:21 crc kubenswrapper[4810]: I0219 15:42:21.340151 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e3132ed5-687d-4cd1-a539-35c4766a27c1-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-gw579\" (UID: \"e3132ed5-687d-4cd1-a539-35c4766a27c1\") " pod="openstack/ssh-known-hosts-edpm-deployment-gw579" Feb 19 15:42:21 crc kubenswrapper[4810]: I0219 15:42:21.342452 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln6mc\" (UniqueName: \"kubernetes.io/projected/e3132ed5-687d-4cd1-a539-35c4766a27c1-kube-api-access-ln6mc\") pod \"ssh-known-hosts-edpm-deployment-gw579\" (UID: \"e3132ed5-687d-4cd1-a539-35c4766a27c1\") " pod="openstack/ssh-known-hosts-edpm-deployment-gw579" Feb 19 15:42:21 crc kubenswrapper[4810]: I0219 15:42:21.437065 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-gw579" Feb 19 15:42:22 crc kubenswrapper[4810]: I0219 15:42:22.003538 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-gw579"] Feb 19 15:42:22 crc kubenswrapper[4810]: W0219 15:42:22.008663 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3132ed5_687d_4cd1_a539_35c4766a27c1.slice/crio-7d3818ebf7dc58fe335ccb3662cba75b973de23e57f82872a69e287e71bf42d9 WatchSource:0}: Error finding container 7d3818ebf7dc58fe335ccb3662cba75b973de23e57f82872a69e287e71bf42d9: Status 404 returned error can't find the container with id 7d3818ebf7dc58fe335ccb3662cba75b973de23e57f82872a69e287e71bf42d9 Feb 19 15:42:22 crc kubenswrapper[4810]: I0219 15:42:22.010451 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kq8xx" podUID="8baaff2f-6fa7-4d65-b161-fc60f06aab23" containerName="registry-server" containerID="cri-o://c1460f24dfbc8a18564e64c9e3c106dcd6dc43419b56133dfc09067da642a36b" gracePeriod=2 Feb 19 15:42:22 crc kubenswrapper[4810]: I0219 15:42:22.966409 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kq8xx" Feb 19 15:42:22 crc kubenswrapper[4810]: I0219 15:42:22.977717 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8baaff2f-6fa7-4d65-b161-fc60f06aab23-utilities\") pod \"8baaff2f-6fa7-4d65-b161-fc60f06aab23\" (UID: \"8baaff2f-6fa7-4d65-b161-fc60f06aab23\") " Feb 19 15:42:22 crc kubenswrapper[4810]: I0219 15:42:22.977971 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpfqz\" (UniqueName: \"kubernetes.io/projected/8baaff2f-6fa7-4d65-b161-fc60f06aab23-kube-api-access-qpfqz\") pod \"8baaff2f-6fa7-4d65-b161-fc60f06aab23\" (UID: \"8baaff2f-6fa7-4d65-b161-fc60f06aab23\") " Feb 19 15:42:22 crc kubenswrapper[4810]: I0219 15:42:22.978094 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8baaff2f-6fa7-4d65-b161-fc60f06aab23-catalog-content\") pod \"8baaff2f-6fa7-4d65-b161-fc60f06aab23\" (UID: \"8baaff2f-6fa7-4d65-b161-fc60f06aab23\") " Feb 19 15:42:22 crc kubenswrapper[4810]: I0219 15:42:22.978548 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8baaff2f-6fa7-4d65-b161-fc60f06aab23-utilities" (OuterVolumeSpecName: "utilities") pod "8baaff2f-6fa7-4d65-b161-fc60f06aab23" (UID: "8baaff2f-6fa7-4d65-b161-fc60f06aab23"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:42:22 crc kubenswrapper[4810]: I0219 15:42:22.978970 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8baaff2f-6fa7-4d65-b161-fc60f06aab23-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 15:42:22 crc kubenswrapper[4810]: I0219 15:42:22.986763 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8baaff2f-6fa7-4d65-b161-fc60f06aab23-kube-api-access-qpfqz" (OuterVolumeSpecName: "kube-api-access-qpfqz") pod "8baaff2f-6fa7-4d65-b161-fc60f06aab23" (UID: "8baaff2f-6fa7-4d65-b161-fc60f06aab23"). InnerVolumeSpecName "kube-api-access-qpfqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:42:23 crc kubenswrapper[4810]: I0219 15:42:23.020755 4810 generic.go:334] "Generic (PLEG): container finished" podID="8baaff2f-6fa7-4d65-b161-fc60f06aab23" containerID="c1460f24dfbc8a18564e64c9e3c106dcd6dc43419b56133dfc09067da642a36b" exitCode=0 Feb 19 15:42:23 crc kubenswrapper[4810]: I0219 15:42:23.020836 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kq8xx" event={"ID":"8baaff2f-6fa7-4d65-b161-fc60f06aab23","Type":"ContainerDied","Data":"c1460f24dfbc8a18564e64c9e3c106dcd6dc43419b56133dfc09067da642a36b"} Feb 19 15:42:23 crc kubenswrapper[4810]: I0219 15:42:23.020855 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kq8xx" Feb 19 15:42:23 crc kubenswrapper[4810]: I0219 15:42:23.020864 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kq8xx" event={"ID":"8baaff2f-6fa7-4d65-b161-fc60f06aab23","Type":"ContainerDied","Data":"200a3d1607f6afdf8f0e98c2f8769884320bf708b1b5a1c5baf2465219d5ac9f"} Feb 19 15:42:23 crc kubenswrapper[4810]: I0219 15:42:23.020877 4810 scope.go:117] "RemoveContainer" containerID="c1460f24dfbc8a18564e64c9e3c106dcd6dc43419b56133dfc09067da642a36b" Feb 19 15:42:23 crc kubenswrapper[4810]: I0219 15:42:23.025214 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-gw579" event={"ID":"e3132ed5-687d-4cd1-a539-35c4766a27c1","Type":"ContainerStarted","Data":"14c148a9427173b4d7525b859ee79d7ec5a4d987fa7589d147eead08832d7e45"} Feb 19 15:42:23 crc kubenswrapper[4810]: I0219 15:42:23.025255 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-gw579" event={"ID":"e3132ed5-687d-4cd1-a539-35c4766a27c1","Type":"ContainerStarted","Data":"7d3818ebf7dc58fe335ccb3662cba75b973de23e57f82872a69e287e71bf42d9"} Feb 19 15:42:23 crc kubenswrapper[4810]: I0219 15:42:23.038882 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8baaff2f-6fa7-4d65-b161-fc60f06aab23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8baaff2f-6fa7-4d65-b161-fc60f06aab23" (UID: "8baaff2f-6fa7-4d65-b161-fc60f06aab23"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:42:23 crc kubenswrapper[4810]: I0219 15:42:23.051441 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-gw579" podStartSLOduration=1.524778661 podStartE2EDuration="2.051415306s" podCreationTimestamp="2026-02-19 15:42:21 +0000 UTC" firstStartedPulling="2026-02-19 15:42:22.01522478 +0000 UTC m=+1971.497254924" lastFinishedPulling="2026-02-19 15:42:22.541861435 +0000 UTC m=+1972.023891569" observedRunningTime="2026-02-19 15:42:23.046430882 +0000 UTC m=+1972.528461046" watchObservedRunningTime="2026-02-19 15:42:23.051415306 +0000 UTC m=+1972.533445530" Feb 19 15:42:23 crc kubenswrapper[4810]: I0219 15:42:23.068430 4810 scope.go:117] "RemoveContainer" containerID="544d12af023b04fc865881934021ea362dce4ac24f0b5797bf6ae10b51163ac8" Feb 19 15:42:23 crc kubenswrapper[4810]: I0219 15:42:23.080314 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpfqz\" (UniqueName: \"kubernetes.io/projected/8baaff2f-6fa7-4d65-b161-fc60f06aab23-kube-api-access-qpfqz\") on node \"crc\" DevicePath \"\"" Feb 19 15:42:23 crc kubenswrapper[4810]: I0219 15:42:23.080486 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8baaff2f-6fa7-4d65-b161-fc60f06aab23-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 15:42:23 crc kubenswrapper[4810]: I0219 15:42:23.087500 4810 scope.go:117] "RemoveContainer" containerID="7119d702b43fa67da67f60fa699dc67bd8b6502948dc9469d587dcd095715132" Feb 19 15:42:23 crc kubenswrapper[4810]: I0219 15:42:23.106612 4810 scope.go:117] "RemoveContainer" containerID="c1460f24dfbc8a18564e64c9e3c106dcd6dc43419b56133dfc09067da642a36b" Feb 19 15:42:23 crc kubenswrapper[4810]: E0219 15:42:23.107001 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
Feb 19 15:42:23 crc kubenswrapper[4810]: I0219 15:42:23.107040 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1460f24dfbc8a18564e64c9e3c106dcd6dc43419b56133dfc09067da642a36b"} err="failed to get container status \"c1460f24dfbc8a18564e64c9e3c106dcd6dc43419b56133dfc09067da642a36b\": rpc error: code = NotFound desc = could not find container \"c1460f24dfbc8a18564e64c9e3c106dcd6dc43419b56133dfc09067da642a36b\": container with ID starting with c1460f24dfbc8a18564e64c9e3c106dcd6dc43419b56133dfc09067da642a36b not found: ID does not exist"
Feb 19 15:42:23 crc kubenswrapper[4810]: I0219 15:42:23.107064 4810 scope.go:117] "RemoveContainer" containerID="544d12af023b04fc865881934021ea362dce4ac24f0b5797bf6ae10b51163ac8"
Feb 19 15:42:23 crc kubenswrapper[4810]: E0219 15:42:23.107397 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"544d12af023b04fc865881934021ea362dce4ac24f0b5797bf6ae10b51163ac8\": container with ID starting with 544d12af023b04fc865881934021ea362dce4ac24f0b5797bf6ae10b51163ac8 not found: ID does not exist" containerID="544d12af023b04fc865881934021ea362dce4ac24f0b5797bf6ae10b51163ac8"
Feb 19 15:42:23 crc kubenswrapper[4810]: I0219 15:42:23.107443 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"544d12af023b04fc865881934021ea362dce4ac24f0b5797bf6ae10b51163ac8"} err="failed to get container status \"544d12af023b04fc865881934021ea362dce4ac24f0b5797bf6ae10b51163ac8\": rpc error: code = NotFound desc = could not find container \"544d12af023b04fc865881934021ea362dce4ac24f0b5797bf6ae10b51163ac8\": container with ID starting with 544d12af023b04fc865881934021ea362dce4ac24f0b5797bf6ae10b51163ac8 not found: ID does not exist"
Feb 19 15:42:23 crc kubenswrapper[4810]: I0219 15:42:23.107480 4810 scope.go:117] "RemoveContainer" containerID="7119d702b43fa67da67f60fa699dc67bd8b6502948dc9469d587dcd095715132"
Feb 19 15:42:23 crc kubenswrapper[4810]: E0219 15:42:23.107735 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7119d702b43fa67da67f60fa699dc67bd8b6502948dc9469d587dcd095715132\": container with ID starting with 7119d702b43fa67da67f60fa699dc67bd8b6502948dc9469d587dcd095715132 not found: ID does not exist" containerID="7119d702b43fa67da67f60fa699dc67bd8b6502948dc9469d587dcd095715132"
Feb 19 15:42:23 crc kubenswrapper[4810]: I0219 15:42:23.107760 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7119d702b43fa67da67f60fa699dc67bd8b6502948dc9469d587dcd095715132"} err="failed to get container status \"7119d702b43fa67da67f60fa699dc67bd8b6502948dc9469d587dcd095715132\": rpc error: code = NotFound desc = could not find container \"7119d702b43fa67da67f60fa699dc67bd8b6502948dc9469d587dcd095715132\": container with ID starting with 7119d702b43fa67da67f60fa699dc67bd8b6502948dc9469d587dcd095715132 not found: ID does not exist"
Feb 19 15:42:23 crc kubenswrapper[4810]: I0219 15:42:23.384695 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kq8xx"]
Feb 19 15:42:23 crc kubenswrapper[4810]: I0219 15:42:23.397757 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kq8xx"]
Feb 19 15:42:23 crc kubenswrapper[4810]: I0219 15:42:23.462866 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8baaff2f-6fa7-4d65-b161-fc60f06aab23" path="/var/lib/kubelet/pods/8baaff2f-6fa7-4d65-b161-fc60f06aab23/volumes"
Feb 19 15:42:30 crc kubenswrapper[4810]: I0219 15:42:30.109545 4810 generic.go:334] "Generic (PLEG): container finished" podID="e3132ed5-687d-4cd1-a539-35c4766a27c1" containerID="14c148a9427173b4d7525b859ee79d7ec5a4d987fa7589d147eead08832d7e45" exitCode=0
Feb 19 15:42:30 crc kubenswrapper[4810]: I0219 15:42:30.109681 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-gw579" event={"ID":"e3132ed5-687d-4cd1-a539-35c4766a27c1","Type":"ContainerDied","Data":"14c148a9427173b4d7525b859ee79d7ec5a4d987fa7589d147eead08832d7e45"}
Feb 19 15:42:31 crc kubenswrapper[4810]: I0219 15:42:31.669634 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-gw579"
Feb 19 15:42:31 crc kubenswrapper[4810]: I0219 15:42:31.774485 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e3132ed5-687d-4cd1-a539-35c4766a27c1-inventory-0\") pod \"e3132ed5-687d-4cd1-a539-35c4766a27c1\" (UID: \"e3132ed5-687d-4cd1-a539-35c4766a27c1\") "
Feb 19 15:42:31 crc kubenswrapper[4810]: I0219 15:42:31.775108 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ln6mc\" (UniqueName: \"kubernetes.io/projected/e3132ed5-687d-4cd1-a539-35c4766a27c1-kube-api-access-ln6mc\") pod \"e3132ed5-687d-4cd1-a539-35c4766a27c1\" (UID: \"e3132ed5-687d-4cd1-a539-35c4766a27c1\") "
Feb 19 15:42:31 crc kubenswrapper[4810]: I0219 15:42:31.775244 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3132ed5-687d-4cd1-a539-35c4766a27c1-ssh-key-openstack-edpm-ipam\") pod \"e3132ed5-687d-4cd1-a539-35c4766a27c1\" (UID: \"e3132ed5-687d-4cd1-a539-35c4766a27c1\") "
Feb 19 15:42:31 crc kubenswrapper[4810]: I0219 15:42:31.785320 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3132ed5-687d-4cd1-a539-35c4766a27c1-kube-api-access-ln6mc" (OuterVolumeSpecName: "kube-api-access-ln6mc") pod "e3132ed5-687d-4cd1-a539-35c4766a27c1" (UID: "e3132ed5-687d-4cd1-a539-35c4766a27c1"). InnerVolumeSpecName "kube-api-access-ln6mc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:42:31 crc kubenswrapper[4810]: I0219 15:42:31.815433 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3132ed5-687d-4cd1-a539-35c4766a27c1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e3132ed5-687d-4cd1-a539-35c4766a27c1" (UID: "e3132ed5-687d-4cd1-a539-35c4766a27c1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
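The RemoveContainer / "ContainerStatus from runtime service failed" pairs above are benign despite the E-level severity: the kubelet asks CRI-O about containers it has already deleted, the runtime answers with gRPC NotFound, and the kubelet logs the error and moves on, since for a delete "already gone" is the desired end state. A sketch of that idempotency pattern against a gRPC-style runtime call (the client function here is a stand-in, not kubelet's actual interface):

package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeContainer treats NotFound from the runtime as success: deleting
// a container that is already gone leaves the system in the desired state.
func removeContainer(remove func(id string) error, id string) error {
	err := remove(id)
	if err != nil && status.Code(err) != codes.NotFound {
		return fmt.Errorf("removing container %s: %w", id, err)
	}
	if err != nil {
		fmt.Printf("container %s already gone, treating as removed\n", id)
	}
	return nil
}

func main() {
	// Stand-in for the CRI call; always claims the container is missing.
	fake := func(id string) error {
		return status.Error(codes.NotFound, "could not find container "+id)
	}
	if err := removeContainer(fake, "c1460f24dfbc"); err != nil {
		panic(err)
	}
}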
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:42:31 crc kubenswrapper[4810]: I0219 15:42:31.836607 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3132ed5-687d-4cd1-a539-35c4766a27c1-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "e3132ed5-687d-4cd1-a539-35c4766a27c1" (UID: "e3132ed5-687d-4cd1-a539-35c4766a27c1"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:42:31 crc kubenswrapper[4810]: I0219 15:42:31.877386 4810 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e3132ed5-687d-4cd1-a539-35c4766a27c1-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 19 15:42:31 crc kubenswrapper[4810]: I0219 15:42:31.877435 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ln6mc\" (UniqueName: \"kubernetes.io/projected/e3132ed5-687d-4cd1-a539-35c4766a27c1-kube-api-access-ln6mc\") on node \"crc\" DevicePath \"\"" Feb 19 15:42:31 crc kubenswrapper[4810]: I0219 15:42:31.877459 4810 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3132ed5-687d-4cd1-a539-35c4766a27c1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.133043 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-gw579" event={"ID":"e3132ed5-687d-4cd1-a539-35c4766a27c1","Type":"ContainerDied","Data":"7d3818ebf7dc58fe335ccb3662cba75b973de23e57f82872a69e287e71bf42d9"} Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.133092 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d3818ebf7dc58fe335ccb3662cba75b973de23e57f82872a69e287e71bf42d9" Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.133092 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-gw579" Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.242744 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf56v"] Feb 19 15:42:32 crc kubenswrapper[4810]: E0219 15:42:32.243501 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8baaff2f-6fa7-4d65-b161-fc60f06aab23" containerName="registry-server" Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.243544 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8baaff2f-6fa7-4d65-b161-fc60f06aab23" containerName="registry-server" Feb 19 15:42:32 crc kubenswrapper[4810]: E0219 15:42:32.243592 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8baaff2f-6fa7-4d65-b161-fc60f06aab23" containerName="extract-content" Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.243612 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8baaff2f-6fa7-4d65-b161-fc60f06aab23" containerName="extract-content" Feb 19 15:42:32 crc kubenswrapper[4810]: E0219 15:42:32.243664 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3132ed5-687d-4cd1-a539-35c4766a27c1" containerName="ssh-known-hosts-edpm-deployment" Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.243684 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3132ed5-687d-4cd1-a539-35c4766a27c1" containerName="ssh-known-hosts-edpm-deployment" Feb 19 15:42:32 crc kubenswrapper[4810]: E0219 15:42:32.243727 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8baaff2f-6fa7-4d65-b161-fc60f06aab23" containerName="extract-utilities" Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.243744 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8baaff2f-6fa7-4d65-b161-fc60f06aab23" containerName="extract-utilities" Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.244214 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8baaff2f-6fa7-4d65-b161-fc60f06aab23" containerName="registry-server" Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.244273 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3132ed5-687d-4cd1-a539-35c4766a27c1" containerName="ssh-known-hosts-edpm-deployment" Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.245698 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf56v" Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.250509 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.250932 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.251114 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.251266 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-djc4q" Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.256372 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf56v"] Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.287034 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e77512a1-b460-4008-9e59-5b38f3e9f925-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf56v\" (UID: \"e77512a1-b460-4008-9e59-5b38f3e9f925\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf56v" Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.287137 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwsz8\" (UniqueName: \"kubernetes.io/projected/e77512a1-b460-4008-9e59-5b38f3e9f925-kube-api-access-cwsz8\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf56v\" (UID: \"e77512a1-b460-4008-9e59-5b38f3e9f925\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf56v" Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.287200 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e77512a1-b460-4008-9e59-5b38f3e9f925-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf56v\" (UID: \"e77512a1-b460-4008-9e59-5b38f3e9f925\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf56v" Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.389602 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e77512a1-b460-4008-9e59-5b38f3e9f925-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf56v\" (UID: \"e77512a1-b460-4008-9e59-5b38f3e9f925\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf56v" Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.390052 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e77512a1-b460-4008-9e59-5b38f3e9f925-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf56v\" (UID: \"e77512a1-b460-4008-9e59-5b38f3e9f925\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf56v" Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.390246 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwsz8\" (UniqueName: \"kubernetes.io/projected/e77512a1-b460-4008-9e59-5b38f3e9f925-kube-api-access-cwsz8\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-rf56v\" (UID: \"e77512a1-b460-4008-9e59-5b38f3e9f925\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf56v" Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.397013 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e77512a1-b460-4008-9e59-5b38f3e9f925-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf56v\" (UID: \"e77512a1-b460-4008-9e59-5b38f3e9f925\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf56v" Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.397601 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e77512a1-b460-4008-9e59-5b38f3e9f925-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf56v\" (UID: \"e77512a1-b460-4008-9e59-5b38f3e9f925\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf56v" Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.408004 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwsz8\" (UniqueName: \"kubernetes.io/projected/e77512a1-b460-4008-9e59-5b38f3e9f925-kube-api-access-cwsz8\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf56v\" (UID: \"e77512a1-b460-4008-9e59-5b38f3e9f925\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf56v" Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.572416 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf56v" Feb 19 15:42:32 crc kubenswrapper[4810]: W0219 15:42:32.969151 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode77512a1_b460_4008_9e59_5b38f3e9f925.slice/crio-e8cbae0924331351ea94389baefdade5c34ca28c724f53efa2a6641fd49957fb WatchSource:0}: Error finding container e8cbae0924331351ea94389baefdade5c34ca28c724f53efa2a6641fd49957fb: Status 404 returned error can't find the container with id e8cbae0924331351ea94389baefdade5c34ca28c724f53efa2a6641fd49957fb Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.969384 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf56v"] Feb 19 15:42:32 crc kubenswrapper[4810]: I0219 15:42:32.972674 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 15:42:33 crc kubenswrapper[4810]: I0219 15:42:33.142375 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf56v" event={"ID":"e77512a1-b460-4008-9e59-5b38f3e9f925","Type":"ContainerStarted","Data":"e8cbae0924331351ea94389baefdade5c34ca28c724f53efa2a6641fd49957fb"} Feb 19 15:42:34 crc kubenswrapper[4810]: I0219 15:42:34.160155 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf56v" event={"ID":"e77512a1-b460-4008-9e59-5b38f3e9f925","Type":"ContainerStarted","Data":"4ba117d816027c740774a62d5ef2e96fec877e6b49c00a3b6d56dcbd8475ed9f"} Feb 19 15:42:34 crc kubenswrapper[4810]: I0219 15:42:34.196601 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf56v" podStartSLOduration=1.804856859 podStartE2EDuration="2.196577804s" podCreationTimestamp="2026-02-19 15:42:32 +0000 UTC" 
firstStartedPulling="2026-02-19 15:42:32.972237957 +0000 UTC m=+1982.454268081" lastFinishedPulling="2026-02-19 15:42:33.363958902 +0000 UTC m=+1982.845989026" observedRunningTime="2026-02-19 15:42:34.187198691 +0000 UTC m=+1983.669228835" watchObservedRunningTime="2026-02-19 15:42:34.196577804 +0000 UTC m=+1983.678607928" Feb 19 15:42:42 crc kubenswrapper[4810]: I0219 15:42:42.245130 4810 generic.go:334] "Generic (PLEG): container finished" podID="e77512a1-b460-4008-9e59-5b38f3e9f925" containerID="4ba117d816027c740774a62d5ef2e96fec877e6b49c00a3b6d56dcbd8475ed9f" exitCode=0 Feb 19 15:42:42 crc kubenswrapper[4810]: I0219 15:42:42.245243 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf56v" event={"ID":"e77512a1-b460-4008-9e59-5b38f3e9f925","Type":"ContainerDied","Data":"4ba117d816027c740774a62d5ef2e96fec877e6b49c00a3b6d56dcbd8475ed9f"} Feb 19 15:42:42 crc kubenswrapper[4810]: I0219 15:42:42.340685 4810 scope.go:117] "RemoveContainer" containerID="eb023c9cd6a803467c474cff2a48a3d6536859d9cdbc3785ab4eb9814aa6c925" Feb 19 15:42:43 crc kubenswrapper[4810]: I0219 15:42:43.789387 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf56v" Feb 19 15:42:43 crc kubenswrapper[4810]: I0219 15:42:43.936798 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e77512a1-b460-4008-9e59-5b38f3e9f925-inventory\") pod \"e77512a1-b460-4008-9e59-5b38f3e9f925\" (UID: \"e77512a1-b460-4008-9e59-5b38f3e9f925\") " Feb 19 15:42:43 crc kubenswrapper[4810]: I0219 15:42:43.936959 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwsz8\" (UniqueName: \"kubernetes.io/projected/e77512a1-b460-4008-9e59-5b38f3e9f925-kube-api-access-cwsz8\") pod \"e77512a1-b460-4008-9e59-5b38f3e9f925\" (UID: \"e77512a1-b460-4008-9e59-5b38f3e9f925\") " Feb 19 15:42:43 crc kubenswrapper[4810]: I0219 15:42:43.937009 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e77512a1-b460-4008-9e59-5b38f3e9f925-ssh-key-openstack-edpm-ipam\") pod \"e77512a1-b460-4008-9e59-5b38f3e9f925\" (UID: \"e77512a1-b460-4008-9e59-5b38f3e9f925\") " Feb 19 15:42:43 crc kubenswrapper[4810]: I0219 15:42:43.943740 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e77512a1-b460-4008-9e59-5b38f3e9f925-kube-api-access-cwsz8" (OuterVolumeSpecName: "kube-api-access-cwsz8") pod "e77512a1-b460-4008-9e59-5b38f3e9f925" (UID: "e77512a1-b460-4008-9e59-5b38f3e9f925"). InnerVolumeSpecName "kube-api-access-cwsz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:42:43 crc kubenswrapper[4810]: I0219 15:42:43.965587 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e77512a1-b460-4008-9e59-5b38f3e9f925-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e77512a1-b460-4008-9e59-5b38f3e9f925" (UID: "e77512a1-b460-4008-9e59-5b38f3e9f925"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:42:43 crc kubenswrapper[4810]: I0219 15:42:43.970776 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e77512a1-b460-4008-9e59-5b38f3e9f925-inventory" (OuterVolumeSpecName: "inventory") pod "e77512a1-b460-4008-9e59-5b38f3e9f925" (UID: "e77512a1-b460-4008-9e59-5b38f3e9f925"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:42:44 crc kubenswrapper[4810]: I0219 15:42:44.040558 4810 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e77512a1-b460-4008-9e59-5b38f3e9f925-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 15:42:44 crc kubenswrapper[4810]: I0219 15:42:44.040609 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwsz8\" (UniqueName: \"kubernetes.io/projected/e77512a1-b460-4008-9e59-5b38f3e9f925-kube-api-access-cwsz8\") on node \"crc\" DevicePath \"\"" Feb 19 15:42:44 crc kubenswrapper[4810]: I0219 15:42:44.040631 4810 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e77512a1-b460-4008-9e59-5b38f3e9f925-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 15:42:44 crc kubenswrapper[4810]: I0219 15:42:44.264454 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf56v" event={"ID":"e77512a1-b460-4008-9e59-5b38f3e9f925","Type":"ContainerDied","Data":"e8cbae0924331351ea94389baefdade5c34ca28c724f53efa2a6641fd49957fb"} Feb 19 15:42:44 crc kubenswrapper[4810]: I0219 15:42:44.264489 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8cbae0924331351ea94389baefdade5c34ca28c724f53efa2a6641fd49957fb" Feb 19 15:42:44 crc kubenswrapper[4810]: I0219 15:42:44.264500 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf56v" Feb 19 15:42:44 crc kubenswrapper[4810]: I0219 15:42:44.411628 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ps669"] Feb 19 15:42:44 crc kubenswrapper[4810]: E0219 15:42:44.412409 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e77512a1-b460-4008-9e59-5b38f3e9f925" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 19 15:42:44 crc kubenswrapper[4810]: I0219 15:42:44.412472 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e77512a1-b460-4008-9e59-5b38f3e9f925" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 19 15:42:44 crc kubenswrapper[4810]: I0219 15:42:44.412849 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e77512a1-b460-4008-9e59-5b38f3e9f925" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 19 15:42:44 crc kubenswrapper[4810]: I0219 15:42:44.414228 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ps669" Feb 19 15:42:44 crc kubenswrapper[4810]: I0219 15:42:44.416550 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 15:42:44 crc kubenswrapper[4810]: I0219 15:42:44.416676 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-djc4q" Feb 19 15:42:44 crc kubenswrapper[4810]: I0219 15:42:44.417092 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 15:42:44 crc kubenswrapper[4810]: I0219 15:42:44.417181 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 15:42:44 crc kubenswrapper[4810]: I0219 15:42:44.425599 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ps669"] Feb 19 15:42:44 crc kubenswrapper[4810]: I0219 15:42:44.551545 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5r45\" (UniqueName: \"kubernetes.io/projected/69d67433-38d6-4368-a621-254a97b0c619-kube-api-access-q5r45\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ps669\" (UID: \"69d67433-38d6-4368-a621-254a97b0c619\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ps669" Feb 19 15:42:44 crc kubenswrapper[4810]: I0219 15:42:44.551601 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69d67433-38d6-4368-a621-254a97b0c619-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ps669\" (UID: \"69d67433-38d6-4368-a621-254a97b0c619\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ps669" Feb 19 15:42:44 crc kubenswrapper[4810]: I0219 15:42:44.551726 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/69d67433-38d6-4368-a621-254a97b0c619-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ps669\" (UID: \"69d67433-38d6-4368-a621-254a97b0c619\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ps669" Feb 19 15:42:44 crc kubenswrapper[4810]: I0219 15:42:44.654627 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5r45\" (UniqueName: \"kubernetes.io/projected/69d67433-38d6-4368-a621-254a97b0c619-kube-api-access-q5r45\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ps669\" (UID: \"69d67433-38d6-4368-a621-254a97b0c619\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ps669" Feb 19 15:42:44 crc kubenswrapper[4810]: I0219 15:42:44.654745 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69d67433-38d6-4368-a621-254a97b0c619-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ps669\" (UID: \"69d67433-38d6-4368-a621-254a97b0c619\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ps669" Feb 19 15:42:44 crc kubenswrapper[4810]: I0219 15:42:44.654979 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/69d67433-38d6-4368-a621-254a97b0c619-ssh-key-openstack-edpm-ipam\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-ps669\" (UID: \"69d67433-38d6-4368-a621-254a97b0c619\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ps669" Feb 19 15:42:44 crc kubenswrapper[4810]: I0219 15:42:44.658725 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/69d67433-38d6-4368-a621-254a97b0c619-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ps669\" (UID: \"69d67433-38d6-4368-a621-254a97b0c619\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ps669" Feb 19 15:42:44 crc kubenswrapper[4810]: I0219 15:42:44.659255 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69d67433-38d6-4368-a621-254a97b0c619-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ps669\" (UID: \"69d67433-38d6-4368-a621-254a97b0c619\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ps669" Feb 19 15:42:44 crc kubenswrapper[4810]: I0219 15:42:44.671856 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5r45\" (UniqueName: \"kubernetes.io/projected/69d67433-38d6-4368-a621-254a97b0c619-kube-api-access-q5r45\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ps669\" (UID: \"69d67433-38d6-4368-a621-254a97b0c619\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ps669" Feb 19 15:42:44 crc kubenswrapper[4810]: I0219 15:42:44.737241 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ps669" Feb 19 15:42:45 crc kubenswrapper[4810]: I0219 15:42:45.420192 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ps669"] Feb 19 15:42:46 crc kubenswrapper[4810]: I0219 15:42:46.285811 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ps669" event={"ID":"69d67433-38d6-4368-a621-254a97b0c619","Type":"ContainerStarted","Data":"bb2836be18d58e4e1b432c36d80b7fd905a535643a6510de9f3063d41bfc5c9d"} Feb 19 15:42:46 crc kubenswrapper[4810]: I0219 15:42:46.286162 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ps669" event={"ID":"69d67433-38d6-4368-a621-254a97b0c619","Type":"ContainerStarted","Data":"aa1393d62d799aebfca60fb4048ba6c87b3bd42bf6b288655c0b82b2157a23ec"} Feb 19 15:42:46 crc kubenswrapper[4810]: I0219 15:42:46.309983 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ps669" podStartSLOduration=1.8658423229999999 podStartE2EDuration="2.309963769s" podCreationTimestamp="2026-02-19 15:42:44 +0000 UTC" firstStartedPulling="2026-02-19 15:42:45.425296585 +0000 UTC m=+1994.907326709" lastFinishedPulling="2026-02-19 15:42:45.869418021 +0000 UTC m=+1995.351448155" observedRunningTime="2026-02-19 15:42:46.30194686 +0000 UTC m=+1995.783976994" watchObservedRunningTime="2026-02-19 15:42:46.309963769 +0000 UTC m=+1995.791993893" Feb 19 15:42:56 crc kubenswrapper[4810]: I0219 15:42:56.437919 4810 generic.go:334] "Generic (PLEG): container finished" podID="69d67433-38d6-4368-a621-254a97b0c619" containerID="bb2836be18d58e4e1b432c36d80b7fd905a535643a6510de9f3063d41bfc5c9d" exitCode=0 Feb 19 15:42:56 crc kubenswrapper[4810]: I0219 15:42:56.438533 4810 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ps669" event={"ID":"69d67433-38d6-4368-a621-254a97b0c619","Type":"ContainerDied","Data":"bb2836be18d58e4e1b432c36d80b7fd905a535643a6510de9f3063d41bfc5c9d"} Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.009019 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ps669" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.128594 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69d67433-38d6-4368-a621-254a97b0c619-inventory\") pod \"69d67433-38d6-4368-a621-254a97b0c619\" (UID: \"69d67433-38d6-4368-a621-254a97b0c619\") " Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.128901 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/69d67433-38d6-4368-a621-254a97b0c619-ssh-key-openstack-edpm-ipam\") pod \"69d67433-38d6-4368-a621-254a97b0c619\" (UID: \"69d67433-38d6-4368-a621-254a97b0c619\") " Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.129016 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5r45\" (UniqueName: \"kubernetes.io/projected/69d67433-38d6-4368-a621-254a97b0c619-kube-api-access-q5r45\") pod \"69d67433-38d6-4368-a621-254a97b0c619\" (UID: \"69d67433-38d6-4368-a621-254a97b0c619\") " Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.136157 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69d67433-38d6-4368-a621-254a97b0c619-kube-api-access-q5r45" (OuterVolumeSpecName: "kube-api-access-q5r45") pod "69d67433-38d6-4368-a621-254a97b0c619" (UID: "69d67433-38d6-4368-a621-254a97b0c619"). InnerVolumeSpecName "kube-api-access-q5r45". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.155621 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69d67433-38d6-4368-a621-254a97b0c619-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "69d67433-38d6-4368-a621-254a97b0c619" (UID: "69d67433-38d6-4368-a621-254a97b0c619"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.175190 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69d67433-38d6-4368-a621-254a97b0c619-inventory" (OuterVolumeSpecName: "inventory") pod "69d67433-38d6-4368-a621-254a97b0c619" (UID: "69d67433-38d6-4368-a621-254a97b0c619"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.232052 4810 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/69d67433-38d6-4368-a621-254a97b0c619-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.232106 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5r45\" (UniqueName: \"kubernetes.io/projected/69d67433-38d6-4368-a621-254a97b0c619-kube-api-access-q5r45\") on node \"crc\" DevicePath \"\"" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.232121 4810 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69d67433-38d6-4368-a621-254a97b0c619-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.459946 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ps669" event={"ID":"69d67433-38d6-4368-a621-254a97b0c619","Type":"ContainerDied","Data":"aa1393d62d799aebfca60fb4048ba6c87b3bd42bf6b288655c0b82b2157a23ec"} Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.460225 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa1393d62d799aebfca60fb4048ba6c87b3bd42bf6b288655c0b82b2157a23ec" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.460058 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ps669" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.576877 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2"] Feb 19 15:42:58 crc kubenswrapper[4810]: E0219 15:42:58.577559 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69d67433-38d6-4368-a621-254a97b0c619" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.577639 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="69d67433-38d6-4368-a621-254a97b0c619" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.577870 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="69d67433-38d6-4368-a621-254a97b0c619" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.578626 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.581513 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.581551 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.581639 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.582582 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.583561 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.583568 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.587344 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-djc4q" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.587739 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.594577 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2"] Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.741265 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.741312 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.741369 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzrf6\" (UniqueName: \"kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-kube-api-access-mzrf6\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.741398 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.741429 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.741470 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.741496 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.741527 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.741563 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.741607 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.741628 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.741648 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.741665 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.741735 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.843465 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.843539 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.843595 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.843621 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.843640 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.843658 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.843694 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.843731 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.843749 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.843779 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzrf6\" (UniqueName: \"kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-kube-api-access-mzrf6\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.843797 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.843817 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.843852 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.843879 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.848390 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.848797 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.849924 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.850721 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.851494 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: 
\"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.851573 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.851842 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.855090 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.856547 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.856855 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.857640 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.859002 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.866385 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.872030 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzrf6\" (UniqueName: \"kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-kube-api-access-mzrf6\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:58 crc kubenswrapper[4810]: I0219 15:42:58.897671 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:42:59 crc kubenswrapper[4810]: W0219 15:42:59.509233 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31bd8fe5_f0b6_4463_a545_bdeb0c33b182.slice/crio-9697ec4be9aa47cb04a3a0aed3c914d6f0a0f8b3c8892a8798066b5a83ea1d92 WatchSource:0}: Error finding container 9697ec4be9aa47cb04a3a0aed3c914d6f0a0f8b3c8892a8798066b5a83ea1d92: Status 404 returned error can't find the container with id 9697ec4be9aa47cb04a3a0aed3c914d6f0a0f8b3c8892a8798066b5a83ea1d92 Feb 19 15:42:59 crc kubenswrapper[4810]: I0219 15:42:59.523548 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2"] Feb 19 15:43:00 crc kubenswrapper[4810]: I0219 15:43:00.488704 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" event={"ID":"31bd8fe5-f0b6-4463-a545-bdeb0c33b182","Type":"ContainerStarted","Data":"29281feef62bd69d0830d82f4dc58d875d408df7a070860546c94eda4eba5135"} Feb 19 15:43:00 crc kubenswrapper[4810]: I0219 15:43:00.489289 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" event={"ID":"31bd8fe5-f0b6-4463-a545-bdeb0c33b182","Type":"ContainerStarted","Data":"9697ec4be9aa47cb04a3a0aed3c914d6f0a0f8b3c8892a8798066b5a83ea1d92"} Feb 19 15:43:00 crc kubenswrapper[4810]: I0219 15:43:00.508248 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" podStartSLOduration=2.112100963 podStartE2EDuration="2.508230247s" podCreationTimestamp="2026-02-19 15:42:58 +0000 UTC" firstStartedPulling="2026-02-19 15:42:59.512765133 +0000 UTC m=+2008.994795267" lastFinishedPulling="2026-02-19 15:42:59.908894387 +0000 UTC m=+2009.390924551" observedRunningTime="2026-02-19 15:43:00.50594419 +0000 UTC m=+2009.987974314" watchObservedRunningTime="2026-02-19 15:43:00.508230247 +0000 UTC m=+2009.990260371" Feb 19 15:43:19 crc kubenswrapper[4810]: I0219 15:43:19.537121 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:43:19 crc kubenswrapper[4810]: I0219 15:43:19.538876 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" 
podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:43:37 crc kubenswrapper[4810]: I0219 15:43:37.921378 4810 generic.go:334] "Generic (PLEG): container finished" podID="31bd8fe5-f0b6-4463-a545-bdeb0c33b182" containerID="29281feef62bd69d0830d82f4dc58d875d408df7a070860546c94eda4eba5135" exitCode=0 Feb 19 15:43:37 crc kubenswrapper[4810]: I0219 15:43:37.921415 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" event={"ID":"31bd8fe5-f0b6-4463-a545-bdeb0c33b182","Type":"ContainerDied","Data":"29281feef62bd69d0830d82f4dc58d875d408df7a070860546c94eda4eba5135"} Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.447816 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.528768 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-openstack-edpm-ipam-ovn-default-certs-0\") pod \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.528886 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-ssh-key-openstack-edpm-ipam\") pod \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.528927 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-telemetry-combined-ca-bundle\") pod \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.528962 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-repo-setup-combined-ca-bundle\") pod \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.528992 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-nova-combined-ca-bundle\") pod \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.529185 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-libvirt-combined-ca-bundle\") pod \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.529257 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.529318 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-ovn-combined-ca-bundle\") pod \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.529380 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.529436 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-bootstrap-combined-ca-bundle\") pod \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.529508 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzrf6\" (UniqueName: \"kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-kube-api-access-mzrf6\") pod \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.529545 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-neutron-metadata-combined-ca-bundle\") pod \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.529616 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.529662 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-inventory\") pod \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\" (UID: \"31bd8fe5-f0b6-4463-a545-bdeb0c33b182\") " Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.535023 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "31bd8fe5-f0b6-4463-a545-bdeb0c33b182" (UID: "31bd8fe5-f0b6-4463-a545-bdeb0c33b182"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.537247 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "31bd8fe5-f0b6-4463-a545-bdeb0c33b182" (UID: "31bd8fe5-f0b6-4463-a545-bdeb0c33b182"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.537768 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "31bd8fe5-f0b6-4463-a545-bdeb0c33b182" (UID: "31bd8fe5-f0b6-4463-a545-bdeb0c33b182"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.537991 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-kube-api-access-mzrf6" (OuterVolumeSpecName: "kube-api-access-mzrf6") pod "31bd8fe5-f0b6-4463-a545-bdeb0c33b182" (UID: "31bd8fe5-f0b6-4463-a545-bdeb0c33b182"). InnerVolumeSpecName "kube-api-access-mzrf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.538300 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "31bd8fe5-f0b6-4463-a545-bdeb0c33b182" (UID: "31bd8fe5-f0b6-4463-a545-bdeb0c33b182"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.538399 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "31bd8fe5-f0b6-4463-a545-bdeb0c33b182" (UID: "31bd8fe5-f0b6-4463-a545-bdeb0c33b182"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.539160 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "31bd8fe5-f0b6-4463-a545-bdeb0c33b182" (UID: "31bd8fe5-f0b6-4463-a545-bdeb0c33b182"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.539535 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "31bd8fe5-f0b6-4463-a545-bdeb0c33b182" (UID: "31bd8fe5-f0b6-4463-a545-bdeb0c33b182"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.540570 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "31bd8fe5-f0b6-4463-a545-bdeb0c33b182" (UID: "31bd8fe5-f0b6-4463-a545-bdeb0c33b182"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.541023 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "31bd8fe5-f0b6-4463-a545-bdeb0c33b182" (UID: "31bd8fe5-f0b6-4463-a545-bdeb0c33b182"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.543557 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "31bd8fe5-f0b6-4463-a545-bdeb0c33b182" (UID: "31bd8fe5-f0b6-4463-a545-bdeb0c33b182"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.551886 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "31bd8fe5-f0b6-4463-a545-bdeb0c33b182" (UID: "31bd8fe5-f0b6-4463-a545-bdeb0c33b182"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.572055 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-inventory" (OuterVolumeSpecName: "inventory") pod "31bd8fe5-f0b6-4463-a545-bdeb0c33b182" (UID: "31bd8fe5-f0b6-4463-a545-bdeb0c33b182"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.592387 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "31bd8fe5-f0b6-4463-a545-bdeb0c33b182" (UID: "31bd8fe5-f0b6-4463-a545-bdeb0c33b182"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.632057 4810 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.632100 4810 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.632115 4810 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.632132 4810 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.632144 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzrf6\" (UniqueName: \"kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-kube-api-access-mzrf6\") on node \"crc\" DevicePath \"\"" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.632156 4810 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.632170 4810 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.632183 4810 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.632194 4810 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.632207 4810 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.632219 4810 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.632231 4810 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.632242 4810 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.632256 4810 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bd8fe5-f0b6-4463-a545-bdeb0c33b182-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.941803 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" event={"ID":"31bd8fe5-f0b6-4463-a545-bdeb0c33b182","Type":"ContainerDied","Data":"9697ec4be9aa47cb04a3a0aed3c914d6f0a0f8b3c8892a8798066b5a83ea1d92"} Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.941852 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9697ec4be9aa47cb04a3a0aed3c914d6f0a0f8b3c8892a8798066b5a83ea1d92" Feb 19 15:43:39 crc kubenswrapper[4810]: I0219 15:43:39.941874 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2" Feb 19 15:43:40 crc kubenswrapper[4810]: I0219 15:43:40.140988 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-5gjxx"] Feb 19 15:43:40 crc kubenswrapper[4810]: E0219 15:43:40.141662 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31bd8fe5-f0b6-4463-a545-bdeb0c33b182" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 19 15:43:40 crc kubenswrapper[4810]: I0219 15:43:40.141742 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="31bd8fe5-f0b6-4463-a545-bdeb0c33b182" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 19 15:43:40 crc kubenswrapper[4810]: I0219 15:43:40.142028 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="31bd8fe5-f0b6-4463-a545-bdeb0c33b182" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 19 15:43:40 crc kubenswrapper[4810]: I0219 15:43:40.142726 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5gjxx" Feb 19 15:43:40 crc kubenswrapper[4810]: I0219 15:43:40.146039 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 15:43:40 crc kubenswrapper[4810]: I0219 15:43:40.146134 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 15:43:40 crc kubenswrapper[4810]: I0219 15:43:40.146428 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 19 15:43:40 crc kubenswrapper[4810]: I0219 15:43:40.146505 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 15:43:40 crc kubenswrapper[4810]: I0219 15:43:40.146556 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-djc4q" Feb 19 15:43:40 crc kubenswrapper[4810]: I0219 15:43:40.169941 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-5gjxx"] Feb 19 15:43:40 crc kubenswrapper[4810]: I0219 15:43:40.243201 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4defb710-c07f-4e63-9baf-45f51085abdc-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5gjxx\" (UID: \"4defb710-c07f-4e63-9baf-45f51085abdc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5gjxx" Feb 19 15:43:40 crc kubenswrapper[4810]: I0219 15:43:40.243270 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4defb710-c07f-4e63-9baf-45f51085abdc-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5gjxx\" (UID: \"4defb710-c07f-4e63-9baf-45f51085abdc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5gjxx" Feb 19 15:43:40 crc kubenswrapper[4810]: I0219 15:43:40.243300 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcrsm\" (UniqueName: \"kubernetes.io/projected/4defb710-c07f-4e63-9baf-45f51085abdc-kube-api-access-mcrsm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5gjxx\" (UID: \"4defb710-c07f-4e63-9baf-45f51085abdc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5gjxx" Feb 19 15:43:40 crc kubenswrapper[4810]: I0219 15:43:40.243459 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4defb710-c07f-4e63-9baf-45f51085abdc-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5gjxx\" (UID: \"4defb710-c07f-4e63-9baf-45f51085abdc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5gjxx" Feb 19 15:43:40 crc kubenswrapper[4810]: I0219 15:43:40.243510 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4defb710-c07f-4e63-9baf-45f51085abdc-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5gjxx\" (UID: \"4defb710-c07f-4e63-9baf-45f51085abdc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5gjxx" Feb 19 15:43:40 crc kubenswrapper[4810]: I0219 15:43:40.345805 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4defb710-c07f-4e63-9baf-45f51085abdc-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5gjxx\" (UID: \"4defb710-c07f-4e63-9baf-45f51085abdc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5gjxx" Feb 19 15:43:40 crc kubenswrapper[4810]: I0219 15:43:40.346104 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4defb710-c07f-4e63-9baf-45f51085abdc-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5gjxx\" (UID: \"4defb710-c07f-4e63-9baf-45f51085abdc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5gjxx" Feb 19 15:43:40 crc kubenswrapper[4810]: I0219 15:43:40.346165 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcrsm\" (UniqueName: \"kubernetes.io/projected/4defb710-c07f-4e63-9baf-45f51085abdc-kube-api-access-mcrsm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5gjxx\" (UID: \"4defb710-c07f-4e63-9baf-45f51085abdc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5gjxx" Feb 19 15:43:40 crc kubenswrapper[4810]: I0219 15:43:40.346245 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4defb710-c07f-4e63-9baf-45f51085abdc-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5gjxx\" (UID: \"4defb710-c07f-4e63-9baf-45f51085abdc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5gjxx" Feb 19 15:43:40 crc kubenswrapper[4810]: I0219 15:43:40.346318 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4defb710-c07f-4e63-9baf-45f51085abdc-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5gjxx\" (UID: \"4defb710-c07f-4e63-9baf-45f51085abdc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5gjxx" Feb 19 15:43:40 crc kubenswrapper[4810]: I0219 15:43:40.348106 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4defb710-c07f-4e63-9baf-45f51085abdc-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5gjxx\" (UID: \"4defb710-c07f-4e63-9baf-45f51085abdc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5gjxx" Feb 19 15:43:40 crc kubenswrapper[4810]: I0219 15:43:40.356610 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4defb710-c07f-4e63-9baf-45f51085abdc-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5gjxx\" (UID: \"4defb710-c07f-4e63-9baf-45f51085abdc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5gjxx" Feb 19 15:43:40 crc kubenswrapper[4810]: I0219 15:43:40.357535 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4defb710-c07f-4e63-9baf-45f51085abdc-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5gjxx\" (UID: \"4defb710-c07f-4e63-9baf-45f51085abdc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5gjxx" Feb 19 15:43:40 crc kubenswrapper[4810]: I0219 15:43:40.358999 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/4defb710-c07f-4e63-9baf-45f51085abdc-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5gjxx\" (UID: \"4defb710-c07f-4e63-9baf-45f51085abdc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5gjxx" Feb 19 15:43:40 crc kubenswrapper[4810]: I0219 15:43:40.365443 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcrsm\" (UniqueName: \"kubernetes.io/projected/4defb710-c07f-4e63-9baf-45f51085abdc-kube-api-access-mcrsm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5gjxx\" (UID: \"4defb710-c07f-4e63-9baf-45f51085abdc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5gjxx" Feb 19 15:43:40 crc kubenswrapper[4810]: I0219 15:43:40.470675 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5gjxx" Feb 19 15:43:41 crc kubenswrapper[4810]: I0219 15:43:41.052057 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-5gjxx"] Feb 19 15:43:41 crc kubenswrapper[4810]: I0219 15:43:41.981722 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5gjxx" event={"ID":"4defb710-c07f-4e63-9baf-45f51085abdc","Type":"ContainerStarted","Data":"c9bd171f35c2431c659b24fe1892d82835d93f77adac91954d4f80c34ebd5311"} Feb 19 15:43:41 crc kubenswrapper[4810]: I0219 15:43:41.983220 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5gjxx" event={"ID":"4defb710-c07f-4e63-9baf-45f51085abdc","Type":"ContainerStarted","Data":"3bf7c888bacb4ee33c608237a690edff99f26fd352fd8968048d21acb796d0f0"} Feb 19 15:43:42 crc kubenswrapper[4810]: I0219 15:43:42.004703 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5gjxx" podStartSLOduration=1.54626412 podStartE2EDuration="2.004677371s" podCreationTimestamp="2026-02-19 15:43:40 +0000 UTC" firstStartedPulling="2026-02-19 15:43:41.057627639 +0000 UTC m=+2050.539657783" lastFinishedPulling="2026-02-19 15:43:41.51604087 +0000 UTC m=+2050.998071034" observedRunningTime="2026-02-19 15:43:42.001725608 +0000 UTC m=+2051.483755762" watchObservedRunningTime="2026-02-19 15:43:42.004677371 +0000 UTC m=+2051.486707535" Feb 19 15:43:49 crc kubenswrapper[4810]: I0219 15:43:49.538415 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:43:49 crc kubenswrapper[4810]: I0219 15:43:49.539098 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:44:19 crc kubenswrapper[4810]: I0219 15:44:19.537508 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:44:19 crc kubenswrapper[4810]: I0219 15:44:19.538186 4810 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:44:19 crc kubenswrapper[4810]: I0219 15:44:19.538270 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 15:44:19 crc kubenswrapper[4810]: I0219 15:44:19.539521 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4e3ef6a26491a979d9b15a4e163fa3567692b3b0eef18273908461c8a7758364"} pod="openshift-machine-config-operator/machine-config-daemon-t499d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 15:44:19 crc kubenswrapper[4810]: I0219 15:44:19.539611 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" containerID="cri-o://4e3ef6a26491a979d9b15a4e163fa3567692b3b0eef18273908461c8a7758364" gracePeriod=600 Feb 19 15:44:20 crc kubenswrapper[4810]: I0219 15:44:20.393861 4810 generic.go:334] "Generic (PLEG): container finished" podID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerID="4e3ef6a26491a979d9b15a4e163fa3567692b3b0eef18273908461c8a7758364" exitCode=0 Feb 19 15:44:20 crc kubenswrapper[4810]: I0219 15:44:20.393939 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerDied","Data":"4e3ef6a26491a979d9b15a4e163fa3567692b3b0eef18273908461c8a7758364"} Feb 19 15:44:20 crc kubenswrapper[4810]: I0219 15:44:20.394517 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerStarted","Data":"75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f"} Feb 19 15:44:20 crc kubenswrapper[4810]: I0219 15:44:20.394540 4810 scope.go:117] "RemoveContainer" containerID="2ce40b521710718c7ba6b5fc71023bb5882beb271696b0616786a7ce052f1e2a" Feb 19 15:44:52 crc kubenswrapper[4810]: E0219 15:44:52.015951 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4defb710_c07f_4e63_9baf_45f51085abdc.slice/crio-conmon-c9bd171f35c2431c659b24fe1892d82835d93f77adac91954d4f80c34ebd5311.scope\": RecentStats: unable to find data in memory cache]" Feb 19 15:44:52 crc kubenswrapper[4810]: I0219 15:44:52.764850 4810 generic.go:334] "Generic (PLEG): container finished" podID="4defb710-c07f-4e63-9baf-45f51085abdc" containerID="c9bd171f35c2431c659b24fe1892d82835d93f77adac91954d4f80c34ebd5311" exitCode=0 Feb 19 15:44:52 crc kubenswrapper[4810]: I0219 15:44:52.764991 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5gjxx" event={"ID":"4defb710-c07f-4e63-9baf-45f51085abdc","Type":"ContainerDied","Data":"c9bd171f35c2431c659b24fe1892d82835d93f77adac91954d4f80c34ebd5311"} Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.353735 4810 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5gjxx" Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.456360 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4defb710-c07f-4e63-9baf-45f51085abdc-inventory\") pod \"4defb710-c07f-4e63-9baf-45f51085abdc\" (UID: \"4defb710-c07f-4e63-9baf-45f51085abdc\") " Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.456429 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4defb710-c07f-4e63-9baf-45f51085abdc-ovn-combined-ca-bundle\") pod \"4defb710-c07f-4e63-9baf-45f51085abdc\" (UID: \"4defb710-c07f-4e63-9baf-45f51085abdc\") " Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.456545 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4defb710-c07f-4e63-9baf-45f51085abdc-ssh-key-openstack-edpm-ipam\") pod \"4defb710-c07f-4e63-9baf-45f51085abdc\" (UID: \"4defb710-c07f-4e63-9baf-45f51085abdc\") " Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.456629 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4defb710-c07f-4e63-9baf-45f51085abdc-ovncontroller-config-0\") pod \"4defb710-c07f-4e63-9baf-45f51085abdc\" (UID: \"4defb710-c07f-4e63-9baf-45f51085abdc\") " Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.456840 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcrsm\" (UniqueName: \"kubernetes.io/projected/4defb710-c07f-4e63-9baf-45f51085abdc-kube-api-access-mcrsm\") pod \"4defb710-c07f-4e63-9baf-45f51085abdc\" (UID: \"4defb710-c07f-4e63-9baf-45f51085abdc\") " Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.464193 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4defb710-c07f-4e63-9baf-45f51085abdc-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "4defb710-c07f-4e63-9baf-45f51085abdc" (UID: "4defb710-c07f-4e63-9baf-45f51085abdc"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.468595 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4defb710-c07f-4e63-9baf-45f51085abdc-kube-api-access-mcrsm" (OuterVolumeSpecName: "kube-api-access-mcrsm") pod "4defb710-c07f-4e63-9baf-45f51085abdc" (UID: "4defb710-c07f-4e63-9baf-45f51085abdc"). InnerVolumeSpecName "kube-api-access-mcrsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.494178 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4defb710-c07f-4e63-9baf-45f51085abdc-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4defb710-c07f-4e63-9baf-45f51085abdc" (UID: "4defb710-c07f-4e63-9baf-45f51085abdc"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.512289 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4defb710-c07f-4e63-9baf-45f51085abdc-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "4defb710-c07f-4e63-9baf-45f51085abdc" (UID: "4defb710-c07f-4e63-9baf-45f51085abdc"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.514787 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4defb710-c07f-4e63-9baf-45f51085abdc-inventory" (OuterVolumeSpecName: "inventory") pod "4defb710-c07f-4e63-9baf-45f51085abdc" (UID: "4defb710-c07f-4e63-9baf-45f51085abdc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.559990 4810 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4defb710-c07f-4e63-9baf-45f51085abdc-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.560043 4810 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4defb710-c07f-4e63-9baf-45f51085abdc-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.560065 4810 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4defb710-c07f-4e63-9baf-45f51085abdc-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.560086 4810 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4defb710-c07f-4e63-9baf-45f51085abdc-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.560104 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcrsm\" (UniqueName: \"kubernetes.io/projected/4defb710-c07f-4e63-9baf-45f51085abdc-kube-api-access-mcrsm\") on node \"crc\" DevicePath \"\"" Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.790152 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5gjxx" event={"ID":"4defb710-c07f-4e63-9baf-45f51085abdc","Type":"ContainerDied","Data":"3bf7c888bacb4ee33c608237a690edff99f26fd352fd8968048d21acb796d0f0"} Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.790569 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bf7c888bacb4ee33c608237a690edff99f26fd352fd8968048d21acb796d0f0" Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.790284 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5gjxx"
Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.939301 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs"]
Feb 19 15:44:54 crc kubenswrapper[4810]: E0219 15:44:54.939805 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4defb710-c07f-4e63-9baf-45f51085abdc" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.939829 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="4defb710-c07f-4e63-9baf-45f51085abdc" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.940097 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="4defb710-c07f-4e63-9baf-45f51085abdc" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.940949 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs"
Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.945930 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.946241 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.949245 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.952415 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-djc4q"
Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.952828 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.953159 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 19 15:44:54 crc kubenswrapper[4810]: I0219 15:44:54.965724 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs"]
Feb 19 15:44:55 crc kubenswrapper[4810]: I0219 15:44:55.076643 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs\" (UID: \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs"
Feb 19 15:44:55 crc kubenswrapper[4810]: I0219 15:44:55.076739 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkgqg\" (UniqueName: \"kubernetes.io/projected/6650a3db-fdc1-4342-b8a8-cb91376e75c5-kube-api-access-zkgqg\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs\" (UID: \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs"
Feb 19 15:44:55 crc kubenswrapper[4810]: I0219 15:44:55.077018 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs\" (UID: \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs"
Feb 19 15:44:55 crc kubenswrapper[4810]: I0219 15:44:55.077105 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs\" (UID: \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs"
Feb 19 15:44:55 crc kubenswrapper[4810]: I0219 15:44:55.077199 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs\" (UID: \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs"
Feb 19 15:44:55 crc kubenswrapper[4810]: I0219 15:44:55.077403 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs\" (UID: \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs"
Feb 19 15:44:55 crc kubenswrapper[4810]: I0219 15:44:55.179813 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs\" (UID: \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs"
Feb 19 15:44:55 crc kubenswrapper[4810]: I0219 15:44:55.179964 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkgqg\" (UniqueName: \"kubernetes.io/projected/6650a3db-fdc1-4342-b8a8-cb91376e75c5-kube-api-access-zkgqg\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs\" (UID: \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs"
Feb 19 15:44:55 crc kubenswrapper[4810]: I0219 15:44:55.180044 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs\" (UID: \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs"
Feb 19 15:44:55 crc kubenswrapper[4810]: I0219 15:44:55.180091 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs\" (UID: \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs"
Feb 19 15:44:55 crc kubenswrapper[4810]: I0219 15:44:55.180151 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs\" (UID: \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs"
Feb 19 15:44:55 crc kubenswrapper[4810]: I0219 15:44:55.181070 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs\" (UID: \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs"
Feb 19 15:44:55 crc kubenswrapper[4810]: I0219 15:44:55.185551 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs\" (UID: \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs"
Feb 19 15:44:55 crc kubenswrapper[4810]: I0219 15:44:55.185643 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs\" (UID: \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs"
Feb 19 15:44:55 crc kubenswrapper[4810]: I0219 15:44:55.187725 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs\" (UID: \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs"
Feb 19 15:44:55 crc kubenswrapper[4810]: I0219 15:44:55.188831 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs\" (UID: \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs"
Feb 19 15:44:55 crc kubenswrapper[4810]: I0219 15:44:55.189848 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs\" (UID: \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs"
Feb 19 15:44:55 crc kubenswrapper[4810]: I0219 15:44:55.201006 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkgqg\" (UniqueName: \"kubernetes.io/projected/6650a3db-fdc1-4342-b8a8-cb91376e75c5-kube-api-access-zkgqg\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs\" (UID: \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs"
Feb 19 15:44:55 crc kubenswrapper[4810]: I0219 15:44:55.275163 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs"
Feb 19 15:44:55 crc kubenswrapper[4810]: I0219 15:44:55.872072 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs"]
Feb 19 15:44:56 crc kubenswrapper[4810]: I0219 15:44:56.813507 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs" event={"ID":"6650a3db-fdc1-4342-b8a8-cb91376e75c5","Type":"ContainerStarted","Data":"0bfb481da1e582a9833bd1505ab13860f9c856ec5cedc6c8ae5ad71bbbe2c772"}
Feb 19 15:44:56 crc kubenswrapper[4810]: I0219 15:44:56.814076 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs" event={"ID":"6650a3db-fdc1-4342-b8a8-cb91376e75c5","Type":"ContainerStarted","Data":"8d2f457eb3a2c22efd1c5f686fd81de59f3738fe958c14608f0dae2bcba6c1cf"}
Feb 19 15:44:56 crc kubenswrapper[4810]: I0219 15:44:56.846712 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs" podStartSLOduration=2.403829057 podStartE2EDuration="2.846687016s" podCreationTimestamp="2026-02-19 15:44:54 +0000 UTC" firstStartedPulling="2026-02-19 15:44:55.893528551 +0000 UTC m=+2125.375558685" lastFinishedPulling="2026-02-19 15:44:56.33638651 +0000 UTC m=+2125.818416644" observedRunningTime="2026-02-19 15:44:56.833037417 +0000 UTC m=+2126.315067541" watchObservedRunningTime="2026-02-19 15:44:56.846687016 +0000 UTC m=+2126.328717150"
Feb 19 15:45:00 crc kubenswrapper[4810]: I0219 15:45:00.134549 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525265-dnbnb"]
Feb 19 15:45:00 crc kubenswrapper[4810]: I0219 15:45:00.137874 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525265-dnbnb"
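A side note on the pod_startup_latency_tracker entry above: podStartSLOduration (2.403829057) is podStartE2EDuration (2.846687016s) minus the image-pull window, lastFinishedPulling - firstStartedPulling, which is consistent with the startup SLI excluding pull time. The m=+... values are the monotonic-clock readings and give the pull window exactly. A minimal check of that arithmetic with the values copied from the entry (illustrative only, not kubelet's own code):

    # Monotonic offsets (m=+...) copied from the log entry above.
    first_started_pulling = 2125.375558685
    last_finished_pulling = 2125.818416644
    pod_start_e2e = 2.846687016            # podStartE2EDuration, seconds

    pull_window = last_finished_pulling - first_started_pulling   # 0.442857959 s
    print(f"{pod_start_e2e - pull_window:.9f}")                   # 2.403829057 == podStartSLOduration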
Feb 19 15:45:00 crc kubenswrapper[4810]: I0219 15:45:00.140971 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 19 15:45:00 crc kubenswrapper[4810]: I0219 15:45:00.188123 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 19 15:45:00 crc kubenswrapper[4810]: I0219 15:45:00.216461 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525265-dnbnb"]
Feb 19 15:45:00 crc kubenswrapper[4810]: I0219 15:45:00.290620 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jqxd\" (UniqueName: \"kubernetes.io/projected/9ebe856b-d546-48e1-862d-d9f039620b73-kube-api-access-5jqxd\") pod \"collect-profiles-29525265-dnbnb\" (UID: \"9ebe856b-d546-48e1-862d-d9f039620b73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525265-dnbnb"
Feb 19 15:45:00 crc kubenswrapper[4810]: I0219 15:45:00.290674 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ebe856b-d546-48e1-862d-d9f039620b73-config-volume\") pod \"collect-profiles-29525265-dnbnb\" (UID: \"9ebe856b-d546-48e1-862d-d9f039620b73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525265-dnbnb"
Feb 19 15:45:00 crc kubenswrapper[4810]: I0219 15:45:00.290698 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ebe856b-d546-48e1-862d-d9f039620b73-secret-volume\") pod \"collect-profiles-29525265-dnbnb\" (UID: \"9ebe856b-d546-48e1-862d-d9f039620b73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525265-dnbnb"
Feb 19 15:45:00 crc kubenswrapper[4810]: I0219 15:45:00.393567 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ebe856b-d546-48e1-862d-d9f039620b73-config-volume\") pod \"collect-profiles-29525265-dnbnb\" (UID: \"9ebe856b-d546-48e1-862d-d9f039620b73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525265-dnbnb"
Feb 19 15:45:00 crc kubenswrapper[4810]: I0219 15:45:00.393635 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ebe856b-d546-48e1-862d-d9f039620b73-secret-volume\") pod \"collect-profiles-29525265-dnbnb\" (UID: \"9ebe856b-d546-48e1-862d-d9f039620b73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525265-dnbnb"
Feb 19 15:45:00 crc kubenswrapper[4810]: I0219 15:45:00.393880 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jqxd\" (UniqueName: \"kubernetes.io/projected/9ebe856b-d546-48e1-862d-d9f039620b73-kube-api-access-5jqxd\") pod \"collect-profiles-29525265-dnbnb\" (UID: \"9ebe856b-d546-48e1-862d-d9f039620b73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525265-dnbnb"
Feb 19 15:45:00 crc kubenswrapper[4810]: I0219 15:45:00.394456 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ebe856b-d546-48e1-862d-d9f039620b73-config-volume\") pod \"collect-profiles-29525265-dnbnb\" (UID: \"9ebe856b-d546-48e1-862d-d9f039620b73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525265-dnbnb"
Feb 19 15:45:00 crc kubenswrapper[4810]: I0219 15:45:00.402009 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ebe856b-d546-48e1-862d-d9f039620b73-secret-volume\") pod \"collect-profiles-29525265-dnbnb\" (UID: \"9ebe856b-d546-48e1-862d-d9f039620b73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525265-dnbnb"
Feb 19 15:45:00 crc kubenswrapper[4810]: I0219 15:45:00.418152 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jqxd\" (UniqueName: \"kubernetes.io/projected/9ebe856b-d546-48e1-862d-d9f039620b73-kube-api-access-5jqxd\") pod \"collect-profiles-29525265-dnbnb\" (UID: \"9ebe856b-d546-48e1-862d-d9f039620b73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525265-dnbnb"
Feb 19 15:45:00 crc kubenswrapper[4810]: I0219 15:45:00.506856 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525265-dnbnb"
Feb 19 15:45:00 crc kubenswrapper[4810]: I0219 15:45:00.995829 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525265-dnbnb"]
Feb 19 15:45:01 crc kubenswrapper[4810]: I0219 15:45:01.862033 4810 generic.go:334] "Generic (PLEG): container finished" podID="9ebe856b-d546-48e1-862d-d9f039620b73" containerID="ea1fa5fe82d5994ff114d9b04616fab9d73e059f9ec50ceb138445dbcf8a33cf" exitCode=0
Feb 19 15:45:01 crc kubenswrapper[4810]: I0219 15:45:01.862074 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525265-dnbnb" event={"ID":"9ebe856b-d546-48e1-862d-d9f039620b73","Type":"ContainerDied","Data":"ea1fa5fe82d5994ff114d9b04616fab9d73e059f9ec50ceb138445dbcf8a33cf"}
Feb 19 15:45:01 crc kubenswrapper[4810]: I0219 15:45:01.862345 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525265-dnbnb" event={"ID":"9ebe856b-d546-48e1-862d-d9f039620b73","Type":"ContainerStarted","Data":"3310a652d205d8ec8d61afc514f564de3aa5d2a0ad8d9c147b7c953675134dd1"}
Feb 19 15:45:03 crc kubenswrapper[4810]: I0219 15:45:03.292603 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525265-dnbnb"
Feb 19 15:45:03 crc kubenswrapper[4810]: I0219 15:45:03.379368 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jqxd\" (UniqueName: \"kubernetes.io/projected/9ebe856b-d546-48e1-862d-d9f039620b73-kube-api-access-5jqxd\") pod \"9ebe856b-d546-48e1-862d-d9f039620b73\" (UID: \"9ebe856b-d546-48e1-862d-d9f039620b73\") "
Feb 19 15:45:03 crc kubenswrapper[4810]: I0219 15:45:03.379548 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ebe856b-d546-48e1-862d-d9f039620b73-config-volume\") pod \"9ebe856b-d546-48e1-862d-d9f039620b73\" (UID: \"9ebe856b-d546-48e1-862d-d9f039620b73\") "
Feb 19 15:45:03 crc kubenswrapper[4810]: I0219 15:45:03.379589 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ebe856b-d546-48e1-862d-d9f039620b73-secret-volume\") pod \"9ebe856b-d546-48e1-862d-d9f039620b73\" (UID: \"9ebe856b-d546-48e1-862d-d9f039620b73\") "
Feb 19 15:45:03 crc kubenswrapper[4810]: I0219 15:45:03.380686 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ebe856b-d546-48e1-862d-d9f039620b73-config-volume" (OuterVolumeSpecName: "config-volume") pod "9ebe856b-d546-48e1-862d-d9f039620b73" (UID: "9ebe856b-d546-48e1-862d-d9f039620b73"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 15:45:03 crc kubenswrapper[4810]: I0219 15:45:03.385107 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ebe856b-d546-48e1-862d-d9f039620b73-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9ebe856b-d546-48e1-862d-d9f039620b73" (UID: "9ebe856b-d546-48e1-862d-d9f039620b73"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:45:03 crc kubenswrapper[4810]: I0219 15:45:03.385652 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ebe856b-d546-48e1-862d-d9f039620b73-kube-api-access-5jqxd" (OuterVolumeSpecName: "kube-api-access-5jqxd") pod "9ebe856b-d546-48e1-862d-d9f039620b73" (UID: "9ebe856b-d546-48e1-862d-d9f039620b73"). InnerVolumeSpecName "kube-api-access-5jqxd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:45:03 crc kubenswrapper[4810]: I0219 15:45:03.481642 4810 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ebe856b-d546-48e1-862d-d9f039620b73-config-volume\") on node \"crc\" DevicePath \"\""
Feb 19 15:45:03 crc kubenswrapper[4810]: I0219 15:45:03.481687 4810 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ebe856b-d546-48e1-862d-d9f039620b73-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 19 15:45:03 crc kubenswrapper[4810]: I0219 15:45:03.481704 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jqxd\" (UniqueName: \"kubernetes.io/projected/9ebe856b-d546-48e1-862d-d9f039620b73-kube-api-access-5jqxd\") on node \"crc\" DevicePath \"\""
Feb 19 15:45:03 crc kubenswrapper[4810]: I0219 15:45:03.891044 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525265-dnbnb" event={"ID":"9ebe856b-d546-48e1-862d-d9f039620b73","Type":"ContainerDied","Data":"3310a652d205d8ec8d61afc514f564de3aa5d2a0ad8d9c147b7c953675134dd1"}
Feb 19 15:45:03 crc kubenswrapper[4810]: I0219 15:45:03.891401 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3310a652d205d8ec8d61afc514f564de3aa5d2a0ad8d9c147b7c953675134dd1"
Feb 19 15:45:03 crc kubenswrapper[4810]: I0219 15:45:03.891479 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525265-dnbnb"
Feb 19 15:45:04 crc kubenswrapper[4810]: I0219 15:45:04.411642 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525220-wqnml"]
Feb 19 15:45:04 crc kubenswrapper[4810]: I0219 15:45:04.425361 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525220-wqnml"]
Feb 19 15:45:05 crc kubenswrapper[4810]: I0219 15:45:05.462289 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8383a9e3-149b-4512-a9fd-12cd0b65e370" path="/var/lib/kubelet/pods/8383a9e3-149b-4512-a9fd-12cd0b65e370/volumes"
Feb 19 15:45:42 crc kubenswrapper[4810]: I0219 15:45:42.583214 4810 scope.go:117] "RemoveContainer" containerID="c19d21352b758655d40944eafe4d1d6cfee80125c13e6f74424f93ccd9aee7cf"
Feb 19 15:45:49 crc kubenswrapper[4810]: I0219 15:45:49.455093 4810 generic.go:334] "Generic (PLEG): container finished" podID="6650a3db-fdc1-4342-b8a8-cb91376e75c5" containerID="0bfb481da1e582a9833bd1505ab13860f9c856ec5cedc6c8ae5ad71bbbe2c772" exitCode=0
Feb 19 15:45:49 crc kubenswrapper[4810]: I0219 15:45:49.455718 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs" event={"ID":"6650a3db-fdc1-4342-b8a8-cb91376e75c5","Type":"ContainerDied","Data":"0bfb481da1e582a9833bd1505ab13860f9c856ec5cedc6c8ae5ad71bbbe2c772"}
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.048417 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.132418 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkgqg\" (UniqueName: \"kubernetes.io/projected/6650a3db-fdc1-4342-b8a8-cb91376e75c5-kube-api-access-zkgqg\") pod \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\" (UID: \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\") "
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.132530 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-neutron-metadata-combined-ca-bundle\") pod \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\" (UID: \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\") "
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.132588 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\" (UID: \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\") "
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.132625 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-nova-metadata-neutron-config-0\") pod \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\" (UID: \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\") "
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.132660 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-inventory\") pod \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\" (UID: \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\") "
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.132817 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-ssh-key-openstack-edpm-ipam\") pod \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\" (UID: \"6650a3db-fdc1-4342-b8a8-cb91376e75c5\") "
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.139119 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "6650a3db-fdc1-4342-b8a8-cb91376e75c5" (UID: "6650a3db-fdc1-4342-b8a8-cb91376e75c5"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.139306 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6650a3db-fdc1-4342-b8a8-cb91376e75c5-kube-api-access-zkgqg" (OuterVolumeSpecName: "kube-api-access-zkgqg") pod "6650a3db-fdc1-4342-b8a8-cb91376e75c5" (UID: "6650a3db-fdc1-4342-b8a8-cb91376e75c5"). InnerVolumeSpecName "kube-api-access-zkgqg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.164689 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-inventory" (OuterVolumeSpecName: "inventory") pod "6650a3db-fdc1-4342-b8a8-cb91376e75c5" (UID: "6650a3db-fdc1-4342-b8a8-cb91376e75c5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.165990 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "6650a3db-fdc1-4342-b8a8-cb91376e75c5" (UID: "6650a3db-fdc1-4342-b8a8-cb91376e75c5"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.182517 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "6650a3db-fdc1-4342-b8a8-cb91376e75c5" (UID: "6650a3db-fdc1-4342-b8a8-cb91376e75c5"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.182938 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6650a3db-fdc1-4342-b8a8-cb91376e75c5" (UID: "6650a3db-fdc1-4342-b8a8-cb91376e75c5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.237871 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkgqg\" (UniqueName: \"kubernetes.io/projected/6650a3db-fdc1-4342-b8a8-cb91376e75c5-kube-api-access-zkgqg\") on node \"crc\" DevicePath \"\""
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.237917 4810 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.237984 4810 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.238007 4810 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.238028 4810 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-inventory\") on node \"crc\" DevicePath \"\""
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.238045 4810 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6650a3db-fdc1-4342-b8a8-cb91376e75c5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.484029 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs" event={"ID":"6650a3db-fdc1-4342-b8a8-cb91376e75c5","Type":"ContainerDied","Data":"8d2f457eb3a2c22efd1c5f686fd81de59f3738fe958c14608f0dae2bcba6c1cf"}
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.484063 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.484071 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d2f457eb3a2c22efd1c5f686fd81de59f3738fe958c14608f0dae2bcba6c1cf"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.658220 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mxd44"]
Feb 19 15:45:51 crc kubenswrapper[4810]: E0219 15:45:51.659307 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6650a3db-fdc1-4342-b8a8-cb91376e75c5" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.659364 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6650a3db-fdc1-4342-b8a8-cb91376e75c5" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Feb 19 15:45:51 crc kubenswrapper[4810]: E0219 15:45:51.659417 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ebe856b-d546-48e1-862d-d9f039620b73" containerName="collect-profiles"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.659431 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ebe856b-d546-48e1-862d-d9f039620b73" containerName="collect-profiles"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.659885 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="6650a3db-fdc1-4342-b8a8-cb91376e75c5" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.659913 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ebe856b-d546-48e1-862d-d9f039620b73" containerName="collect-profiles"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.661237 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mxd44"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.663663 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.664440 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.665440 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.666112 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.674080 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-djc4q"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.712229 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mxd44"]
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.748096 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mxd44\" (UID: \"b0d687e9-21b0-4abe-b7ec-4fb050926f6c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mxd44"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.748203 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mxd44\" (UID: \"b0d687e9-21b0-4abe-b7ec-4fb050926f6c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mxd44"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.748232 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpbdx\" (UniqueName: \"kubernetes.io/projected/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-kube-api-access-vpbdx\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mxd44\" (UID: \"b0d687e9-21b0-4abe-b7ec-4fb050926f6c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mxd44"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.748352 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mxd44\" (UID: \"b0d687e9-21b0-4abe-b7ec-4fb050926f6c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mxd44"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.748691 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mxd44\" (UID: \"b0d687e9-21b0-4abe-b7ec-4fb050926f6c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mxd44"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.852923 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mxd44\" (UID: \"b0d687e9-21b0-4abe-b7ec-4fb050926f6c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mxd44"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.853110 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mxd44\" (UID: \"b0d687e9-21b0-4abe-b7ec-4fb050926f6c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mxd44"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.853256 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mxd44\" (UID: \"b0d687e9-21b0-4abe-b7ec-4fb050926f6c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mxd44"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.853365 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mxd44\" (UID: \"b0d687e9-21b0-4abe-b7ec-4fb050926f6c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mxd44"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.853406 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpbdx\" (UniqueName: \"kubernetes.io/projected/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-kube-api-access-vpbdx\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mxd44\" (UID: \"b0d687e9-21b0-4abe-b7ec-4fb050926f6c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mxd44"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.857350 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mxd44\" (UID: \"b0d687e9-21b0-4abe-b7ec-4fb050926f6c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mxd44"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.857467 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mxd44\" (UID: \"b0d687e9-21b0-4abe-b7ec-4fb050926f6c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mxd44"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.857855 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mxd44\" (UID: \"b0d687e9-21b0-4abe-b7ec-4fb050926f6c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mxd44"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.859867 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mxd44\" (UID: \"b0d687e9-21b0-4abe-b7ec-4fb050926f6c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mxd44"
Feb 19 15:45:51 crc kubenswrapper[4810]: I0219 15:45:51.875269 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpbdx\" (UniqueName: \"kubernetes.io/projected/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-kube-api-access-vpbdx\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mxd44\" (UID: \"b0d687e9-21b0-4abe-b7ec-4fb050926f6c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mxd44"
Feb 19 15:45:52 crc kubenswrapper[4810]: I0219 15:45:52.016527 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mxd44"
Feb 19 15:45:52 crc kubenswrapper[4810]: I0219 15:45:52.643858 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mxd44"]
Feb 19 15:45:53 crc kubenswrapper[4810]: I0219 15:45:53.525995 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mxd44" event={"ID":"b0d687e9-21b0-4abe-b7ec-4fb050926f6c","Type":"ContainerStarted","Data":"0b1bbaed8126375699a9e966b7745665836631416087e63a3422e04b7d8a2fdd"}
Feb 19 15:45:53 crc kubenswrapper[4810]: I0219 15:45:53.526277 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mxd44" event={"ID":"b0d687e9-21b0-4abe-b7ec-4fb050926f6c","Type":"ContainerStarted","Data":"873b652283637e7d5e6deec7064b3a10014ad0fa5b6fb23a1fe74bd39abab9d5"}
Feb 19 15:45:53 crc kubenswrapper[4810]: I0219 15:45:53.568892 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mxd44" podStartSLOduration=2.079083367 podStartE2EDuration="2.568864093s" podCreationTimestamp="2026-02-19 15:45:51 +0000 UTC" firstStartedPulling="2026-02-19 15:45:52.634624437 +0000 UTC m=+2182.116654571" lastFinishedPulling="2026-02-19 15:45:53.124405123 +0000 UTC m=+2182.606435297" observedRunningTime="2026-02-19 15:45:53.561811457 +0000 UTC m=+2183.043841581" watchObservedRunningTime="2026-02-19 15:45:53.568864093 +0000 UTC m=+2183.050894217"
Feb 19 15:46:19 crc kubenswrapper[4810]: I0219 15:46:19.537828 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 15:46:19 crc kubenswrapper[4810]: I0219 15:46:19.538590 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 15:46:49 crc kubenswrapper[4810]: I0219 15:46:49.538295 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 15:46:49 crc kubenswrapper[4810]: I0219 15:46:49.539510 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 15:46:52 crc kubenswrapper[4810]: I0219 15:46:52.150844 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9mfnq"]
Feb 19 15:46:52 crc kubenswrapper[4810]: I0219 15:46:52.156058 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9mfnq"
Feb 19 15:46:52 crc kubenswrapper[4810]: I0219 15:46:52.168040 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9mfnq"]
Feb 19 15:46:52 crc kubenswrapper[4810]: I0219 15:46:52.252686 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2ndr\" (UniqueName: \"kubernetes.io/projected/c0c0cd7c-28e7-4c44-889e-7808fac96bfa-kube-api-access-h2ndr\") pod \"community-operators-9mfnq\" (UID: \"c0c0cd7c-28e7-4c44-889e-7808fac96bfa\") " pod="openshift-marketplace/community-operators-9mfnq"
Feb 19 15:46:52 crc kubenswrapper[4810]: I0219 15:46:52.252786 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0c0cd7c-28e7-4c44-889e-7808fac96bfa-utilities\") pod \"community-operators-9mfnq\" (UID: \"c0c0cd7c-28e7-4c44-889e-7808fac96bfa\") " pod="openshift-marketplace/community-operators-9mfnq"
Feb 19 15:46:52 crc kubenswrapper[4810]: I0219 15:46:52.252829 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0c0cd7c-28e7-4c44-889e-7808fac96bfa-catalog-content\") pod \"community-operators-9mfnq\" (UID: \"c0c0cd7c-28e7-4c44-889e-7808fac96bfa\") " pod="openshift-marketplace/community-operators-9mfnq"
Feb 19 15:46:52 crc kubenswrapper[4810]: I0219 15:46:52.355481 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2ndr\" (UniqueName: \"kubernetes.io/projected/c0c0cd7c-28e7-4c44-889e-7808fac96bfa-kube-api-access-h2ndr\") pod \"community-operators-9mfnq\" (UID: \"c0c0cd7c-28e7-4c44-889e-7808fac96bfa\") " pod="openshift-marketplace/community-operators-9mfnq"
Feb 19 15:46:52 crc kubenswrapper[4810]: I0219 15:46:52.355662 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0c0cd7c-28e7-4c44-889e-7808fac96bfa-utilities\") pod \"community-operators-9mfnq\" (UID: \"c0c0cd7c-28e7-4c44-889e-7808fac96bfa\") " pod="openshift-marketplace/community-operators-9mfnq"
Feb 19 15:46:52 crc kubenswrapper[4810]: I0219 15:46:52.355732 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0c0cd7c-28e7-4c44-889e-7808fac96bfa-catalog-content\") pod \"community-operators-9mfnq\" (UID: \"c0c0cd7c-28e7-4c44-889e-7808fac96bfa\") " pod="openshift-marketplace/community-operators-9mfnq"
Feb 19 15:46:52 crc kubenswrapper[4810]: I0219 15:46:52.356200 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0c0cd7c-28e7-4c44-889e-7808fac96bfa-utilities\") pod \"community-operators-9mfnq\" (UID: \"c0c0cd7c-28e7-4c44-889e-7808fac96bfa\") " pod="openshift-marketplace/community-operators-9mfnq"
Feb 19 15:46:52 crc kubenswrapper[4810]: I0219 15:46:52.356313 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0c0cd7c-28e7-4c44-889e-7808fac96bfa-catalog-content\") pod \"community-operators-9mfnq\" (UID: \"c0c0cd7c-28e7-4c44-889e-7808fac96bfa\") " pod="openshift-marketplace/community-operators-9mfnq"
Feb 19 15:46:52 crc kubenswrapper[4810]: I0219 15:46:52.376228 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2ndr\" (UniqueName: \"kubernetes.io/projected/c0c0cd7c-28e7-4c44-889e-7808fac96bfa-kube-api-access-h2ndr\") pod \"community-operators-9mfnq\" (UID: \"c0c0cd7c-28e7-4c44-889e-7808fac96bfa\") " pod="openshift-marketplace/community-operators-9mfnq"
Feb 19 15:46:52 crc kubenswrapper[4810]: I0219 15:46:52.501784 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9mfnq"
Feb 19 15:46:53 crc kubenswrapper[4810]: I0219 15:46:53.010782 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9mfnq"]
Feb 19 15:46:53 crc kubenswrapper[4810]: I0219 15:46:53.221883 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9mfnq" event={"ID":"c0c0cd7c-28e7-4c44-889e-7808fac96bfa","Type":"ContainerStarted","Data":"11f62df48e5277dbcf164f8317d7b4913a22f5e32ec26e5140bca32e8d987dcf"}
Feb 19 15:46:54 crc kubenswrapper[4810]: I0219 15:46:54.237473 4810 generic.go:334] "Generic (PLEG): container finished" podID="c0c0cd7c-28e7-4c44-889e-7808fac96bfa" containerID="f27bee4fc68ef668f895a22d4bdc17a101cf723a973b64d06bcc45dc50a5efad" exitCode=0
Feb 19 15:46:54 crc kubenswrapper[4810]: I0219 15:46:54.237554 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9mfnq" event={"ID":"c0c0cd7c-28e7-4c44-889e-7808fac96bfa","Type":"ContainerDied","Data":"f27bee4fc68ef668f895a22d4bdc17a101cf723a973b64d06bcc45dc50a5efad"}
Feb 19 15:46:56 crc kubenswrapper[4810]: I0219 15:46:56.271907 4810 generic.go:334] "Generic (PLEG): container finished" podID="c0c0cd7c-28e7-4c44-889e-7808fac96bfa" containerID="d5e557007c609c199cf61b46011dbd18ca7056cb82f034ef80e7bbf27b827e63" exitCode=0
Feb 19 15:46:56 crc kubenswrapper[4810]: I0219 15:46:56.272062 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9mfnq" event={"ID":"c0c0cd7c-28e7-4c44-889e-7808fac96bfa","Type":"ContainerDied","Data":"d5e557007c609c199cf61b46011dbd18ca7056cb82f034ef80e7bbf27b827e63"}
Feb 19 15:46:57 crc kubenswrapper[4810]: I0219 15:46:57.286219 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9mfnq" event={"ID":"c0c0cd7c-28e7-4c44-889e-7808fac96bfa","Type":"ContainerStarted","Data":"718c5c7061ee5cf5c92fa1b9a0b876e92d585a0781ad5002a63205e1dd7b4e15"}
Feb 19 15:46:57 crc kubenswrapper[4810]: I0219 15:46:57.315198 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9mfnq" podStartSLOduration=2.874103124 podStartE2EDuration="5.315177523s" podCreationTimestamp="2026-02-19 15:46:52 +0000 UTC" firstStartedPulling="2026-02-19 15:46:54.240283196 +0000 UTC m=+2243.722313350" lastFinishedPulling="2026-02-19 15:46:56.681357615 +0000 UTC m=+2246.163387749" observedRunningTime="2026-02-19 15:46:57.308166589 +0000 UTC m=+2246.790196713" watchObservedRunningTime="2026-02-19 15:46:57.315177523 +0000 UTC m=+2246.797207647"
Feb 19 15:47:02 crc kubenswrapper[4810]: I0219 15:47:02.501940 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9mfnq"
Feb 19 15:47:02 crc kubenswrapper[4810]: I0219 15:47:02.502834 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9mfnq"
Feb 19 15:47:02 crc kubenswrapper[4810]: I0219 15:47:02.589506 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9mfnq"
Feb 19 15:47:03 crc kubenswrapper[4810]: I0219 15:47:03.429781 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9mfnq"
Feb 19 15:47:03 crc kubenswrapper[4810]: I0219 15:47:03.500126 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9mfnq"]
Feb 19 15:47:05 crc kubenswrapper[4810]: I0219 15:47:05.382900 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9mfnq" podUID="c0c0cd7c-28e7-4c44-889e-7808fac96bfa" containerName="registry-server" containerID="cri-o://718c5c7061ee5cf5c92fa1b9a0b876e92d585a0781ad5002a63205e1dd7b4e15" gracePeriod=2
Feb 19 15:47:05 crc kubenswrapper[4810]: E0219 15:47:05.684534 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0c0cd7c_28e7_4c44_889e_7808fac96bfa.slice/crio-conmon-718c5c7061ee5cf5c92fa1b9a0b876e92d585a0781ad5002a63205e1dd7b4e15.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0c0cd7c_28e7_4c44_889e_7808fac96bfa.slice/crio-718c5c7061ee5cf5c92fa1b9a0b876e92d585a0781ad5002a63205e1dd7b4e15.scope\": RecentStats: unable to find data in memory cache]"
Feb 19 15:47:05 crc kubenswrapper[4810]: I0219 15:47:05.907782 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9mfnq"
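The community-operators-9mfnq events above follow the usual shape of a marketplace catalog pod: two short-lived containers (f27bee4f... and d5e55700..., presumably the catalog-extraction steps, though their names are not in this log) each run to completion with exitCode=0 before the long-running registry-server (718c5c70..., named in the Killing-container entry above) starts and its startup/readiness probes go healthy. Reconstructing that per-pod ordering from the PLEG event lines is mechanical; a sketch (the regex mirrors the kubelet.go:2453 event format above; kubelet.log is an assumed filename):

    import re

    event_re = re.compile(
        r'event={"ID":"([0-9a-f-]+)","Type":"(Container\w+)","Data":"([0-9a-f]+)"}')

    def container_timeline(lines, pod_uid):
        """Yield (event type, short container ID) PLEG events for one pod, in log order."""
        for line in lines:
            m = event_re.search(line)
            if m and m.group(1) == pod_uid:
                yield m.group(2), m.group(3)[:12]

    with open("kubelet.log") as f:
        for typ, cid in container_timeline(f, "c0c0cd7c-28e7-4c44-889e-7808fac96bfa"):
            print(typ, cid)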
Feb 19 15:47:06 crc kubenswrapper[4810]: I0219 15:47:06.076469 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0c0cd7c-28e7-4c44-889e-7808fac96bfa-catalog-content\") pod \"c0c0cd7c-28e7-4c44-889e-7808fac96bfa\" (UID: \"c0c0cd7c-28e7-4c44-889e-7808fac96bfa\") "
Feb 19 15:47:06 crc kubenswrapper[4810]: I0219 15:47:06.076667 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0c0cd7c-28e7-4c44-889e-7808fac96bfa-utilities\") pod \"c0c0cd7c-28e7-4c44-889e-7808fac96bfa\" (UID: \"c0c0cd7c-28e7-4c44-889e-7808fac96bfa\") "
Feb 19 15:47:06 crc kubenswrapper[4810]: I0219 15:47:06.076708 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2ndr\" (UniqueName: \"kubernetes.io/projected/c0c0cd7c-28e7-4c44-889e-7808fac96bfa-kube-api-access-h2ndr\") pod \"c0c0cd7c-28e7-4c44-889e-7808fac96bfa\" (UID: \"c0c0cd7c-28e7-4c44-889e-7808fac96bfa\") "
Feb 19 15:47:06 crc kubenswrapper[4810]: I0219 15:47:06.078243 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0c0cd7c-28e7-4c44-889e-7808fac96bfa-utilities" (OuterVolumeSpecName: "utilities") pod "c0c0cd7c-28e7-4c44-889e-7808fac96bfa" (UID: "c0c0cd7c-28e7-4c44-889e-7808fac96bfa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 15:47:06 crc kubenswrapper[4810]: I0219 15:47:06.087320 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0c0cd7c-28e7-4c44-889e-7808fac96bfa-kube-api-access-h2ndr" (OuterVolumeSpecName: "kube-api-access-h2ndr") pod "c0c0cd7c-28e7-4c44-889e-7808fac96bfa" (UID: "c0c0cd7c-28e7-4c44-889e-7808fac96bfa"). InnerVolumeSpecName "kube-api-access-h2ndr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:47:06 crc kubenswrapper[4810]: I0219 15:47:06.133041 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0c0cd7c-28e7-4c44-889e-7808fac96bfa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0c0cd7c-28e7-4c44-889e-7808fac96bfa" (UID: "c0c0cd7c-28e7-4c44-889e-7808fac96bfa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 15:47:06 crc kubenswrapper[4810]: I0219 15:47:06.179281 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0c0cd7c-28e7-4c44-889e-7808fac96bfa-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 15:47:06 crc kubenswrapper[4810]: I0219 15:47:06.179602 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2ndr\" (UniqueName: \"kubernetes.io/projected/c0c0cd7c-28e7-4c44-889e-7808fac96bfa-kube-api-access-h2ndr\") on node \"crc\" DevicePath \"\""
Feb 19 15:47:06 crc kubenswrapper[4810]: I0219 15:47:06.179708 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0c0cd7c-28e7-4c44-889e-7808fac96bfa-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 15:47:06 crc kubenswrapper[4810]: I0219 15:47:06.398413 4810 generic.go:334] "Generic (PLEG): container finished" podID="c0c0cd7c-28e7-4c44-889e-7808fac96bfa" containerID="718c5c7061ee5cf5c92fa1b9a0b876e92d585a0781ad5002a63205e1dd7b4e15" exitCode=0
Feb 19 15:47:06 crc kubenswrapper[4810]: I0219 15:47:06.398474 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9mfnq" event={"ID":"c0c0cd7c-28e7-4c44-889e-7808fac96bfa","Type":"ContainerDied","Data":"718c5c7061ee5cf5c92fa1b9a0b876e92d585a0781ad5002a63205e1dd7b4e15"}
Feb 19 15:47:06 crc kubenswrapper[4810]: I0219 15:47:06.398525 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9mfnq" event={"ID":"c0c0cd7c-28e7-4c44-889e-7808fac96bfa","Type":"ContainerDied","Data":"11f62df48e5277dbcf164f8317d7b4913a22f5e32ec26e5140bca32e8d987dcf"}
Feb 19 15:47:06 crc kubenswrapper[4810]: I0219 15:47:06.398588 4810 scope.go:117] "RemoveContainer" containerID="718c5c7061ee5cf5c92fa1b9a0b876e92d585a0781ad5002a63205e1dd7b4e15"
Feb 19 15:47:06 crc kubenswrapper[4810]: I0219 15:47:06.400532 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9mfnq"
Feb 19 15:47:06 crc kubenswrapper[4810]: I0219 15:47:06.425899 4810 scope.go:117] "RemoveContainer" containerID="d5e557007c609c199cf61b46011dbd18ca7056cb82f034ef80e7bbf27b827e63"
Feb 19 15:47:06 crc kubenswrapper[4810]: I0219 15:47:06.480909 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9mfnq"]
Feb 19 15:47:06 crc kubenswrapper[4810]: I0219 15:47:06.484970 4810 scope.go:117] "RemoveContainer" containerID="f27bee4fc68ef668f895a22d4bdc17a101cf723a973b64d06bcc45dc50a5efad"
Feb 19 15:47:06 crc kubenswrapper[4810]: I0219 15:47:06.494404 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9mfnq"]
Feb 19 15:47:06 crc kubenswrapper[4810]: I0219 15:47:06.528374 4810 scope.go:117] "RemoveContainer" containerID="718c5c7061ee5cf5c92fa1b9a0b876e92d585a0781ad5002a63205e1dd7b4e15"
Feb 19 15:47:06 crc kubenswrapper[4810]: E0219 15:47:06.529219 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"718c5c7061ee5cf5c92fa1b9a0b876e92d585a0781ad5002a63205e1dd7b4e15\": container with ID starting with 718c5c7061ee5cf5c92fa1b9a0b876e92d585a0781ad5002a63205e1dd7b4e15 not found: ID does not exist" containerID="718c5c7061ee5cf5c92fa1b9a0b876e92d585a0781ad5002a63205e1dd7b4e15"
Feb 19 15:47:06 crc kubenswrapper[4810]: I0219 15:47:06.529267 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"718c5c7061ee5cf5c92fa1b9a0b876e92d585a0781ad5002a63205e1dd7b4e15"} err="failed to get container status \"718c5c7061ee5cf5c92fa1b9a0b876e92d585a0781ad5002a63205e1dd7b4e15\": rpc error: code = NotFound desc = could not find container \"718c5c7061ee5cf5c92fa1b9a0b876e92d585a0781ad5002a63205e1dd7b4e15\": container with ID starting with 718c5c7061ee5cf5c92fa1b9a0b876e92d585a0781ad5002a63205e1dd7b4e15 not found: ID does not exist"
Feb 19 15:47:06 crc kubenswrapper[4810]: I0219 15:47:06.529299 4810 scope.go:117] "RemoveContainer" containerID="d5e557007c609c199cf61b46011dbd18ca7056cb82f034ef80e7bbf27b827e63"
Feb 19 15:47:06 crc kubenswrapper[4810]: E0219 15:47:06.530109 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5e557007c609c199cf61b46011dbd18ca7056cb82f034ef80e7bbf27b827e63\": container with ID starting with d5e557007c609c199cf61b46011dbd18ca7056cb82f034ef80e7bbf27b827e63 not found: ID does not exist" containerID="d5e557007c609c199cf61b46011dbd18ca7056cb82f034ef80e7bbf27b827e63"
Feb 19 15:47:06 crc kubenswrapper[4810]: I0219 15:47:06.530177 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5e557007c609c199cf61b46011dbd18ca7056cb82f034ef80e7bbf27b827e63"} err="failed to get container status \"d5e557007c609c199cf61b46011dbd18ca7056cb82f034ef80e7bbf27b827e63\": rpc error: code = NotFound desc = could not find container \"d5e557007c609c199cf61b46011dbd18ca7056cb82f034ef80e7bbf27b827e63\": container with ID starting with d5e557007c609c199cf61b46011dbd18ca7056cb82f034ef80e7bbf27b827e63 not found: ID does not exist"
Feb 19 15:47:06 crc kubenswrapper[4810]: I0219 15:47:06.530220 4810 scope.go:117] "RemoveContainer" containerID="f27bee4fc68ef668f895a22d4bdc17a101cf723a973b64d06bcc45dc50a5efad"
Feb 19 15:47:06 crc kubenswrapper[4810]: E0219 15:47:06.530627 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f27bee4fc68ef668f895a22d4bdc17a101cf723a973b64d06bcc45dc50a5efad\": container with ID starting with f27bee4fc68ef668f895a22d4bdc17a101cf723a973b64d06bcc45dc50a5efad not found: ID does not exist" containerID="f27bee4fc68ef668f895a22d4bdc17a101cf723a973b64d06bcc45dc50a5efad"
Feb 19 15:47:06 crc kubenswrapper[4810]: I0219 15:47:06.530667 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f27bee4fc68ef668f895a22d4bdc17a101cf723a973b64d06bcc45dc50a5efad"} err="failed to get container status \"f27bee4fc68ef668f895a22d4bdc17a101cf723a973b64d06bcc45dc50a5efad\": rpc error: code = NotFound desc = could not find container \"f27bee4fc68ef668f895a22d4bdc17a101cf723a973b64d06bcc45dc50a5efad\": container with ID starting with f27bee4fc68ef668f895a22d4bdc17a101cf723a973b64d06bcc45dc50a5efad not found: ID does not exist"
Feb 19 15:47:07 crc kubenswrapper[4810]: I0219 15:47:07.453874 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0c0cd7c-28e7-4c44-889e-7808fac96bfa" path="/var/lib/kubelet/pods/c0c0cd7c-28e7-4c44-889e-7808fac96bfa/volumes"
Feb 19 15:47:19 crc kubenswrapper[4810]: I0219 15:47:19.537297 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 15:47:19 crc kubenswrapper[4810]: I0219 15:47:19.538682 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 15:47:19 crc kubenswrapper[4810]: I0219 15:47:19.538782 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t499d"
Feb 19 15:47:19 crc kubenswrapper[4810]: I0219 15:47:19.539585 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f"} pod="openshift-machine-config-operator/machine-config-daemon-t499d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 15:47:19 crc kubenswrapper[4810]: I0219 15:47:19.539721 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" containerID="cri-o://75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f" gracePeriod=600
Feb 19 15:47:19 crc kubenswrapper[4810]: E0219 15:47:19.662580 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
Feb 19 15:47:20 crc kubenswrapper[4810]: I0219 15:47:20.551758 4810 generic.go:334] "Generic (PLEG): container finished" podID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerID="75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f" exitCode=0
"Generic (PLEG): container finished" podID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerID="75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f" exitCode=0 Feb 19 15:47:20 crc kubenswrapper[4810]: I0219 15:47:20.551864 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerDied","Data":"75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f"} Feb 19 15:47:20 crc kubenswrapper[4810]: I0219 15:47:20.552197 4810 scope.go:117] "RemoveContainer" containerID="4e3ef6a26491a979d9b15a4e163fa3567692b3b0eef18273908461c8a7758364" Feb 19 15:47:20 crc kubenswrapper[4810]: I0219 15:47:20.553071 4810 scope.go:117] "RemoveContainer" containerID="75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f" Feb 19 15:47:20 crc kubenswrapper[4810]: E0219 15:47:20.553903 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:47:35 crc kubenswrapper[4810]: I0219 15:47:35.439431 4810 scope.go:117] "RemoveContainer" containerID="75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f" Feb 19 15:47:35 crc kubenswrapper[4810]: E0219 15:47:35.440226 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:47:49 crc kubenswrapper[4810]: I0219 15:47:49.442440 4810 scope.go:117] "RemoveContainer" containerID="75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f" Feb 19 15:47:49 crc kubenswrapper[4810]: E0219 15:47:49.444217 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:48:02 crc kubenswrapper[4810]: I0219 15:48:02.441155 4810 scope.go:117] "RemoveContainer" containerID="75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f" Feb 19 15:48:02 crc kubenswrapper[4810]: E0219 15:48:02.442086 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:48:16 crc kubenswrapper[4810]: I0219 15:48:16.439561 4810 scope.go:117] "RemoveContainer" containerID="75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f" 
Feb 19 15:48:16 crc kubenswrapper[4810]: E0219 15:48:16.440804 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:48:31 crc kubenswrapper[4810]: I0219 15:48:31.464530 4810 scope.go:117] "RemoveContainer" containerID="75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f" Feb 19 15:48:31 crc kubenswrapper[4810]: E0219 15:48:31.465293 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:48:45 crc kubenswrapper[4810]: I0219 15:48:45.440749 4810 scope.go:117] "RemoveContainer" containerID="75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f" Feb 19 15:48:45 crc kubenswrapper[4810]: E0219 15:48:45.441580 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:49:00 crc kubenswrapper[4810]: I0219 15:49:00.440609 4810 scope.go:117] "RemoveContainer" containerID="75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f" Feb 19 15:49:00 crc kubenswrapper[4810]: E0219 15:49:00.441604 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:49:12 crc kubenswrapper[4810]: I0219 15:49:12.441466 4810 scope.go:117] "RemoveContainer" containerID="75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f" Feb 19 15:49:12 crc kubenswrapper[4810]: E0219 15:49:12.442452 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:49:23 crc kubenswrapper[4810]: I0219 15:49:23.440528 4810 scope.go:117] "RemoveContainer" containerID="75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f" Feb 19 15:49:23 crc kubenswrapper[4810]: E0219 15:49:23.441720 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:49:34 crc kubenswrapper[4810]: I0219 15:49:34.440282 4810 scope.go:117] "RemoveContainer" containerID="75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f" Feb 19 15:49:34 crc kubenswrapper[4810]: E0219 15:49:34.442526 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:49:49 crc kubenswrapper[4810]: I0219 15:49:49.440246 4810 scope.go:117] "RemoveContainer" containerID="75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f" Feb 19 15:49:49 crc kubenswrapper[4810]: E0219 15:49:49.441432 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:49:52 crc kubenswrapper[4810]: I0219 15:49:52.348311 4810 generic.go:334] "Generic (PLEG): container finished" podID="b0d687e9-21b0-4abe-b7ec-4fb050926f6c" containerID="0b1bbaed8126375699a9e966b7745665836631416087e63a3422e04b7d8a2fdd" exitCode=0 Feb 19 15:49:52 crc kubenswrapper[4810]: I0219 15:49:52.348379 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mxd44" event={"ID":"b0d687e9-21b0-4abe-b7ec-4fb050926f6c","Type":"ContainerDied","Data":"0b1bbaed8126375699a9e966b7745665836631416087e63a3422e04b7d8a2fdd"} Feb 19 15:49:53 crc kubenswrapper[4810]: I0219 15:49:53.761321 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mxd44" Feb 19 15:49:53 crc kubenswrapper[4810]: I0219 15:49:53.861263 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpbdx\" (UniqueName: \"kubernetes.io/projected/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-kube-api-access-vpbdx\") pod \"b0d687e9-21b0-4abe-b7ec-4fb050926f6c\" (UID: \"b0d687e9-21b0-4abe-b7ec-4fb050926f6c\") " Feb 19 15:49:53 crc kubenswrapper[4810]: I0219 15:49:53.861512 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-ssh-key-openstack-edpm-ipam\") pod \"b0d687e9-21b0-4abe-b7ec-4fb050926f6c\" (UID: \"b0d687e9-21b0-4abe-b7ec-4fb050926f6c\") " Feb 19 15:49:53 crc kubenswrapper[4810]: I0219 15:49:53.861539 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-libvirt-combined-ca-bundle\") pod \"b0d687e9-21b0-4abe-b7ec-4fb050926f6c\" (UID: \"b0d687e9-21b0-4abe-b7ec-4fb050926f6c\") " Feb 19 15:49:53 crc kubenswrapper[4810]: I0219 15:49:53.861564 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-libvirt-secret-0\") pod \"b0d687e9-21b0-4abe-b7ec-4fb050926f6c\" (UID: \"b0d687e9-21b0-4abe-b7ec-4fb050926f6c\") " Feb 19 15:49:53 crc kubenswrapper[4810]: I0219 15:49:53.861656 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-inventory\") pod \"b0d687e9-21b0-4abe-b7ec-4fb050926f6c\" (UID: \"b0d687e9-21b0-4abe-b7ec-4fb050926f6c\") " Feb 19 15:49:53 crc kubenswrapper[4810]: I0219 15:49:53.866803 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "b0d687e9-21b0-4abe-b7ec-4fb050926f6c" (UID: "b0d687e9-21b0-4abe-b7ec-4fb050926f6c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:49:53 crc kubenswrapper[4810]: I0219 15:49:53.868251 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-kube-api-access-vpbdx" (OuterVolumeSpecName: "kube-api-access-vpbdx") pod "b0d687e9-21b0-4abe-b7ec-4fb050926f6c" (UID: "b0d687e9-21b0-4abe-b7ec-4fb050926f6c"). InnerVolumeSpecName "kube-api-access-vpbdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:49:53 crc kubenswrapper[4810]: I0219 15:49:53.891600 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "b0d687e9-21b0-4abe-b7ec-4fb050926f6c" (UID: "b0d687e9-21b0-4abe-b7ec-4fb050926f6c"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:49:53 crc kubenswrapper[4810]: I0219 15:49:53.895273 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-inventory" (OuterVolumeSpecName: "inventory") pod "b0d687e9-21b0-4abe-b7ec-4fb050926f6c" (UID: "b0d687e9-21b0-4abe-b7ec-4fb050926f6c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:49:53 crc kubenswrapper[4810]: I0219 15:49:53.895507 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b0d687e9-21b0-4abe-b7ec-4fb050926f6c" (UID: "b0d687e9-21b0-4abe-b7ec-4fb050926f6c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:49:53 crc kubenswrapper[4810]: I0219 15:49:53.964859 4810 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 15:49:53 crc kubenswrapper[4810]: I0219 15:49:53.964902 4810 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:49:53 crc kubenswrapper[4810]: I0219 15:49:53.964915 4810 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 19 15:49:53 crc kubenswrapper[4810]: I0219 15:49:53.964928 4810 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 15:49:53 crc kubenswrapper[4810]: I0219 15:49:53.964939 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpbdx\" (UniqueName: \"kubernetes.io/projected/b0d687e9-21b0-4abe-b7ec-4fb050926f6c-kube-api-access-vpbdx\") on node \"crc\" DevicePath \"\"" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.373583 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mxd44" event={"ID":"b0d687e9-21b0-4abe-b7ec-4fb050926f6c","Type":"ContainerDied","Data":"873b652283637e7d5e6deec7064b3a10014ad0fa5b6fb23a1fe74bd39abab9d5"} Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.374145 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="873b652283637e7d5e6deec7064b3a10014ad0fa5b6fb23a1fe74bd39abab9d5" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.373668 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mxd44" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.562207 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh"] Feb 19 15:49:54 crc kubenswrapper[4810]: E0219 15:49:54.562907 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0d687e9-21b0-4abe-b7ec-4fb050926f6c" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.562965 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d687e9-21b0-4abe-b7ec-4fb050926f6c" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 19 15:49:54 crc kubenswrapper[4810]: E0219 15:49:54.562995 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0c0cd7c-28e7-4c44-889e-7808fac96bfa" containerName="registry-server" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.563009 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0c0cd7c-28e7-4c44-889e-7808fac96bfa" containerName="registry-server" Feb 19 15:49:54 crc kubenswrapper[4810]: E0219 15:49:54.563069 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0c0cd7c-28e7-4c44-889e-7808fac96bfa" containerName="extract-utilities" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.563084 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0c0cd7c-28e7-4c44-889e-7808fac96bfa" containerName="extract-utilities" Feb 19 15:49:54 crc kubenswrapper[4810]: E0219 15:49:54.563108 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0c0cd7c-28e7-4c44-889e-7808fac96bfa" containerName="extract-content" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.563120 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0c0cd7c-28e7-4c44-889e-7808fac96bfa" containerName="extract-content" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.563501 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0c0cd7c-28e7-4c44-889e-7808fac96bfa" containerName="registry-server" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.563534 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0d687e9-21b0-4abe-b7ec-4fb050926f6c" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.564753 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.568943 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.569096 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.570026 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.571716 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.572070 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.573574 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh"] Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.574879 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.575978 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-djc4q" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.677757 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.677844 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.677875 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.677911 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.677959 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.678068 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x224k\" (UniqueName: \"kubernetes.io/projected/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-kube-api-access-x224k\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.678100 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.678125 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.678186 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.678264 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.678306 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.780096 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.780154 4810 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.780187 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.780238 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.780293 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x224k\" (UniqueName: \"kubernetes.io/projected/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-kube-api-access-x224k\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.780553 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.780807 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.780882 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.780972 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.781016 4810 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.781057 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.783399 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.785581 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.785582 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.785611 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.786040 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.786461 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.787576 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.792944 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.793307 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.793592 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.802175 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x224k\" (UniqueName: \"kubernetes.io/projected/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-kube-api-access-x224k\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nv8wh\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:54 crc kubenswrapper[4810]: I0219 15:49:54.884625 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" Feb 19 15:49:55 crc kubenswrapper[4810]: I0219 15:49:55.422389 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh"] Feb 19 15:49:55 crc kubenswrapper[4810]: W0219 15:49:55.429012 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc5014f8_e5aa_47ad_8787_c187b0f7f0e1.slice/crio-ba6d44e109e65ba6e824f2d3f6c56a0eb16fcbd5ac7ce5cace72d39335df678a WatchSource:0}: Error finding container ba6d44e109e65ba6e824f2d3f6c56a0eb16fcbd5ac7ce5cace72d39335df678a: Status 404 returned error can't find the container with id ba6d44e109e65ba6e824f2d3f6c56a0eb16fcbd5ac7ce5cace72d39335df678a Feb 19 15:49:55 crc kubenswrapper[4810]: I0219 15:49:55.432284 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 15:49:56 crc kubenswrapper[4810]: I0219 15:49:56.399310 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" event={"ID":"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1","Type":"ContainerStarted","Data":"2e87240d488a9e3fc53ca295a40f539e65e717cb89ae7e5a0bdf479e1de61898"} Feb 19 15:49:56 crc kubenswrapper[4810]: I0219 15:49:56.399674 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" event={"ID":"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1","Type":"ContainerStarted","Data":"ba6d44e109e65ba6e824f2d3f6c56a0eb16fcbd5ac7ce5cace72d39335df678a"} Feb 19 15:49:56 crc kubenswrapper[4810]: I0219 15:49:56.446673 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" podStartSLOduration=1.976681879 podStartE2EDuration="2.446640242s" podCreationTimestamp="2026-02-19 15:49:54 +0000 UTC" firstStartedPulling="2026-02-19 15:49:55.43203918 +0000 UTC m=+2424.914069294" lastFinishedPulling="2026-02-19 15:49:55.901997523 +0000 UTC m=+2425.384027657" observedRunningTime="2026-02-19 15:49:56.428610404 +0000 UTC m=+2425.910640528" watchObservedRunningTime="2026-02-19 15:49:56.446640242 +0000 UTC m=+2425.928670416" Feb 19 15:50:03 crc kubenswrapper[4810]: I0219 15:50:03.439976 4810 scope.go:117] "RemoveContainer" containerID="75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f" Feb 19 15:50:03 crc kubenswrapper[4810]: E0219 15:50:03.441698 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:50:10 crc kubenswrapper[4810]: I0219 15:50:10.123782 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fc5r4"] Feb 19 15:50:10 crc kubenswrapper[4810]: I0219 15:50:10.128584 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fc5r4" Feb 19 15:50:10 crc kubenswrapper[4810]: I0219 15:50:10.138149 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fc5r4"] Feb 19 15:50:10 crc kubenswrapper[4810]: I0219 15:50:10.288344 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4b7ad7c-cc02-4a75-8ac7-47efd18067b9-catalog-content\") pod \"certified-operators-fc5r4\" (UID: \"f4b7ad7c-cc02-4a75-8ac7-47efd18067b9\") " pod="openshift-marketplace/certified-operators-fc5r4" Feb 19 15:50:10 crc kubenswrapper[4810]: I0219 15:50:10.288689 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sm7d\" (UniqueName: \"kubernetes.io/projected/f4b7ad7c-cc02-4a75-8ac7-47efd18067b9-kube-api-access-8sm7d\") pod \"certified-operators-fc5r4\" (UID: \"f4b7ad7c-cc02-4a75-8ac7-47efd18067b9\") " pod="openshift-marketplace/certified-operators-fc5r4" Feb 19 15:50:10 crc kubenswrapper[4810]: I0219 15:50:10.288962 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4b7ad7c-cc02-4a75-8ac7-47efd18067b9-utilities\") pod \"certified-operators-fc5r4\" (UID: \"f4b7ad7c-cc02-4a75-8ac7-47efd18067b9\") " pod="openshift-marketplace/certified-operators-fc5r4" Feb 19 15:50:10 crc kubenswrapper[4810]: I0219 15:50:10.390708 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4b7ad7c-cc02-4a75-8ac7-47efd18067b9-utilities\") pod \"certified-operators-fc5r4\" (UID: \"f4b7ad7c-cc02-4a75-8ac7-47efd18067b9\") " pod="openshift-marketplace/certified-operators-fc5r4" Feb 19 15:50:10 crc kubenswrapper[4810]: I0219 15:50:10.390800 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4b7ad7c-cc02-4a75-8ac7-47efd18067b9-catalog-content\") pod \"certified-operators-fc5r4\" (UID: \"f4b7ad7c-cc02-4a75-8ac7-47efd18067b9\") " pod="openshift-marketplace/certified-operators-fc5r4" Feb 19 15:50:10 crc kubenswrapper[4810]: I0219 15:50:10.390886 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sm7d\" (UniqueName: \"kubernetes.io/projected/f4b7ad7c-cc02-4a75-8ac7-47efd18067b9-kube-api-access-8sm7d\") pod \"certified-operators-fc5r4\" (UID: \"f4b7ad7c-cc02-4a75-8ac7-47efd18067b9\") " pod="openshift-marketplace/certified-operators-fc5r4" Feb 19 15:50:10 crc kubenswrapper[4810]: I0219 15:50:10.391360 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4b7ad7c-cc02-4a75-8ac7-47efd18067b9-utilities\") pod \"certified-operators-fc5r4\" (UID: \"f4b7ad7c-cc02-4a75-8ac7-47efd18067b9\") " pod="openshift-marketplace/certified-operators-fc5r4" Feb 19 15:50:10 crc kubenswrapper[4810]: I0219 15:50:10.391369 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4b7ad7c-cc02-4a75-8ac7-47efd18067b9-catalog-content\") pod \"certified-operators-fc5r4\" (UID: \"f4b7ad7c-cc02-4a75-8ac7-47efd18067b9\") " pod="openshift-marketplace/certified-operators-fc5r4" Feb 19 15:50:10 crc kubenswrapper[4810]: I0219 15:50:10.413651 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8sm7d\" (UniqueName: \"kubernetes.io/projected/f4b7ad7c-cc02-4a75-8ac7-47efd18067b9-kube-api-access-8sm7d\") pod \"certified-operators-fc5r4\" (UID: \"f4b7ad7c-cc02-4a75-8ac7-47efd18067b9\") " pod="openshift-marketplace/certified-operators-fc5r4" Feb 19 15:50:10 crc kubenswrapper[4810]: I0219 15:50:10.462423 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fc5r4" Feb 19 15:50:11 crc kubenswrapper[4810]: W0219 15:50:11.003031 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b7ad7c_cc02_4a75_8ac7_47efd18067b9.slice/crio-c43fde9c5e8b2fa870f56fd8d2482a1979719f8bc8a8500731f79f842e1747c0 WatchSource:0}: Error finding container c43fde9c5e8b2fa870f56fd8d2482a1979719f8bc8a8500731f79f842e1747c0: Status 404 returned error can't find the container with id c43fde9c5e8b2fa870f56fd8d2482a1979719f8bc8a8500731f79f842e1747c0 Feb 19 15:50:11 crc kubenswrapper[4810]: I0219 15:50:11.004685 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fc5r4"] Feb 19 15:50:11 crc kubenswrapper[4810]: I0219 15:50:11.579583 4810 generic.go:334] "Generic (PLEG): container finished" podID="f4b7ad7c-cc02-4a75-8ac7-47efd18067b9" containerID="4aca170e13e53204b5023bdecba116a7203aa5391a1b34fd52cfc2d7f859375e" exitCode=0 Feb 19 15:50:11 crc kubenswrapper[4810]: I0219 15:50:11.579744 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fc5r4" event={"ID":"f4b7ad7c-cc02-4a75-8ac7-47efd18067b9","Type":"ContainerDied","Data":"4aca170e13e53204b5023bdecba116a7203aa5391a1b34fd52cfc2d7f859375e"} Feb 19 15:50:11 crc kubenswrapper[4810]: I0219 15:50:11.579952 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fc5r4" event={"ID":"f4b7ad7c-cc02-4a75-8ac7-47efd18067b9","Type":"ContainerStarted","Data":"c43fde9c5e8b2fa870f56fd8d2482a1979719f8bc8a8500731f79f842e1747c0"} Feb 19 15:50:11 crc kubenswrapper[4810]: I0219 15:50:11.925385 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4gb7b"] Feb 19 15:50:11 crc kubenswrapper[4810]: I0219 15:50:11.928792 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4gb7b" Feb 19 15:50:11 crc kubenswrapper[4810]: I0219 15:50:11.935150 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4gb7b"] Feb 19 15:50:12 crc kubenswrapper[4810]: I0219 15:50:12.019349 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/505f9989-9548-4391-b758-33ef9484f145-utilities\") pod \"redhat-operators-4gb7b\" (UID: \"505f9989-9548-4391-b758-33ef9484f145\") " pod="openshift-marketplace/redhat-operators-4gb7b" Feb 19 15:50:12 crc kubenswrapper[4810]: I0219 15:50:12.019890 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/505f9989-9548-4391-b758-33ef9484f145-catalog-content\") pod \"redhat-operators-4gb7b\" (UID: \"505f9989-9548-4391-b758-33ef9484f145\") " pod="openshift-marketplace/redhat-operators-4gb7b" Feb 19 15:50:12 crc kubenswrapper[4810]: I0219 15:50:12.020659 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwzp4\" (UniqueName: \"kubernetes.io/projected/505f9989-9548-4391-b758-33ef9484f145-kube-api-access-pwzp4\") pod \"redhat-operators-4gb7b\" (UID: \"505f9989-9548-4391-b758-33ef9484f145\") " pod="openshift-marketplace/redhat-operators-4gb7b" Feb 19 15:50:12 crc kubenswrapper[4810]: I0219 15:50:12.122811 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwzp4\" (UniqueName: \"kubernetes.io/projected/505f9989-9548-4391-b758-33ef9484f145-kube-api-access-pwzp4\") pod \"redhat-operators-4gb7b\" (UID: \"505f9989-9548-4391-b758-33ef9484f145\") " pod="openshift-marketplace/redhat-operators-4gb7b" Feb 19 15:50:12 crc kubenswrapper[4810]: I0219 15:50:12.122883 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/505f9989-9548-4391-b758-33ef9484f145-utilities\") pod \"redhat-operators-4gb7b\" (UID: \"505f9989-9548-4391-b758-33ef9484f145\") " pod="openshift-marketplace/redhat-operators-4gb7b" Feb 19 15:50:12 crc kubenswrapper[4810]: I0219 15:50:12.122990 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/505f9989-9548-4391-b758-33ef9484f145-catalog-content\") pod \"redhat-operators-4gb7b\" (UID: \"505f9989-9548-4391-b758-33ef9484f145\") " pod="openshift-marketplace/redhat-operators-4gb7b" Feb 19 15:50:12 crc kubenswrapper[4810]: I0219 15:50:12.123585 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/505f9989-9548-4391-b758-33ef9484f145-catalog-content\") pod \"redhat-operators-4gb7b\" (UID: \"505f9989-9548-4391-b758-33ef9484f145\") " pod="openshift-marketplace/redhat-operators-4gb7b" Feb 19 15:50:12 crc kubenswrapper[4810]: I0219 15:50:12.124182 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/505f9989-9548-4391-b758-33ef9484f145-utilities\") pod \"redhat-operators-4gb7b\" (UID: \"505f9989-9548-4391-b758-33ef9484f145\") " pod="openshift-marketplace/redhat-operators-4gb7b" Feb 19 15:50:12 crc kubenswrapper[4810]: I0219 15:50:12.142377 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-pwzp4\" (UniqueName: \"kubernetes.io/projected/505f9989-9548-4391-b758-33ef9484f145-kube-api-access-pwzp4\") pod \"redhat-operators-4gb7b\" (UID: \"505f9989-9548-4391-b758-33ef9484f145\") " pod="openshift-marketplace/redhat-operators-4gb7b" Feb 19 15:50:12 crc kubenswrapper[4810]: I0219 15:50:12.276541 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4gb7b" Feb 19 15:50:12 crc kubenswrapper[4810]: I0219 15:50:12.592876 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fc5r4" event={"ID":"f4b7ad7c-cc02-4a75-8ac7-47efd18067b9","Type":"ContainerStarted","Data":"8510db2e629e8bb213e4a8889c2e19ef5716ff9167ee722bb255cd38156b4002"} Feb 19 15:50:12 crc kubenswrapper[4810]: I0219 15:50:12.764134 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4gb7b"] Feb 19 15:50:13 crc kubenswrapper[4810]: I0219 15:50:13.603384 4810 generic.go:334] "Generic (PLEG): container finished" podID="505f9989-9548-4391-b758-33ef9484f145" containerID="98c0f22403cbf6d1afd69ca2613abb589784f884be7c977062718a384a79f1d6" exitCode=0 Feb 19 15:50:13 crc kubenswrapper[4810]: I0219 15:50:13.603457 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gb7b" event={"ID":"505f9989-9548-4391-b758-33ef9484f145","Type":"ContainerDied","Data":"98c0f22403cbf6d1afd69ca2613abb589784f884be7c977062718a384a79f1d6"} Feb 19 15:50:13 crc kubenswrapper[4810]: I0219 15:50:13.603727 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gb7b" event={"ID":"505f9989-9548-4391-b758-33ef9484f145","Type":"ContainerStarted","Data":"8abdba986b1625129435e6790b7e3757e1e286235597f9e2b12f7877f6574c61"} Feb 19 15:50:13 crc kubenswrapper[4810]: I0219 15:50:13.605584 4810 generic.go:334] "Generic (PLEG): container finished" podID="f4b7ad7c-cc02-4a75-8ac7-47efd18067b9" containerID="8510db2e629e8bb213e4a8889c2e19ef5716ff9167ee722bb255cd38156b4002" exitCode=0 Feb 19 15:50:13 crc kubenswrapper[4810]: I0219 15:50:13.605610 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fc5r4" event={"ID":"f4b7ad7c-cc02-4a75-8ac7-47efd18067b9","Type":"ContainerDied","Data":"8510db2e629e8bb213e4a8889c2e19ef5716ff9167ee722bb255cd38156b4002"} Feb 19 15:50:14 crc kubenswrapper[4810]: I0219 15:50:14.618087 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fc5r4" event={"ID":"f4b7ad7c-cc02-4a75-8ac7-47efd18067b9","Type":"ContainerStarted","Data":"7100a0cce48fd72efe5710c85a2b6e7a60fd1424a4a404164ae5118372e83f43"} Feb 19 15:50:14 crc kubenswrapper[4810]: I0219 15:50:14.651824 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fc5r4" podStartSLOduration=2.231688478 podStartE2EDuration="4.651787389s" podCreationTimestamp="2026-02-19 15:50:10 +0000 UTC" firstStartedPulling="2026-02-19 15:50:11.584861898 +0000 UTC m=+2441.066892062" lastFinishedPulling="2026-02-19 15:50:14.004960839 +0000 UTC m=+2443.486990973" observedRunningTime="2026-02-19 15:50:14.643189665 +0000 UTC m=+2444.125219859" watchObservedRunningTime="2026-02-19 15:50:14.651787389 +0000 UTC m=+2444.133817553" Feb 19 15:50:15 crc kubenswrapper[4810]: I0219 15:50:15.630980 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gb7b" 
event={"ID":"505f9989-9548-4391-b758-33ef9484f145","Type":"ContainerStarted","Data":"9156315524b62b8e2dc59d942645a9530f1aefb6429d80d2e3bb2c9ad987cb72"} Feb 19 15:50:16 crc kubenswrapper[4810]: I0219 15:50:16.440182 4810 scope.go:117] "RemoveContainer" containerID="75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f" Feb 19 15:50:16 crc kubenswrapper[4810]: E0219 15:50:16.441016 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:50:17 crc kubenswrapper[4810]: I0219 15:50:17.655527 4810 generic.go:334] "Generic (PLEG): container finished" podID="505f9989-9548-4391-b758-33ef9484f145" containerID="9156315524b62b8e2dc59d942645a9530f1aefb6429d80d2e3bb2c9ad987cb72" exitCode=0 Feb 19 15:50:17 crc kubenswrapper[4810]: I0219 15:50:17.655595 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gb7b" event={"ID":"505f9989-9548-4391-b758-33ef9484f145","Type":"ContainerDied","Data":"9156315524b62b8e2dc59d942645a9530f1aefb6429d80d2e3bb2c9ad987cb72"} Feb 19 15:50:19 crc kubenswrapper[4810]: I0219 15:50:19.687043 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gb7b" event={"ID":"505f9989-9548-4391-b758-33ef9484f145","Type":"ContainerStarted","Data":"d13f04dd8f83d8f3b49c862c6aa6e98dfeeac1f7ae0d8ec94ede6a74195918b3"} Feb 19 15:50:19 crc kubenswrapper[4810]: I0219 15:50:19.727529 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4gb7b" podStartSLOduration=3.917432384 podStartE2EDuration="8.727501448s" podCreationTimestamp="2026-02-19 15:50:11 +0000 UTC" firstStartedPulling="2026-02-19 15:50:13.604950266 +0000 UTC m=+2443.086980400" lastFinishedPulling="2026-02-19 15:50:18.41501932 +0000 UTC m=+2447.897049464" observedRunningTime="2026-02-19 15:50:19.715706374 +0000 UTC m=+2449.197736508" watchObservedRunningTime="2026-02-19 15:50:19.727501448 +0000 UTC m=+2449.209531612" Feb 19 15:50:20 crc kubenswrapper[4810]: I0219 15:50:20.463203 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fc5r4" Feb 19 15:50:20 crc kubenswrapper[4810]: I0219 15:50:20.463806 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fc5r4" Feb 19 15:50:20 crc kubenswrapper[4810]: I0219 15:50:20.524090 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fc5r4" Feb 19 15:50:20 crc kubenswrapper[4810]: I0219 15:50:20.758043 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fc5r4" Feb 19 15:50:21 crc kubenswrapper[4810]: I0219 15:50:21.906305 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fc5r4"] Feb 19 15:50:22 crc kubenswrapper[4810]: I0219 15:50:22.278234 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4gb7b" Feb 19 15:50:22 crc kubenswrapper[4810]: I0219 15:50:22.278623 4810 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4gb7b" Feb 19 15:50:23 crc kubenswrapper[4810]: I0219 15:50:23.348029 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4gb7b" podUID="505f9989-9548-4391-b758-33ef9484f145" containerName="registry-server" probeResult="failure" output=< Feb 19 15:50:23 crc kubenswrapper[4810]: timeout: failed to connect service ":50051" within 1s Feb 19 15:50:23 crc kubenswrapper[4810]: > Feb 19 15:50:23 crc kubenswrapper[4810]: I0219 15:50:23.741812 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fc5r4" podUID="f4b7ad7c-cc02-4a75-8ac7-47efd18067b9" containerName="registry-server" containerID="cri-o://7100a0cce48fd72efe5710c85a2b6e7a60fd1424a4a404164ae5118372e83f43" gracePeriod=2 Feb 19 15:50:24 crc kubenswrapper[4810]: I0219 15:50:24.328050 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fc5r4" Feb 19 15:50:24 crc kubenswrapper[4810]: I0219 15:50:24.413068 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4b7ad7c-cc02-4a75-8ac7-47efd18067b9-catalog-content\") pod \"f4b7ad7c-cc02-4a75-8ac7-47efd18067b9\" (UID: \"f4b7ad7c-cc02-4a75-8ac7-47efd18067b9\") " Feb 19 15:50:24 crc kubenswrapper[4810]: I0219 15:50:24.413159 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4b7ad7c-cc02-4a75-8ac7-47efd18067b9-utilities\") pod \"f4b7ad7c-cc02-4a75-8ac7-47efd18067b9\" (UID: \"f4b7ad7c-cc02-4a75-8ac7-47efd18067b9\") " Feb 19 15:50:24 crc kubenswrapper[4810]: I0219 15:50:24.413249 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sm7d\" (UniqueName: \"kubernetes.io/projected/f4b7ad7c-cc02-4a75-8ac7-47efd18067b9-kube-api-access-8sm7d\") pod \"f4b7ad7c-cc02-4a75-8ac7-47efd18067b9\" (UID: \"f4b7ad7c-cc02-4a75-8ac7-47efd18067b9\") " Feb 19 15:50:24 crc kubenswrapper[4810]: I0219 15:50:24.414137 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4b7ad7c-cc02-4a75-8ac7-47efd18067b9-utilities" (OuterVolumeSpecName: "utilities") pod "f4b7ad7c-cc02-4a75-8ac7-47efd18067b9" (UID: "f4b7ad7c-cc02-4a75-8ac7-47efd18067b9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:50:24 crc kubenswrapper[4810]: I0219 15:50:24.426658 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4b7ad7c-cc02-4a75-8ac7-47efd18067b9-kube-api-access-8sm7d" (OuterVolumeSpecName: "kube-api-access-8sm7d") pod "f4b7ad7c-cc02-4a75-8ac7-47efd18067b9" (UID: "f4b7ad7c-cc02-4a75-8ac7-47efd18067b9"). InnerVolumeSpecName "kube-api-access-8sm7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:50:24 crc kubenswrapper[4810]: I0219 15:50:24.485979 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4b7ad7c-cc02-4a75-8ac7-47efd18067b9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4b7ad7c-cc02-4a75-8ac7-47efd18067b9" (UID: "f4b7ad7c-cc02-4a75-8ac7-47efd18067b9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:50:24 crc kubenswrapper[4810]: I0219 15:50:24.515971 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4b7ad7c-cc02-4a75-8ac7-47efd18067b9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 15:50:24 crc kubenswrapper[4810]: I0219 15:50:24.516018 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4b7ad7c-cc02-4a75-8ac7-47efd18067b9-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 15:50:24 crc kubenswrapper[4810]: I0219 15:50:24.516031 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sm7d\" (UniqueName: \"kubernetes.io/projected/f4b7ad7c-cc02-4a75-8ac7-47efd18067b9-kube-api-access-8sm7d\") on node \"crc\" DevicePath \"\"" Feb 19 15:50:24 crc kubenswrapper[4810]: I0219 15:50:24.754583 4810 generic.go:334] "Generic (PLEG): container finished" podID="f4b7ad7c-cc02-4a75-8ac7-47efd18067b9" containerID="7100a0cce48fd72efe5710c85a2b6e7a60fd1424a4a404164ae5118372e83f43" exitCode=0 Feb 19 15:50:24 crc kubenswrapper[4810]: I0219 15:50:24.754620 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fc5r4" event={"ID":"f4b7ad7c-cc02-4a75-8ac7-47efd18067b9","Type":"ContainerDied","Data":"7100a0cce48fd72efe5710c85a2b6e7a60fd1424a4a404164ae5118372e83f43"} Feb 19 15:50:24 crc kubenswrapper[4810]: I0219 15:50:24.755056 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fc5r4" event={"ID":"f4b7ad7c-cc02-4a75-8ac7-47efd18067b9","Type":"ContainerDied","Data":"c43fde9c5e8b2fa870f56fd8d2482a1979719f8bc8a8500731f79f842e1747c0"} Feb 19 15:50:24 crc kubenswrapper[4810]: I0219 15:50:24.754628 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fc5r4" Feb 19 15:50:24 crc kubenswrapper[4810]: I0219 15:50:24.755085 4810 scope.go:117] "RemoveContainer" containerID="7100a0cce48fd72efe5710c85a2b6e7a60fd1424a4a404164ae5118372e83f43" Feb 19 15:50:24 crc kubenswrapper[4810]: I0219 15:50:24.783620 4810 scope.go:117] "RemoveContainer" containerID="8510db2e629e8bb213e4a8889c2e19ef5716ff9167ee722bb255cd38156b4002" Feb 19 15:50:24 crc kubenswrapper[4810]: I0219 15:50:24.801215 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fc5r4"] Feb 19 15:50:24 crc kubenswrapper[4810]: I0219 15:50:24.808816 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fc5r4"] Feb 19 15:50:24 crc kubenswrapper[4810]: I0219 15:50:24.833611 4810 scope.go:117] "RemoveContainer" containerID="4aca170e13e53204b5023bdecba116a7203aa5391a1b34fd52cfc2d7f859375e" Feb 19 15:50:24 crc kubenswrapper[4810]: I0219 15:50:24.861640 4810 scope.go:117] "RemoveContainer" containerID="7100a0cce48fd72efe5710c85a2b6e7a60fd1424a4a404164ae5118372e83f43" Feb 19 15:50:24 crc kubenswrapper[4810]: E0219 15:50:24.862388 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7100a0cce48fd72efe5710c85a2b6e7a60fd1424a4a404164ae5118372e83f43\": container with ID starting with 7100a0cce48fd72efe5710c85a2b6e7a60fd1424a4a404164ae5118372e83f43 not found: ID does not exist" containerID="7100a0cce48fd72efe5710c85a2b6e7a60fd1424a4a404164ae5118372e83f43" Feb 19 15:50:24 crc kubenswrapper[4810]: I0219 15:50:24.862535 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7100a0cce48fd72efe5710c85a2b6e7a60fd1424a4a404164ae5118372e83f43"} err="failed to get container status \"7100a0cce48fd72efe5710c85a2b6e7a60fd1424a4a404164ae5118372e83f43\": rpc error: code = NotFound desc = could not find container \"7100a0cce48fd72efe5710c85a2b6e7a60fd1424a4a404164ae5118372e83f43\": container with ID starting with 7100a0cce48fd72efe5710c85a2b6e7a60fd1424a4a404164ae5118372e83f43 not found: ID does not exist" Feb 19 15:50:24 crc kubenswrapper[4810]: I0219 15:50:24.862653 4810 scope.go:117] "RemoveContainer" containerID="8510db2e629e8bb213e4a8889c2e19ef5716ff9167ee722bb255cd38156b4002" Feb 19 15:50:24 crc kubenswrapper[4810]: E0219 15:50:24.863172 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8510db2e629e8bb213e4a8889c2e19ef5716ff9167ee722bb255cd38156b4002\": container with ID starting with 8510db2e629e8bb213e4a8889c2e19ef5716ff9167ee722bb255cd38156b4002 not found: ID does not exist" containerID="8510db2e629e8bb213e4a8889c2e19ef5716ff9167ee722bb255cd38156b4002" Feb 19 15:50:24 crc kubenswrapper[4810]: I0219 15:50:24.863205 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8510db2e629e8bb213e4a8889c2e19ef5716ff9167ee722bb255cd38156b4002"} err="failed to get container status \"8510db2e629e8bb213e4a8889c2e19ef5716ff9167ee722bb255cd38156b4002\": rpc error: code = NotFound desc = could not find container \"8510db2e629e8bb213e4a8889c2e19ef5716ff9167ee722bb255cd38156b4002\": container with ID starting with 8510db2e629e8bb213e4a8889c2e19ef5716ff9167ee722bb255cd38156b4002 not found: ID does not exist" Feb 19 15:50:24 crc kubenswrapper[4810]: I0219 15:50:24.863229 4810 scope.go:117] "RemoveContainer" 
containerID="4aca170e13e53204b5023bdecba116a7203aa5391a1b34fd52cfc2d7f859375e" Feb 19 15:50:24 crc kubenswrapper[4810]: E0219 15:50:24.863715 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4aca170e13e53204b5023bdecba116a7203aa5391a1b34fd52cfc2d7f859375e\": container with ID starting with 4aca170e13e53204b5023bdecba116a7203aa5391a1b34fd52cfc2d7f859375e not found: ID does not exist" containerID="4aca170e13e53204b5023bdecba116a7203aa5391a1b34fd52cfc2d7f859375e" Feb 19 15:50:24 crc kubenswrapper[4810]: I0219 15:50:24.863821 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4aca170e13e53204b5023bdecba116a7203aa5391a1b34fd52cfc2d7f859375e"} err="failed to get container status \"4aca170e13e53204b5023bdecba116a7203aa5391a1b34fd52cfc2d7f859375e\": rpc error: code = NotFound desc = could not find container \"4aca170e13e53204b5023bdecba116a7203aa5391a1b34fd52cfc2d7f859375e\": container with ID starting with 4aca170e13e53204b5023bdecba116a7203aa5391a1b34fd52cfc2d7f859375e not found: ID does not exist" Feb 19 15:50:25 crc kubenswrapper[4810]: I0219 15:50:25.457424 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b7ad7c-cc02-4a75-8ac7-47efd18067b9" path="/var/lib/kubelet/pods/f4b7ad7c-cc02-4a75-8ac7-47efd18067b9/volumes" Feb 19 15:50:29 crc kubenswrapper[4810]: I0219 15:50:29.439906 4810 scope.go:117] "RemoveContainer" containerID="75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f" Feb 19 15:50:29 crc kubenswrapper[4810]: E0219 15:50:29.441113 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:50:32 crc kubenswrapper[4810]: I0219 15:50:32.329454 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4gb7b" Feb 19 15:50:32 crc kubenswrapper[4810]: I0219 15:50:32.392965 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4gb7b" Feb 19 15:50:32 crc kubenswrapper[4810]: I0219 15:50:32.564152 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4gb7b"] Feb 19 15:50:33 crc kubenswrapper[4810]: I0219 15:50:33.900147 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4gb7b" podUID="505f9989-9548-4391-b758-33ef9484f145" containerName="registry-server" containerID="cri-o://d13f04dd8f83d8f3b49c862c6aa6e98dfeeac1f7ae0d8ec94ede6a74195918b3" gracePeriod=2 Feb 19 15:50:34 crc kubenswrapper[4810]: I0219 15:50:34.428458 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4gb7b" Feb 19 15:50:34 crc kubenswrapper[4810]: I0219 15:50:34.548239 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/505f9989-9548-4391-b758-33ef9484f145-utilities\") pod \"505f9989-9548-4391-b758-33ef9484f145\" (UID: \"505f9989-9548-4391-b758-33ef9484f145\") " Feb 19 15:50:34 crc kubenswrapper[4810]: I0219 15:50:34.548288 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/505f9989-9548-4391-b758-33ef9484f145-catalog-content\") pod \"505f9989-9548-4391-b758-33ef9484f145\" (UID: \"505f9989-9548-4391-b758-33ef9484f145\") " Feb 19 15:50:34 crc kubenswrapper[4810]: I0219 15:50:34.548347 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwzp4\" (UniqueName: \"kubernetes.io/projected/505f9989-9548-4391-b758-33ef9484f145-kube-api-access-pwzp4\") pod \"505f9989-9548-4391-b758-33ef9484f145\" (UID: \"505f9989-9548-4391-b758-33ef9484f145\") " Feb 19 15:50:34 crc kubenswrapper[4810]: I0219 15:50:34.550968 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/505f9989-9548-4391-b758-33ef9484f145-utilities" (OuterVolumeSpecName: "utilities") pod "505f9989-9548-4391-b758-33ef9484f145" (UID: "505f9989-9548-4391-b758-33ef9484f145"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:50:34 crc kubenswrapper[4810]: I0219 15:50:34.556851 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/505f9989-9548-4391-b758-33ef9484f145-kube-api-access-pwzp4" (OuterVolumeSpecName: "kube-api-access-pwzp4") pod "505f9989-9548-4391-b758-33ef9484f145" (UID: "505f9989-9548-4391-b758-33ef9484f145"). InnerVolumeSpecName "kube-api-access-pwzp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:50:34 crc kubenswrapper[4810]: I0219 15:50:34.651232 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/505f9989-9548-4391-b758-33ef9484f145-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 15:50:34 crc kubenswrapper[4810]: I0219 15:50:34.651281 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwzp4\" (UniqueName: \"kubernetes.io/projected/505f9989-9548-4391-b758-33ef9484f145-kube-api-access-pwzp4\") on node \"crc\" DevicePath \"\"" Feb 19 15:50:34 crc kubenswrapper[4810]: I0219 15:50:34.716624 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/505f9989-9548-4391-b758-33ef9484f145-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "505f9989-9548-4391-b758-33ef9484f145" (UID: "505f9989-9548-4391-b758-33ef9484f145"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:50:34 crc kubenswrapper[4810]: I0219 15:50:34.752823 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/505f9989-9548-4391-b758-33ef9484f145-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 15:50:34 crc kubenswrapper[4810]: I0219 15:50:34.915442 4810 generic.go:334] "Generic (PLEG): container finished" podID="505f9989-9548-4391-b758-33ef9484f145" containerID="d13f04dd8f83d8f3b49c862c6aa6e98dfeeac1f7ae0d8ec94ede6a74195918b3" exitCode=0 Feb 19 15:50:34 crc kubenswrapper[4810]: I0219 15:50:34.915509 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gb7b" event={"ID":"505f9989-9548-4391-b758-33ef9484f145","Type":"ContainerDied","Data":"d13f04dd8f83d8f3b49c862c6aa6e98dfeeac1f7ae0d8ec94ede6a74195918b3"} Feb 19 15:50:34 crc kubenswrapper[4810]: I0219 15:50:34.915566 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gb7b" event={"ID":"505f9989-9548-4391-b758-33ef9484f145","Type":"ContainerDied","Data":"8abdba986b1625129435e6790b7e3757e1e286235597f9e2b12f7877f6574c61"} Feb 19 15:50:34 crc kubenswrapper[4810]: I0219 15:50:34.915574 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4gb7b" Feb 19 15:50:34 crc kubenswrapper[4810]: I0219 15:50:34.915595 4810 scope.go:117] "RemoveContainer" containerID="d13f04dd8f83d8f3b49c862c6aa6e98dfeeac1f7ae0d8ec94ede6a74195918b3" Feb 19 15:50:34 crc kubenswrapper[4810]: I0219 15:50:34.945732 4810 scope.go:117] "RemoveContainer" containerID="9156315524b62b8e2dc59d942645a9530f1aefb6429d80d2e3bb2c9ad987cb72" Feb 19 15:50:34 crc kubenswrapper[4810]: I0219 15:50:34.975597 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4gb7b"] Feb 19 15:50:34 crc kubenswrapper[4810]: I0219 15:50:34.990189 4810 scope.go:117] "RemoveContainer" containerID="98c0f22403cbf6d1afd69ca2613abb589784f884be7c977062718a384a79f1d6" Feb 19 15:50:34 crc kubenswrapper[4810]: I0219 15:50:34.992195 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4gb7b"] Feb 19 15:50:35 crc kubenswrapper[4810]: I0219 15:50:35.048782 4810 scope.go:117] "RemoveContainer" containerID="d13f04dd8f83d8f3b49c862c6aa6e98dfeeac1f7ae0d8ec94ede6a74195918b3" Feb 19 15:50:35 crc kubenswrapper[4810]: E0219 15:50:35.049363 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d13f04dd8f83d8f3b49c862c6aa6e98dfeeac1f7ae0d8ec94ede6a74195918b3\": container with ID starting with d13f04dd8f83d8f3b49c862c6aa6e98dfeeac1f7ae0d8ec94ede6a74195918b3 not found: ID does not exist" containerID="d13f04dd8f83d8f3b49c862c6aa6e98dfeeac1f7ae0d8ec94ede6a74195918b3" Feb 19 15:50:35 crc kubenswrapper[4810]: I0219 15:50:35.049411 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d13f04dd8f83d8f3b49c862c6aa6e98dfeeac1f7ae0d8ec94ede6a74195918b3"} err="failed to get container status \"d13f04dd8f83d8f3b49c862c6aa6e98dfeeac1f7ae0d8ec94ede6a74195918b3\": rpc error: code = NotFound desc = could not find container \"d13f04dd8f83d8f3b49c862c6aa6e98dfeeac1f7ae0d8ec94ede6a74195918b3\": container with ID starting with d13f04dd8f83d8f3b49c862c6aa6e98dfeeac1f7ae0d8ec94ede6a74195918b3 not found: ID does not exist" Feb 19 15:50:35 crc 
kubenswrapper[4810]: I0219 15:50:35.049443 4810 scope.go:117] "RemoveContainer" containerID="9156315524b62b8e2dc59d942645a9530f1aefb6429d80d2e3bb2c9ad987cb72" Feb 19 15:50:35 crc kubenswrapper[4810]: E0219 15:50:35.049811 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9156315524b62b8e2dc59d942645a9530f1aefb6429d80d2e3bb2c9ad987cb72\": container with ID starting with 9156315524b62b8e2dc59d942645a9530f1aefb6429d80d2e3bb2c9ad987cb72 not found: ID does not exist" containerID="9156315524b62b8e2dc59d942645a9530f1aefb6429d80d2e3bb2c9ad987cb72" Feb 19 15:50:35 crc kubenswrapper[4810]: I0219 15:50:35.049854 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9156315524b62b8e2dc59d942645a9530f1aefb6429d80d2e3bb2c9ad987cb72"} err="failed to get container status \"9156315524b62b8e2dc59d942645a9530f1aefb6429d80d2e3bb2c9ad987cb72\": rpc error: code = NotFound desc = could not find container \"9156315524b62b8e2dc59d942645a9530f1aefb6429d80d2e3bb2c9ad987cb72\": container with ID starting with 9156315524b62b8e2dc59d942645a9530f1aefb6429d80d2e3bb2c9ad987cb72 not found: ID does not exist" Feb 19 15:50:35 crc kubenswrapper[4810]: I0219 15:50:35.049895 4810 scope.go:117] "RemoveContainer" containerID="98c0f22403cbf6d1afd69ca2613abb589784f884be7c977062718a384a79f1d6" Feb 19 15:50:35 crc kubenswrapper[4810]: E0219 15:50:35.050222 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98c0f22403cbf6d1afd69ca2613abb589784f884be7c977062718a384a79f1d6\": container with ID starting with 98c0f22403cbf6d1afd69ca2613abb589784f884be7c977062718a384a79f1d6 not found: ID does not exist" containerID="98c0f22403cbf6d1afd69ca2613abb589784f884be7c977062718a384a79f1d6" Feb 19 15:50:35 crc kubenswrapper[4810]: I0219 15:50:35.050254 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98c0f22403cbf6d1afd69ca2613abb589784f884be7c977062718a384a79f1d6"} err="failed to get container status \"98c0f22403cbf6d1afd69ca2613abb589784f884be7c977062718a384a79f1d6\": rpc error: code = NotFound desc = could not find container \"98c0f22403cbf6d1afd69ca2613abb589784f884be7c977062718a384a79f1d6\": container with ID starting with 98c0f22403cbf6d1afd69ca2613abb589784f884be7c977062718a384a79f1d6 not found: ID does not exist" Feb 19 15:50:35 crc kubenswrapper[4810]: I0219 15:50:35.454632 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="505f9989-9548-4391-b758-33ef9484f145" path="/var/lib/kubelet/pods/505f9989-9548-4391-b758-33ef9484f145/volumes" Feb 19 15:50:40 crc kubenswrapper[4810]: I0219 15:50:40.439639 4810 scope.go:117] "RemoveContainer" containerID="75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f" Feb 19 15:50:40 crc kubenswrapper[4810]: E0219 15:50:40.440621 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:50:51 crc kubenswrapper[4810]: I0219 15:50:51.455222 4810 scope.go:117] "RemoveContainer" containerID="75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f" 
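The RemoveContainer / "ContainerStatus from runtime service failed" / "DeleteContainer returned error" triplets above show removals racing with cleanup that already happened on the CRI side: the runtime answers NotFound and the kubelet logs the error but carries on. A sketch of that idempotent-delete pattern, assuming gRPC status codes as in the rpc errors above (the helper name is illustrative, not the kubelet's):

// Idempotent-delete sketch: a CRI NotFound on a second removal attempt
// is reported and then treated as success, as the paired error/info
// records above suggest. removeContainer is a stand-in name.
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

func removeContainer(remove func(id string) error, id string) error {
	if err := remove(id); err != nil {
		if status.Code(err) == codes.NotFound {
			// Already gone (e.g. removed by an earlier pass); safe to ignore.
			fmt.Printf("container %q already removed: %v\n", id, err)
			return nil
		}
		return err
	}
	return nil
}

func main() {
	// Simulate a runtime that has already removed the container.
	alreadyGone := func(id string) error {
		return status.Error(codes.NotFound, "could not find container "+id)
	}
	if err := removeContainer(alreadyGone, "7100a0cc"); err != nil {
		panic(err)
	}
	fmt.Println("removal converged despite NotFound")
}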
Feb 19 15:50:51 crc kubenswrapper[4810]: E0219 15:50:51.456589 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
Feb 19 15:51:05 crc kubenswrapper[4810]: I0219 15:51:05.441862 4810 scope.go:117] "RemoveContainer" containerID="75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f"
Feb 19 15:51:05 crc kubenswrapper[4810]: E0219 15:51:05.442852 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
Feb 19 15:51:16 crc kubenswrapper[4810]: I0219 15:51:16.439664 4810 scope.go:117] "RemoveContainer" containerID="75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f"
Feb 19 15:51:16 crc kubenswrapper[4810]: E0219 15:51:16.440544 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
Feb 19 15:51:30 crc kubenswrapper[4810]: I0219 15:51:30.439288 4810 scope.go:117] "RemoveContainer" containerID="75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f"
Feb 19 15:51:30 crc kubenswrapper[4810]: E0219 15:51:30.440259 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
Feb 19 15:51:41 crc kubenswrapper[4810]: I0219 15:51:41.444899 4810 scope.go:117] "RemoveContainer" containerID="75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f"
Feb 19 15:51:41 crc kubenswrapper[4810]: E0219 15:51:41.445876 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
Feb 19 15:51:53 crc kubenswrapper[4810]: I0219 15:51:53.442640 4810 scope.go:117] "RemoveContainer" containerID="75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f"
Feb 19 15:51:53 crc kubenswrapper[4810]: E0219 15:51:53.443456 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
Feb 19 15:52:07 crc kubenswrapper[4810]: I0219 15:52:07.440705 4810 scope.go:117] "RemoveContainer" containerID="75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f"
Feb 19 15:52:07 crc kubenswrapper[4810]: E0219 15:52:07.441798 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
Feb 19 15:52:22 crc kubenswrapper[4810]: I0219 15:52:22.440603 4810 scope.go:117] "RemoveContainer" containerID="75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f"
Feb 19 15:52:23 crc kubenswrapper[4810]: I0219 15:52:23.149289 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerStarted","Data":"6287fea84a36b36e84360e8242ff92d8a387529e6d7bde8d2a6410fbcb262896"}
Feb 19 15:52:28 crc kubenswrapper[4810]: I0219 15:52:28.218982 4810 generic.go:334] "Generic (PLEG): container finished" podID="cc5014f8-e5aa-47ad-8787-c187b0f7f0e1" containerID="2e87240d488a9e3fc53ca295a40f539e65e717cb89ae7e5a0bdf479e1de61898" exitCode=0
Feb 19 15:52:28 crc kubenswrapper[4810]: I0219 15:52:28.219176 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" event={"ID":"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1","Type":"ContainerDied","Data":"2e87240d488a9e3fc53ca295a40f539e65e717cb89ae7e5a0bdf479e1de61898"}
Feb 19 15:52:29 crc kubenswrapper[4810]: I0219 15:52:29.781100 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh"
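From 15:50:16 through 15:52:07 the machine-config-daemon pod cycles through the same pair of records: a RemoveContainer attempt followed by "Error syncing pod, skipping ... back-off 5m0s". The restart delay has saturated at its cap, and the container is only started again at 15:52:22 once the back-off window expires. A sketch of that capped exponential back-off, assuming the kubelet's usual defaults of a 10s initial delay doubling per restart up to a 5m ceiling (the exact constants are kubelet internals and are assumptions here):

// Capped exponential back-off sketch for the repeated
// "back-off 5m0s restarting failed container" records above.
// The 10s base and 5m cap are assumed defaults, not read from this log.
package main

import (
	"fmt"
	"time"
)

func restartDelay(restarts int) time.Duration {
	const (
		initial = 10 * time.Second
		ceiling = 5 * time.Minute
	)
	d := initial
	for i := 0; i < restarts; i++ {
		d *= 2
		if d >= ceiling {
			return ceiling
		}
	}
	return d
}

func main() {
	for n := 0; n <= 6; n++ {
		fmt.Printf("restart %d: wait %s\n", n, restartDelay(n))
	}
	// Under these assumptions the delay saturates at 5m0s by the fifth
	// restart, which matches the message repeated while the pod sits in
	// CrashLoopBackOff above.
}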
Feb 19 15:52:29 crc kubenswrapper[4810]: I0219 15:52:29.928315 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-cell1-compute-config-2\") pod \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") "
Feb 19 15:52:29 crc kubenswrapper[4810]: I0219 15:52:29.928857 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-combined-ca-bundle\") pod \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") "
Feb 19 15:52:29 crc kubenswrapper[4810]: I0219 15:52:29.929055 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x224k\" (UniqueName: \"kubernetes.io/projected/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-kube-api-access-x224k\") pod \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") "
Feb 19 15:52:29 crc kubenswrapper[4810]: I0219 15:52:29.929525 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-extra-config-0\") pod \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") "
Feb 19 15:52:29 crc kubenswrapper[4810]: I0219 15:52:29.929672 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-migration-ssh-key-1\") pod \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") "
Feb 19 15:52:29 crc kubenswrapper[4810]: I0219 15:52:29.929799 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-cell1-compute-config-0\") pod \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") "
Feb 19 15:52:29 crc kubenswrapper[4810]: I0219 15:52:29.929910 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-inventory\") pod \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") "
Feb 19 15:52:29 crc kubenswrapper[4810]: I0219 15:52:29.930081 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-cell1-compute-config-3\") pod \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") "
Feb 19 15:52:29 crc kubenswrapper[4810]: I0219 15:52:29.930220 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-ssh-key-openstack-edpm-ipam\") pod \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") "
Feb 19 15:52:29 crc kubenswrapper[4810]: I0219 15:52:29.930739 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-cell1-compute-config-1\") pod \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") "
Feb 19 15:52:29 crc kubenswrapper[4810]: I0219 15:52:29.930902 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-migration-ssh-key-0\") pod \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\" (UID: \"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1\") "
Feb 19 15:52:29 crc kubenswrapper[4810]: I0219 15:52:29.936272 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-kube-api-access-x224k" (OuterVolumeSpecName: "kube-api-access-x224k") pod "cc5014f8-e5aa-47ad-8787-c187b0f7f0e1" (UID: "cc5014f8-e5aa-47ad-8787-c187b0f7f0e1"). InnerVolumeSpecName "kube-api-access-x224k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 15:52:29 crc kubenswrapper[4810]: I0219 15:52:29.937731 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "cc5014f8-e5aa-47ad-8787-c187b0f7f0e1" (UID: "cc5014f8-e5aa-47ad-8787-c187b0f7f0e1"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:52:29 crc kubenswrapper[4810]: I0219 15:52:29.963613 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "cc5014f8-e5aa-47ad-8787-c187b0f7f0e1" (UID: "cc5014f8-e5aa-47ad-8787-c187b0f7f0e1"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 15:52:29 crc kubenswrapper[4810]: I0219 15:52:29.965494 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "cc5014f8-e5aa-47ad-8787-c187b0f7f0e1" (UID: "cc5014f8-e5aa-47ad-8787-c187b0f7f0e1"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:52:29 crc kubenswrapper[4810]: I0219 15:52:29.966941 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "cc5014f8-e5aa-47ad-8787-c187b0f7f0e1" (UID: "cc5014f8-e5aa-47ad-8787-c187b0f7f0e1"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:52:29 crc kubenswrapper[4810]: I0219 15:52:29.968993 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "cc5014f8-e5aa-47ad-8787-c187b0f7f0e1" (UID: "cc5014f8-e5aa-47ad-8787-c187b0f7f0e1"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:52:29 crc kubenswrapper[4810]: I0219 15:52:29.972525 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-inventory" (OuterVolumeSpecName: "inventory") pod "cc5014f8-e5aa-47ad-8787-c187b0f7f0e1" (UID: "cc5014f8-e5aa-47ad-8787-c187b0f7f0e1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:52:29 crc kubenswrapper[4810]: I0219 15:52:29.973575 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cc5014f8-e5aa-47ad-8787-c187b0f7f0e1" (UID: "cc5014f8-e5aa-47ad-8787-c187b0f7f0e1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:52:29 crc kubenswrapper[4810]: I0219 15:52:29.981593 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "cc5014f8-e5aa-47ad-8787-c187b0f7f0e1" (UID: "cc5014f8-e5aa-47ad-8787-c187b0f7f0e1"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:52:29 crc kubenswrapper[4810]: I0219 15:52:29.984486 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "cc5014f8-e5aa-47ad-8787-c187b0f7f0e1" (UID: "cc5014f8-e5aa-47ad-8787-c187b0f7f0e1"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:52:29 crc kubenswrapper[4810]: I0219 15:52:29.998569 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "cc5014f8-e5aa-47ad-8787-c187b0f7f0e1" (UID: "cc5014f8-e5aa-47ad-8787-c187b0f7f0e1"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.038768 4810 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\""
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.038810 4810 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.038823 4810 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.038837 4810 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.038849 4810 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\""
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.038860 4810 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.038872 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x224k\" (UniqueName: \"kubernetes.io/projected/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-kube-api-access-x224k\") on node \"crc\" DevicePath \"\""
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.038884 4810 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-extra-config-0\") on node \"crc\" DevicePath \"\""
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.038895 4810 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.038908 4810 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.038920 4810 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc5014f8-e5aa-47ad-8787-c187b0f7f0e1-inventory\") on node \"crc\" DevicePath \"\""
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.247001 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh" event={"ID":"cc5014f8-e5aa-47ad-8787-c187b0f7f0e1","Type":"ContainerDied","Data":"ba6d44e109e65ba6e824f2d3f6c56a0eb16fcbd5ac7ce5cace72d39335df678a"}
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.247044 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba6d44e109e65ba6e824f2d3f6c56a0eb16fcbd5ac7ce5cace72d39335df678a"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.247086 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nv8wh"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.383073 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc"]
Feb 19 15:52:30 crc kubenswrapper[4810]: E0219 15:52:30.383517 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b7ad7c-cc02-4a75-8ac7-47efd18067b9" containerName="extract-content"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.383538 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b7ad7c-cc02-4a75-8ac7-47efd18067b9" containerName="extract-content"
Feb 19 15:52:30 crc kubenswrapper[4810]: E0219 15:52:30.383555 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc5014f8-e5aa-47ad-8787-c187b0f7f0e1" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.383564 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc5014f8-e5aa-47ad-8787-c187b0f7f0e1" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Feb 19 15:52:30 crc kubenswrapper[4810]: E0219 15:52:30.383579 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b7ad7c-cc02-4a75-8ac7-47efd18067b9" containerName="registry-server"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.383587 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b7ad7c-cc02-4a75-8ac7-47efd18067b9" containerName="registry-server"
Feb 19 15:52:30 crc kubenswrapper[4810]: E0219 15:52:30.383608 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="505f9989-9548-4391-b758-33ef9484f145" containerName="extract-content"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.383633 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="505f9989-9548-4391-b758-33ef9484f145" containerName="extract-content"
Feb 19 15:52:30 crc kubenswrapper[4810]: E0219 15:52:30.383656 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="505f9989-9548-4391-b758-33ef9484f145" containerName="registry-server"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.383663 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="505f9989-9548-4391-b758-33ef9484f145" containerName="registry-server"
Feb 19 15:52:30 crc kubenswrapper[4810]: E0219 15:52:30.383677 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="505f9989-9548-4391-b758-33ef9484f145" containerName="extract-utilities"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.383685 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="505f9989-9548-4391-b758-33ef9484f145" containerName="extract-utilities"
Feb 19 15:52:30 crc kubenswrapper[4810]: E0219 15:52:30.383696 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b7ad7c-cc02-4a75-8ac7-47efd18067b9" containerName="extract-utilities"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.383704 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b7ad7c-cc02-4a75-8ac7-47efd18067b9" containerName="extract-utilities"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.383933 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b7ad7c-cc02-4a75-8ac7-47efd18067b9" containerName="registry-server"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.383954 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="505f9989-9548-4391-b758-33ef9484f145" containerName="registry-server"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.383974 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc5014f8-e5aa-47ad-8787-c187b0f7f0e1" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.384720 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.388037 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.388408 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-djc4q"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.388492 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.388693 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.388845 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.401027 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc"]
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.446818 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.446896 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.446947 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldvxh\" (UniqueName: \"kubernetes.io/projected/f7ca8c9a-db61-400f-9319-21590462f929-kube-api-access-ldvxh\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.447003 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.447024 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.447058 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.447120 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.549126 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldvxh\" (UniqueName: \"kubernetes.io/projected/f7ca8c9a-db61-400f-9319-21590462f929-kube-api-access-ldvxh\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.549266 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.549362 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.549425 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.549528 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.549647 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.549759 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.554406 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.554918 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.555258 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.555720 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.556140 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.558259 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.567036 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldvxh\" (UniqueName: \"kubernetes.io/projected/f7ca8c9a-db61-400f-9319-21590462f929-kube-api-access-ldvxh\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc"
Feb 19 15:52:30 crc kubenswrapper[4810]: I0219 15:52:30.720203 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc"
Feb 19 15:52:31 crc kubenswrapper[4810]: I0219 15:52:31.294518 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc"]
Feb 19 15:52:31 crc kubenswrapper[4810]: W0219 15:52:31.300413 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7ca8c9a_db61_400f_9319_21590462f929.slice/crio-20a3b9e0f65be3023f64999fad259ea78baedb786dcbbe39432440197d3ff6aa WatchSource:0}: Error finding container 20a3b9e0f65be3023f64999fad259ea78baedb786dcbbe39432440197d3ff6aa: Status 404 returned error can't find the container with id 20a3b9e0f65be3023f64999fad259ea78baedb786dcbbe39432440197d3ff6aa
Feb 19 15:52:31 crc kubenswrapper[4810]: I0219 15:52:31.818362 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 15:52:32 crc kubenswrapper[4810]: I0219 15:52:32.267787 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc" event={"ID":"f7ca8c9a-db61-400f-9319-21590462f929","Type":"ContainerStarted","Data":"d0eed7b2d2efeb2bed27dcf249640e7dd353ed994126d4417aa3f9b188221638"}
Feb 19 15:52:32 crc kubenswrapper[4810]: I0219 15:52:32.268193 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc" event={"ID":"f7ca8c9a-db61-400f-9319-21590462f929","Type":"ContainerStarted","Data":"20a3b9e0f65be3023f64999fad259ea78baedb786dcbbe39432440197d3ff6aa"}
Feb 19 15:52:32 crc kubenswrapper[4810]: I0219 15:52:32.296684 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc" podStartSLOduration=1.784789517 podStartE2EDuration="2.296666748s" podCreationTimestamp="2026-02-19 15:52:30 +0000 UTC" firstStartedPulling="2026-02-19 15:52:31.303411006 +0000 UTC m=+2580.785441140" lastFinishedPulling="2026-02-19 15:52:31.815288207 +0000 UTC m=+2581.297318371" observedRunningTime="2026-02-19 15:52:32.289434098 +0000 UTC m=+2581.771464212" watchObservedRunningTime="2026-02-19 15:52:32.296666748 +0000 UTC m=+2581.778696872"
Feb 19 15:53:30 crc kubenswrapper[4810]: I0219 15:53:30.365151 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j8tzn"]
Feb 19 15:53:30 crc kubenswrapper[4810]: I0219 15:53:30.374621 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j8tzn"
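The pod_startup_latency_tracker records above carry enough data to check the published numbers: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that end-to-end figure minus the image-pull interval, with the pull interval taken from the monotonic m=+ offsets. For the telemetry pod: 2.296666748s - (2581.297318371s - 2580.785441140s) = 1.784789517s, exactly the logged podStartSLOduration. A short check of that arithmetic (the formula is inferred from these records, not documented here):

// Reproduces the startup-latency arithmetic of the
// pod_startup_latency_tracker record above. The SLO-vs-E2E relation is
// an inference from the numbers in this log, not a stated formula.
package main

import (
	"fmt"
	"time"
)

func main() {
	// Values from the telemetry-edpm-deployment record above.
	e2e := 2296666748 * time.Nanosecond // podStartE2EDuration="2.296666748s"
	firstStartedPulling := 2580.785441140 // m=+ offset, seconds
	lastFinishedPulling := 2581.297318371 // m=+ offset, seconds

	pull := time.Duration((lastFinishedPulling - firstStartedPulling) * float64(time.Second))
	slo := e2e - pull
	fmt.Println(slo.Seconds()) // ≈ 1.784789517, matching podStartSLOduration
}

The same relation holds for the redhat-operators-4gb7b record earlier in this log (8.727501448s end-to-end minus a 4.810069064s pull window gives the logged 3.917432384), which supports the inference.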
Feb 19 15:53:30 crc kubenswrapper[4810]: I0219 15:53:30.405522 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j8tzn"]
Feb 19 15:53:30 crc kubenswrapper[4810]: I0219 15:53:30.460581 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af674839-9e12-4569-bee5-abcf06b09ee4-catalog-content\") pod \"redhat-marketplace-j8tzn\" (UID: \"af674839-9e12-4569-bee5-abcf06b09ee4\") " pod="openshift-marketplace/redhat-marketplace-j8tzn"
Feb 19 15:53:30 crc kubenswrapper[4810]: I0219 15:53:30.460676 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fmk9\" (UniqueName: \"kubernetes.io/projected/af674839-9e12-4569-bee5-abcf06b09ee4-kube-api-access-2fmk9\") pod \"redhat-marketplace-j8tzn\" (UID: \"af674839-9e12-4569-bee5-abcf06b09ee4\") " pod="openshift-marketplace/redhat-marketplace-j8tzn"
Feb 19 15:53:30 crc kubenswrapper[4810]: I0219 15:53:30.460719 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af674839-9e12-4569-bee5-abcf06b09ee4-utilities\") pod \"redhat-marketplace-j8tzn\" (UID: \"af674839-9e12-4569-bee5-abcf06b09ee4\") " pod="openshift-marketplace/redhat-marketplace-j8tzn"
Feb 19 15:53:30 crc kubenswrapper[4810]: I0219 15:53:30.563369 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af674839-9e12-4569-bee5-abcf06b09ee4-utilities\") pod \"redhat-marketplace-j8tzn\" (UID: \"af674839-9e12-4569-bee5-abcf06b09ee4\") " pod="openshift-marketplace/redhat-marketplace-j8tzn"
Feb 19 15:53:30 crc kubenswrapper[4810]: I0219 15:53:30.563628 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af674839-9e12-4569-bee5-abcf06b09ee4-catalog-content\") pod \"redhat-marketplace-j8tzn\" (UID: \"af674839-9e12-4569-bee5-abcf06b09ee4\") " pod="openshift-marketplace/redhat-marketplace-j8tzn"
Feb 19 15:53:30 crc kubenswrapper[4810]: I0219 15:53:30.563687 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fmk9\" (UniqueName: \"kubernetes.io/projected/af674839-9e12-4569-bee5-abcf06b09ee4-kube-api-access-2fmk9\") pod \"redhat-marketplace-j8tzn\" (UID: \"af674839-9e12-4569-bee5-abcf06b09ee4\") " pod="openshift-marketplace/redhat-marketplace-j8tzn"
Feb 19 15:53:30 crc kubenswrapper[4810]: I0219 15:53:30.564459 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af674839-9e12-4569-bee5-abcf06b09ee4-utilities\") pod \"redhat-marketplace-j8tzn\" (UID: \"af674839-9e12-4569-bee5-abcf06b09ee4\") " pod="openshift-marketplace/redhat-marketplace-j8tzn"
Feb 19 15:53:30 crc kubenswrapper[4810]: I0219 15:53:30.564471 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af674839-9e12-4569-bee5-abcf06b09ee4-catalog-content\") pod \"redhat-marketplace-j8tzn\" (UID: \"af674839-9e12-4569-bee5-abcf06b09ee4\") " pod="openshift-marketplace/redhat-marketplace-j8tzn"
Feb 19 15:53:30 crc kubenswrapper[4810]: I0219 15:53:30.587967 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fmk9\" (UniqueName: \"kubernetes.io/projected/af674839-9e12-4569-bee5-abcf06b09ee4-kube-api-access-2fmk9\") pod \"redhat-marketplace-j8tzn\" (UID: \"af674839-9e12-4569-bee5-abcf06b09ee4\") " pod="openshift-marketplace/redhat-marketplace-j8tzn"
Feb 19 15:53:30 crc kubenswrapper[4810]: I0219 15:53:30.719938 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j8tzn"
Feb 19 15:53:31 crc kubenswrapper[4810]: I0219 15:53:31.257215 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j8tzn"]
Feb 19 15:53:32 crc kubenswrapper[4810]: I0219 15:53:32.015848 4810 generic.go:334] "Generic (PLEG): container finished" podID="af674839-9e12-4569-bee5-abcf06b09ee4" containerID="896186bd577c6d6da0fe0df76aea6a2e25eecc44dada34f1358dac166c3d7478" exitCode=0
Feb 19 15:53:32 crc kubenswrapper[4810]: I0219 15:53:32.016087 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j8tzn" event={"ID":"af674839-9e12-4569-bee5-abcf06b09ee4","Type":"ContainerDied","Data":"896186bd577c6d6da0fe0df76aea6a2e25eecc44dada34f1358dac166c3d7478"}
Feb 19 15:53:32 crc kubenswrapper[4810]: I0219 15:53:32.020111 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j8tzn" event={"ID":"af674839-9e12-4569-bee5-abcf06b09ee4","Type":"ContainerStarted","Data":"2dca8097edd5d82f15db95e6d4de2a250e4adea34e9887f2e668fad9bb9de061"}
Feb 19 15:53:33 crc kubenswrapper[4810]: I0219 15:53:33.029627 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j8tzn" event={"ID":"af674839-9e12-4569-bee5-abcf06b09ee4","Type":"ContainerStarted","Data":"f336b0a144f66301814adca72bf53897501afb1e619a9505a15772da7d9fb52e"}
Feb 19 15:53:34 crc kubenswrapper[4810]: I0219 15:53:34.045390 4810 generic.go:334] "Generic (PLEG): container finished" podID="af674839-9e12-4569-bee5-abcf06b09ee4" containerID="f336b0a144f66301814adca72bf53897501afb1e619a9505a15772da7d9fb52e" exitCode=0
Feb 19 15:53:34 crc kubenswrapper[4810]: I0219 15:53:34.045440 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j8tzn" event={"ID":"af674839-9e12-4569-bee5-abcf06b09ee4","Type":"ContainerDied","Data":"f336b0a144f66301814adca72bf53897501afb1e619a9505a15772da7d9fb52e"}
Feb 19 15:53:35 crc kubenswrapper[4810]: I0219 15:53:35.057881 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j8tzn" event={"ID":"af674839-9e12-4569-bee5-abcf06b09ee4","Type":"ContainerStarted","Data":"7497e6376bfd2621cb0d8554208fdc97c2fee88ae0127ebe48b955d5a65f7526"}
Feb 19 15:53:35 crc kubenswrapper[4810]: I0219 15:53:35.082481 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j8tzn" podStartSLOduration=2.651895813 podStartE2EDuration="5.082463968s" podCreationTimestamp="2026-02-19 15:53:30 +0000 UTC" firstStartedPulling="2026-02-19 15:53:32.018768714 +0000 UTC m=+2641.500798848" lastFinishedPulling="2026-02-19 15:53:34.449336869 +0000 UTC m=+2643.931367003" observedRunningTime="2026-02-19 15:53:35.080023737 +0000 UTC m=+2644.562053871" watchObservedRunningTime="2026-02-19 15:53:35.082463968 +0000 UTC m=+2644.564494102"
Feb 19 15:53:36 crc kubenswrapper[4810]: E0219 15:53:36.515265 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2"
err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf674839_9e12_4569_bee5_abcf06b09ee4.slice/crio-conmon-896186bd577c6d6da0fe0df76aea6a2e25eecc44dada34f1358dac166c3d7478.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf674839_9e12_4569_bee5_abcf06b09ee4.slice/crio-896186bd577c6d6da0fe0df76aea6a2e25eecc44dada34f1358dac166c3d7478.scope\": RecentStats: unable to find data in memory cache]" Feb 19 15:53:40 crc kubenswrapper[4810]: I0219 15:53:40.720474 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j8tzn" Feb 19 15:53:40 crc kubenswrapper[4810]: I0219 15:53:40.721455 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j8tzn" Feb 19 15:53:40 crc kubenswrapper[4810]: I0219 15:53:40.796961 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j8tzn" Feb 19 15:53:41 crc kubenswrapper[4810]: I0219 15:53:41.195582 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j8tzn" Feb 19 15:53:41 crc kubenswrapper[4810]: I0219 15:53:41.253381 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j8tzn"] Feb 19 15:53:43 crc kubenswrapper[4810]: I0219 15:53:43.148000 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j8tzn" podUID="af674839-9e12-4569-bee5-abcf06b09ee4" containerName="registry-server" containerID="cri-o://7497e6376bfd2621cb0d8554208fdc97c2fee88ae0127ebe48b955d5a65f7526" gracePeriod=2 Feb 19 15:53:43 crc kubenswrapper[4810]: I0219 15:53:43.749163 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j8tzn" Feb 19 15:53:43 crc kubenswrapper[4810]: I0219 15:53:43.787544 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fmk9\" (UniqueName: \"kubernetes.io/projected/af674839-9e12-4569-bee5-abcf06b09ee4-kube-api-access-2fmk9\") pod \"af674839-9e12-4569-bee5-abcf06b09ee4\" (UID: \"af674839-9e12-4569-bee5-abcf06b09ee4\") " Feb 19 15:53:43 crc kubenswrapper[4810]: I0219 15:53:43.787906 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af674839-9e12-4569-bee5-abcf06b09ee4-utilities\") pod \"af674839-9e12-4569-bee5-abcf06b09ee4\" (UID: \"af674839-9e12-4569-bee5-abcf06b09ee4\") " Feb 19 15:53:43 crc kubenswrapper[4810]: I0219 15:53:43.788182 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af674839-9e12-4569-bee5-abcf06b09ee4-catalog-content\") pod \"af674839-9e12-4569-bee5-abcf06b09ee4\" (UID: \"af674839-9e12-4569-bee5-abcf06b09ee4\") " Feb 19 15:53:43 crc kubenswrapper[4810]: I0219 15:53:43.789065 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af674839-9e12-4569-bee5-abcf06b09ee4-utilities" (OuterVolumeSpecName: "utilities") pod "af674839-9e12-4569-bee5-abcf06b09ee4" (UID: "af674839-9e12-4569-bee5-abcf06b09ee4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:53:43 crc kubenswrapper[4810]: I0219 15:53:43.801872 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af674839-9e12-4569-bee5-abcf06b09ee4-kube-api-access-2fmk9" (OuterVolumeSpecName: "kube-api-access-2fmk9") pod "af674839-9e12-4569-bee5-abcf06b09ee4" (UID: "af674839-9e12-4569-bee5-abcf06b09ee4"). InnerVolumeSpecName "kube-api-access-2fmk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:53:43 crc kubenswrapper[4810]: I0219 15:53:43.841260 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af674839-9e12-4569-bee5-abcf06b09ee4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af674839-9e12-4569-bee5-abcf06b09ee4" (UID: "af674839-9e12-4569-bee5-abcf06b09ee4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:53:43 crc kubenswrapper[4810]: I0219 15:53:43.891390 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af674839-9e12-4569-bee5-abcf06b09ee4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 15:53:43 crc kubenswrapper[4810]: I0219 15:53:43.891680 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fmk9\" (UniqueName: \"kubernetes.io/projected/af674839-9e12-4569-bee5-abcf06b09ee4-kube-api-access-2fmk9\") on node \"crc\" DevicePath \"\"" Feb 19 15:53:43 crc kubenswrapper[4810]: I0219 15:53:43.891766 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af674839-9e12-4569-bee5-abcf06b09ee4-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 15:53:44 crc kubenswrapper[4810]: I0219 15:53:44.158592 4810 generic.go:334] "Generic (PLEG): container finished" podID="af674839-9e12-4569-bee5-abcf06b09ee4" containerID="7497e6376bfd2621cb0d8554208fdc97c2fee88ae0127ebe48b955d5a65f7526" exitCode=0 Feb 19 15:53:44 crc kubenswrapper[4810]: I0219 15:53:44.158654 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j8tzn" Feb 19 15:53:44 crc kubenswrapper[4810]: I0219 15:53:44.158654 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j8tzn" event={"ID":"af674839-9e12-4569-bee5-abcf06b09ee4","Type":"ContainerDied","Data":"7497e6376bfd2621cb0d8554208fdc97c2fee88ae0127ebe48b955d5a65f7526"} Feb 19 15:53:44 crc kubenswrapper[4810]: I0219 15:53:44.158835 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j8tzn" event={"ID":"af674839-9e12-4569-bee5-abcf06b09ee4","Type":"ContainerDied","Data":"2dca8097edd5d82f15db95e6d4de2a250e4adea34e9887f2e668fad9bb9de061"} Feb 19 15:53:44 crc kubenswrapper[4810]: I0219 15:53:44.158869 4810 scope.go:117] "RemoveContainer" containerID="7497e6376bfd2621cb0d8554208fdc97c2fee88ae0127ebe48b955d5a65f7526" Feb 19 15:53:44 crc kubenswrapper[4810]: I0219 15:53:44.180812 4810 scope.go:117] "RemoveContainer" containerID="f336b0a144f66301814adca72bf53897501afb1e619a9505a15772da7d9fb52e" Feb 19 15:53:44 crc kubenswrapper[4810]: I0219 15:53:44.199454 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j8tzn"] Feb 19 15:53:44 crc kubenswrapper[4810]: I0219 15:53:44.218132 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j8tzn"] Feb 19 15:53:44 crc kubenswrapper[4810]: I0219 15:53:44.220727 4810 scope.go:117] "RemoveContainer" containerID="896186bd577c6d6da0fe0df76aea6a2e25eecc44dada34f1358dac166c3d7478" Feb 19 15:53:44 crc kubenswrapper[4810]: I0219 15:53:44.271766 4810 scope.go:117] "RemoveContainer" containerID="7497e6376bfd2621cb0d8554208fdc97c2fee88ae0127ebe48b955d5a65f7526" Feb 19 15:53:44 crc kubenswrapper[4810]: E0219 15:53:44.272726 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7497e6376bfd2621cb0d8554208fdc97c2fee88ae0127ebe48b955d5a65f7526\": container with ID starting with 7497e6376bfd2621cb0d8554208fdc97c2fee88ae0127ebe48b955d5a65f7526 not found: ID does not exist" containerID="7497e6376bfd2621cb0d8554208fdc97c2fee88ae0127ebe48b955d5a65f7526" Feb 19 15:53:44 crc kubenswrapper[4810]: I0219 15:53:44.272779 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7497e6376bfd2621cb0d8554208fdc97c2fee88ae0127ebe48b955d5a65f7526"} err="failed to get container status \"7497e6376bfd2621cb0d8554208fdc97c2fee88ae0127ebe48b955d5a65f7526\": rpc error: code = NotFound desc = could not find container \"7497e6376bfd2621cb0d8554208fdc97c2fee88ae0127ebe48b955d5a65f7526\": container with ID starting with 7497e6376bfd2621cb0d8554208fdc97c2fee88ae0127ebe48b955d5a65f7526 not found: ID does not exist" Feb 19 15:53:44 crc kubenswrapper[4810]: I0219 15:53:44.272816 4810 scope.go:117] "RemoveContainer" containerID="f336b0a144f66301814adca72bf53897501afb1e619a9505a15772da7d9fb52e" Feb 19 15:53:44 crc kubenswrapper[4810]: E0219 15:53:44.273093 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f336b0a144f66301814adca72bf53897501afb1e619a9505a15772da7d9fb52e\": container with ID starting with f336b0a144f66301814adca72bf53897501afb1e619a9505a15772da7d9fb52e not found: ID does not exist" containerID="f336b0a144f66301814adca72bf53897501afb1e619a9505a15772da7d9fb52e" Feb 19 15:53:44 crc kubenswrapper[4810]: I0219 15:53:44.273133 4810 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f336b0a144f66301814adca72bf53897501afb1e619a9505a15772da7d9fb52e"} err="failed to get container status \"f336b0a144f66301814adca72bf53897501afb1e619a9505a15772da7d9fb52e\": rpc error: code = NotFound desc = could not find container \"f336b0a144f66301814adca72bf53897501afb1e619a9505a15772da7d9fb52e\": container with ID starting with f336b0a144f66301814adca72bf53897501afb1e619a9505a15772da7d9fb52e not found: ID does not exist" Feb 19 15:53:44 crc kubenswrapper[4810]: I0219 15:53:44.273176 4810 scope.go:117] "RemoveContainer" containerID="896186bd577c6d6da0fe0df76aea6a2e25eecc44dada34f1358dac166c3d7478" Feb 19 15:53:44 crc kubenswrapper[4810]: E0219 15:53:44.273842 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"896186bd577c6d6da0fe0df76aea6a2e25eecc44dada34f1358dac166c3d7478\": container with ID starting with 896186bd577c6d6da0fe0df76aea6a2e25eecc44dada34f1358dac166c3d7478 not found: ID does not exist" containerID="896186bd577c6d6da0fe0df76aea6a2e25eecc44dada34f1358dac166c3d7478" Feb 19 15:53:44 crc kubenswrapper[4810]: I0219 15:53:44.273886 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"896186bd577c6d6da0fe0df76aea6a2e25eecc44dada34f1358dac166c3d7478"} err="failed to get container status \"896186bd577c6d6da0fe0df76aea6a2e25eecc44dada34f1358dac166c3d7478\": rpc error: code = NotFound desc = could not find container \"896186bd577c6d6da0fe0df76aea6a2e25eecc44dada34f1358dac166c3d7478\": container with ID starting with 896186bd577c6d6da0fe0df76aea6a2e25eecc44dada34f1358dac166c3d7478 not found: ID does not exist" Feb 19 15:53:45 crc kubenswrapper[4810]: I0219 15:53:45.453884 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af674839-9e12-4569-bee5-abcf06b09ee4" path="/var/lib/kubelet/pods/af674839-9e12-4569-bee5-abcf06b09ee4/volumes" Feb 19 15:53:46 crc kubenswrapper[4810]: E0219 15:53:46.865974 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf674839_9e12_4569_bee5_abcf06b09ee4.slice/crio-conmon-896186bd577c6d6da0fe0df76aea6a2e25eecc44dada34f1358dac166c3d7478.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf674839_9e12_4569_bee5_abcf06b09ee4.slice/crio-896186bd577c6d6da0fe0df76aea6a2e25eecc44dada34f1358dac166c3d7478.scope\": RecentStats: unable to find data in memory cache]" Feb 19 15:53:57 crc kubenswrapper[4810]: E0219 15:53:57.142929 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf674839_9e12_4569_bee5_abcf06b09ee4.slice/crio-conmon-896186bd577c6d6da0fe0df76aea6a2e25eecc44dada34f1358dac166c3d7478.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf674839_9e12_4569_bee5_abcf06b09ee4.slice/crio-896186bd577c6d6da0fe0df76aea6a2e25eecc44dada34f1358dac166c3d7478.scope\": RecentStats: unable to find data in memory cache]" Feb 19 15:54:07 crc kubenswrapper[4810]: E0219 15:54:07.459212 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf674839_9e12_4569_bee5_abcf06b09ee4.slice/crio-896186bd577c6d6da0fe0df76aea6a2e25eecc44dada34f1358dac166c3d7478.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf674839_9e12_4569_bee5_abcf06b09ee4.slice/crio-conmon-896186bd577c6d6da0fe0df76aea6a2e25eecc44dada34f1358dac166c3d7478.scope\": RecentStats: unable to find data in memory cache]" Feb 19 15:54:17 crc kubenswrapper[4810]: E0219 15:54:17.753708 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf674839_9e12_4569_bee5_abcf06b09ee4.slice/crio-896186bd577c6d6da0fe0df76aea6a2e25eecc44dada34f1358dac166c3d7478.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf674839_9e12_4569_bee5_abcf06b09ee4.slice/crio-conmon-896186bd577c6d6da0fe0df76aea6a2e25eecc44dada34f1358dac166c3d7478.scope\": RecentStats: unable to find data in memory cache]" Feb 19 15:54:28 crc kubenswrapper[4810]: E0219 15:54:28.082642 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf674839_9e12_4569_bee5_abcf06b09ee4.slice/crio-896186bd577c6d6da0fe0df76aea6a2e25eecc44dada34f1358dac166c3d7478.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf674839_9e12_4569_bee5_abcf06b09ee4.slice/crio-conmon-896186bd577c6d6da0fe0df76aea6a2e25eecc44dada34f1358dac166c3d7478.scope\": RecentStats: unable to find data in memory cache]" Feb 19 15:54:31 crc kubenswrapper[4810]: E0219 15:54:31.470995 4810 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/6e750a3a26584da6ec5e11eb1f18ab5b6d020df87b3986858e10a7310cc6e786/diff" to get inode usage: stat /var/lib/containers/storage/overlay/6e750a3a26584da6ec5e11eb1f18ab5b6d020df87b3986858e10a7310cc6e786/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openshift-marketplace_redhat-marketplace-j8tzn_af674839-9e12-4569-bee5-abcf06b09ee4/extract-utilities/0.log" to get inode usage: stat /var/log/pods/openshift-marketplace_redhat-marketplace-j8tzn_af674839-9e12-4569-bee5-abcf06b09ee4/extract-utilities/0.log: no such file or directory Feb 19 15:54:36 crc kubenswrapper[4810]: I0219 15:54:36.757898 4810 generic.go:334] "Generic (PLEG): container finished" podID="f7ca8c9a-db61-400f-9319-21590462f929" containerID="d0eed7b2d2efeb2bed27dcf249640e7dd353ed994126d4417aa3f9b188221638" exitCode=0 Feb 19 15:54:36 crc kubenswrapper[4810]: I0219 15:54:36.758536 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc" event={"ID":"f7ca8c9a-db61-400f-9319-21590462f929","Type":"ContainerDied","Data":"d0eed7b2d2efeb2bed27dcf249640e7dd353ed994126d4417aa3f9b188221638"} Feb 19 15:54:38 crc kubenswrapper[4810]: I0219 15:54:38.323432 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc" Feb 19 15:54:38 crc kubenswrapper[4810]: I0219 15:54:38.417143 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-telemetry-combined-ca-bundle\") pod \"f7ca8c9a-db61-400f-9319-21590462f929\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " Feb 19 15:54:38 crc kubenswrapper[4810]: I0219 15:54:38.417213 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-ceilometer-compute-config-data-1\") pod \"f7ca8c9a-db61-400f-9319-21590462f929\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " Feb 19 15:54:38 crc kubenswrapper[4810]: I0219 15:54:38.417443 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-ceilometer-compute-config-data-0\") pod \"f7ca8c9a-db61-400f-9319-21590462f929\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " Feb 19 15:54:38 crc kubenswrapper[4810]: I0219 15:54:38.417483 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-inventory\") pod \"f7ca8c9a-db61-400f-9319-21590462f929\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " Feb 19 15:54:38 crc kubenswrapper[4810]: I0219 15:54:38.417544 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-ssh-key-openstack-edpm-ipam\") pod \"f7ca8c9a-db61-400f-9319-21590462f929\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " Feb 19 15:54:38 crc kubenswrapper[4810]: I0219 15:54:38.417656 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldvxh\" (UniqueName: \"kubernetes.io/projected/f7ca8c9a-db61-400f-9319-21590462f929-kube-api-access-ldvxh\") pod \"f7ca8c9a-db61-400f-9319-21590462f929\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " Feb 19 15:54:38 crc kubenswrapper[4810]: I0219 15:54:38.417740 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-ceilometer-compute-config-data-2\") pod \"f7ca8c9a-db61-400f-9319-21590462f929\" (UID: \"f7ca8c9a-db61-400f-9319-21590462f929\") " Feb 19 15:54:38 crc kubenswrapper[4810]: I0219 15:54:38.424199 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "f7ca8c9a-db61-400f-9319-21590462f929" (UID: "f7ca8c9a-db61-400f-9319-21590462f929"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:54:38 crc kubenswrapper[4810]: I0219 15:54:38.424637 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7ca8c9a-db61-400f-9319-21590462f929-kube-api-access-ldvxh" (OuterVolumeSpecName: "kube-api-access-ldvxh") pod "f7ca8c9a-db61-400f-9319-21590462f929" (UID: "f7ca8c9a-db61-400f-9319-21590462f929"). InnerVolumeSpecName "kube-api-access-ldvxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:54:38 crc kubenswrapper[4810]: I0219 15:54:38.447072 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "f7ca8c9a-db61-400f-9319-21590462f929" (UID: "f7ca8c9a-db61-400f-9319-21590462f929"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:54:38 crc kubenswrapper[4810]: I0219 15:54:38.449121 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "f7ca8c9a-db61-400f-9319-21590462f929" (UID: "f7ca8c9a-db61-400f-9319-21590462f929"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:54:38 crc kubenswrapper[4810]: I0219 15:54:38.450242 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f7ca8c9a-db61-400f-9319-21590462f929" (UID: "f7ca8c9a-db61-400f-9319-21590462f929"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:54:38 crc kubenswrapper[4810]: I0219 15:54:38.454046 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "f7ca8c9a-db61-400f-9319-21590462f929" (UID: "f7ca8c9a-db61-400f-9319-21590462f929"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:54:38 crc kubenswrapper[4810]: I0219 15:54:38.467411 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-inventory" (OuterVolumeSpecName: "inventory") pod "f7ca8c9a-db61-400f-9319-21590462f929" (UID: "f7ca8c9a-db61-400f-9319-21590462f929"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:54:38 crc kubenswrapper[4810]: I0219 15:54:38.521044 4810 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:54:38 crc kubenswrapper[4810]: I0219 15:54:38.521096 4810 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 19 15:54:38 crc kubenswrapper[4810]: I0219 15:54:38.521117 4810 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 19 15:54:38 crc kubenswrapper[4810]: I0219 15:54:38.521138 4810 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 15:54:38 crc kubenswrapper[4810]: I0219 15:54:38.521159 4810 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 15:54:38 crc kubenswrapper[4810]: I0219 15:54:38.521179 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldvxh\" (UniqueName: \"kubernetes.io/projected/f7ca8c9a-db61-400f-9319-21590462f929-kube-api-access-ldvxh\") on node \"crc\" DevicePath \"\"" Feb 19 15:54:38 crc kubenswrapper[4810]: I0219 15:54:38.521196 4810 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/f7ca8c9a-db61-400f-9319-21590462f929-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 19 15:54:38 crc kubenswrapper[4810]: I0219 15:54:38.802293 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc" event={"ID":"f7ca8c9a-db61-400f-9319-21590462f929","Type":"ContainerDied","Data":"20a3b9e0f65be3023f64999fad259ea78baedb786dcbbe39432440197d3ff6aa"} Feb 19 15:54:38 crc kubenswrapper[4810]: I0219 15:54:38.802765 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20a3b9e0f65be3023f64999fad259ea78baedb786dcbbe39432440197d3ff6aa" Feb 19 15:54:38 crc kubenswrapper[4810]: I0219 15:54:38.802979 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc" Feb 19 15:54:49 crc kubenswrapper[4810]: I0219 15:54:49.538112 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:54:49 crc kubenswrapper[4810]: I0219 15:54:49.538864 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.331602 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Feb 19 15:55:12 crc kubenswrapper[4810]: E0219 15:55:12.332430 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af674839-9e12-4569-bee5-abcf06b09ee4" containerName="registry-server" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.332442 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="af674839-9e12-4569-bee5-abcf06b09ee4" containerName="registry-server" Feb 19 15:55:12 crc kubenswrapper[4810]: E0219 15:55:12.332458 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af674839-9e12-4569-bee5-abcf06b09ee4" containerName="extract-utilities" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.332465 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="af674839-9e12-4569-bee5-abcf06b09ee4" containerName="extract-utilities" Feb 19 15:55:12 crc kubenswrapper[4810]: E0219 15:55:12.332483 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ca8c9a-db61-400f-9319-21590462f929" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.332490 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ca8c9a-db61-400f-9319-21590462f929" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 19 15:55:12 crc kubenswrapper[4810]: E0219 15:55:12.332504 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af674839-9e12-4569-bee5-abcf06b09ee4" containerName="extract-content" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.332510 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="af674839-9e12-4569-bee5-abcf06b09ee4" containerName="extract-content" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.332682 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7ca8c9a-db61-400f-9319-21590462f929" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.332697 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="af674839-9e12-4569-bee5-abcf06b09ee4" containerName="registry-server" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.333616 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.340458 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.344721 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.351536 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.351572 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-lib-modules\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.351592 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5cqj\" (UniqueName: \"kubernetes.io/projected/f66b86b2-b164-4380-8a89-bb0cf5f833ef-kube-api-access-z5cqj\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.351753 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-run\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.351794 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-sys\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.351821 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-etc-nvme\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.351870 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f66b86b2-b164-4380-8a89-bb0cf5f833ef-config-data-custom\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.351901 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f66b86b2-b164-4380-8a89-bb0cf5f833ef-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.352014 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f66b86b2-b164-4380-8a89-bb0cf5f833ef-config-data\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.352148 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.352256 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.352285 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-dev\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.352374 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f66b86b2-b164-4380-8a89-bb0cf5f833ef-scripts\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.352398 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.352468 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.422457 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-0"] Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.425642 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.427833 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-config-data" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.440827 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.454394 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f66b86b2-b164-4380-8a89-bb0cf5f833ef-config-data-custom\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.454433 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f66b86b2-b164-4380-8a89-bb0cf5f833ef-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.454466 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f66b86b2-b164-4380-8a89-bb0cf5f833ef-config-data\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.454533 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.454582 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.454598 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-dev\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.454622 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f66b86b2-b164-4380-8a89-bb0cf5f833ef-scripts\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.454636 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.454671 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-var-locks-cinder\") pod \"cinder-backup-0\" (UID: 
\"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.454720 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.454749 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-lib-modules\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.454766 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5cqj\" (UniqueName: \"kubernetes.io/projected/f66b86b2-b164-4380-8a89-bb0cf5f833ef-kube-api-access-z5cqj\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.454808 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-run\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.454823 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-sys\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.454840 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-etc-nvme\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.454960 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-etc-nvme\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.455242 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-lib-modules\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.455390 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-run\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.455429 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-sys\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " 
pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.455442 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.455480 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.455661 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-dev\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.459531 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.460101 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.460142 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f66b86b2-b164-4380-8a89-bb0cf5f833ef-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.461200 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f66b86b2-b164-4380-8a89-bb0cf5f833ef-config-data\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.476340 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f66b86b2-b164-4380-8a89-bb0cf5f833ef-config-data-custom\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.476821 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f66b86b2-b164-4380-8a89-bb0cf5f833ef-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.478511 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.480771 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f66b86b2-b164-4380-8a89-bb0cf5f833ef-scripts\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.489771 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.489771 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5cqj\" (UniqueName: \"kubernetes.io/projected/f66b86b2-b164-4380-8a89-bb0cf5f833ef-kube-api-access-z5cqj\") pod \"cinder-backup-0\" (UID: \"f66b86b2-b164-4380-8a89-bb0cf5f833ef\") " pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.492120 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-2-config-data" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.520165 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.556200 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20a46eb8-508d-45be-bf13-31aed23d1582-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.556248 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.556290 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.556583 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.556663 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-run\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.556696 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.556727 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" 
(UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.556874 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20a46eb8-508d-45be-bf13-31aed23d1582-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.556963 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.557000 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20a46eb8-508d-45be-bf13-31aed23d1582-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.557138 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20a46eb8-508d-45be-bf13-31aed23d1582-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.557175 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-sys\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.557300 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-dev\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.557355 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.557389 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ms54\" (UniqueName: \"kubernetes.io/projected/20a46eb8-508d-45be-bf13-31aed23d1582-kube-api-access-8ms54\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.654703 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.669876 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.669965 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.670013 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.670048 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.670079 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-run\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.670109 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.670143 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.670165 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.670198 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74a12495-8d82-4296-9328-430af6d923b2-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.670225 4810 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20a46eb8-508d-45be-bf13-31aed23d1582-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.670270 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.670298 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20a46eb8-508d-45be-bf13-31aed23d1582-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.670385 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74a12495-8d82-4296-9328-430af6d923b2-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.674496 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-run\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.674592 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.674641 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.674705 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.674903 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.675588 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20a46eb8-508d-45be-bf13-31aed23d1582-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc 
kubenswrapper[4810]: I0219 15:55:12.675696 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-sys\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.675790 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.675884 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.676046 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-dev\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.676142 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.676247 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ms54\" (UniqueName: \"kubernetes.io/projected/20a46eb8-508d-45be-bf13-31aed23d1582-kube-api-access-8ms54\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.676397 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74a12495-8d82-4296-9328-430af6d923b2-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.676531 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.676672 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.676835 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.676910 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.676971 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20a46eb8-508d-45be-bf13-31aed23d1582-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.677056 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.677140 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.677149 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.677187 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9h55\" (UniqueName: \"kubernetes.io/projected/74a12495-8d82-4296-9328-430af6d923b2-kube-api-access-l9h55\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.678371 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-sys\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.678727 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.678758 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: 
I0219 15:55:12.678798 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/20a46eb8-508d-45be-bf13-31aed23d1582-dev\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.678982 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a12495-8d82-4296-9328-430af6d923b2-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.689610 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20a46eb8-508d-45be-bf13-31aed23d1582-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.699235 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20a46eb8-508d-45be-bf13-31aed23d1582-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.702506 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20a46eb8-508d-45be-bf13-31aed23d1582-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.706061 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20a46eb8-508d-45be-bf13-31aed23d1582-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.721023 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ms54\" (UniqueName: \"kubernetes.io/projected/20a46eb8-508d-45be-bf13-31aed23d1582-kube-api-access-8ms54\") pod \"cinder-volume-nfs-0\" (UID: \"20a46eb8-508d-45be-bf13-31aed23d1582\") " pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.761931 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.782965 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.783012 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.783036 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.783059 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.783078 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74a12495-8d82-4296-9328-430af6d923b2-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.783114 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74a12495-8d82-4296-9328-430af6d923b2-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.783139 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.783153 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.783183 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74a12495-8d82-4296-9328-430af6d923b2-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.783200 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.783223 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.783250 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.783269 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.783310 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9h55\" (UniqueName: \"kubernetes.io/projected/74a12495-8d82-4296-9328-430af6d923b2-kube-api-access-l9h55\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.783345 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a12495-8d82-4296-9328-430af6d923b2-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.783868 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.783945 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.783971 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.783994 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 
15:55:12.784017 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.785530 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.785577 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.785618 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.785733 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.786093 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/74a12495-8d82-4296-9328-430af6d923b2-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.789452 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74a12495-8d82-4296-9328-430af6d923b2-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.790414 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a12495-8d82-4296-9328-430af6d923b2-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.796040 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74a12495-8d82-4296-9328-430af6d923b2-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.800263 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74a12495-8d82-4296-9328-430af6d923b2-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 
crc kubenswrapper[4810]: I0219 15:55:12.810647 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9h55\" (UniqueName: \"kubernetes.io/projected/74a12495-8d82-4296-9328-430af6d923b2-kube-api-access-l9h55\") pod \"cinder-volume-nfs-2-0\" (UID: \"74a12495-8d82-4296-9328-430af6d923b2\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:12 crc kubenswrapper[4810]: I0219 15:55:12.865385 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:13 crc kubenswrapper[4810]: I0219 15:55:13.231420 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 19 15:55:13 crc kubenswrapper[4810]: I0219 15:55:13.232193 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 15:55:13 crc kubenswrapper[4810]: I0219 15:55:13.247733 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"f66b86b2-b164-4380-8a89-bb0cf5f833ef","Type":"ContainerStarted","Data":"5e8f85fe54dd5e1ff99ce6a0b3b1973b9b7f40c59cdb226f912652060a92e50d"} Feb 19 15:55:13 crc kubenswrapper[4810]: I0219 15:55:13.357949 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Feb 19 15:55:13 crc kubenswrapper[4810]: W0219 15:55:13.380568 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20a46eb8_508d_45be_bf13_31aed23d1582.slice/crio-caa41dbb4ab4a4e44b929f9e401457fad90d829340625a561aa3366a426ed26f WatchSource:0}: Error finding container caa41dbb4ab4a4e44b929f9e401457fad90d829340625a561aa3366a426ed26f: Status 404 returned error can't find the container with id caa41dbb4ab4a4e44b929f9e401457fad90d829340625a561aa3366a426ed26f Feb 19 15:55:13 crc kubenswrapper[4810]: I0219 15:55:13.454958 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Feb 19 15:55:14 crc kubenswrapper[4810]: I0219 15:55:14.261288 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"20a46eb8-508d-45be-bf13-31aed23d1582","Type":"ContainerStarted","Data":"3f67b6a567911aefaeebcb4cb8f53b7df187985f6339bf75ae32887067f8caea"} Feb 19 15:55:14 crc kubenswrapper[4810]: I0219 15:55:14.261781 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"20a46eb8-508d-45be-bf13-31aed23d1582","Type":"ContainerStarted","Data":"6adc24350d3170c12c6b597c23afb4f51983344ab109c740e5de2016b94b66d0"} Feb 19 15:55:14 crc kubenswrapper[4810]: I0219 15:55:14.261793 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"20a46eb8-508d-45be-bf13-31aed23d1582","Type":"ContainerStarted","Data":"caa41dbb4ab4a4e44b929f9e401457fad90d829340625a561aa3366a426ed26f"} Feb 19 15:55:14 crc kubenswrapper[4810]: I0219 15:55:14.263461 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"f66b86b2-b164-4380-8a89-bb0cf5f833ef","Type":"ContainerStarted","Data":"fa7ec313ba6e9576a0b0fe26b9bd5c94fbc1c00f4438d71688b30f2a7f92e5b2"} Feb 19 15:55:14 crc kubenswrapper[4810]: I0219 15:55:14.263523 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"f66b86b2-b164-4380-8a89-bb0cf5f833ef","Type":"ContainerStarted","Data":"9efdda8c20157fd0a49e2aa9e42453f7ef8c5a0a98846e8b6c156c34e8c0e905"} Feb 19 15:55:14 crc 
kubenswrapper[4810]: I0219 15:55:14.265447 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"74a12495-8d82-4296-9328-430af6d923b2","Type":"ContainerStarted","Data":"ac5b7f2828f61565b87a1ad1cd03ce4f3eb142d433b910028fa40ad25f3cfa3f"} Feb 19 15:55:14 crc kubenswrapper[4810]: I0219 15:55:14.265485 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"74a12495-8d82-4296-9328-430af6d923b2","Type":"ContainerStarted","Data":"4d633a099a5f16eb87773769986770cdb28b6232f8d620e57d8768afa212c73e"} Feb 19 15:55:14 crc kubenswrapper[4810]: I0219 15:55:14.265495 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"74a12495-8d82-4296-9328-430af6d923b2","Type":"ContainerStarted","Data":"ed513194dd7a21859393ce2fccc3bf6bdf7ed97abb1119d451cf1462e1b04fa3"} Feb 19 15:55:14 crc kubenswrapper[4810]: I0219 15:55:14.285888 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-0" podStartSLOduration=2.042332184 podStartE2EDuration="2.285873434s" podCreationTimestamp="2026-02-19 15:55:12 +0000 UTC" firstStartedPulling="2026-02-19 15:55:13.383477464 +0000 UTC m=+2742.865507588" lastFinishedPulling="2026-02-19 15:55:13.627018684 +0000 UTC m=+2743.109048838" observedRunningTime="2026-02-19 15:55:14.283731071 +0000 UTC m=+2743.765761195" watchObservedRunningTime="2026-02-19 15:55:14.285873434 +0000 UTC m=+2743.767903558" Feb 19 15:55:14 crc kubenswrapper[4810]: I0219 15:55:14.308794 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-2-0" podStartSLOduration=2.231381246 podStartE2EDuration="2.308772485s" podCreationTimestamp="2026-02-19 15:55:12 +0000 UTC" firstStartedPulling="2026-02-19 15:55:13.556375173 +0000 UTC m=+2743.038405317" lastFinishedPulling="2026-02-19 15:55:13.633766432 +0000 UTC m=+2743.115796556" observedRunningTime="2026-02-19 15:55:14.301726729 +0000 UTC m=+2743.783756853" watchObservedRunningTime="2026-02-19 15:55:14.308772485 +0000 UTC m=+2743.790802609" Feb 19 15:55:17 crc kubenswrapper[4810]: I0219 15:55:17.655199 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Feb 19 15:55:17 crc kubenswrapper[4810]: I0219 15:55:17.762509 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:17 crc kubenswrapper[4810]: I0219 15:55:17.866275 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:19 crc kubenswrapper[4810]: I0219 15:55:19.537344 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:55:19 crc kubenswrapper[4810]: I0219 15:55:19.537740 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:55:22 crc kubenswrapper[4810]: I0219 15:55:22.862557 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/cinder-backup-0" Feb 19 15:55:22 crc kubenswrapper[4810]: I0219 15:55:22.896599 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=10.580401542 podStartE2EDuration="10.896580791s" podCreationTimestamp="2026-02-19 15:55:12 +0000 UTC" firstStartedPulling="2026-02-19 15:55:13.231975819 +0000 UTC m=+2742.714005943" lastFinishedPulling="2026-02-19 15:55:13.548155058 +0000 UTC m=+2743.030185192" observedRunningTime="2026-02-19 15:55:14.324516787 +0000 UTC m=+2743.806546931" watchObservedRunningTime="2026-02-19 15:55:22.896580791 +0000 UTC m=+2752.378610905" Feb 19 15:55:23 crc kubenswrapper[4810]: I0219 15:55:23.015478 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-0" Feb 19 15:55:23 crc kubenswrapper[4810]: I0219 15:55:23.094167 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-2-0" Feb 19 15:55:49 crc kubenswrapper[4810]: I0219 15:55:49.537232 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:55:49 crc kubenswrapper[4810]: I0219 15:55:49.537930 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:55:49 crc kubenswrapper[4810]: I0219 15:55:49.537982 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 15:55:49 crc kubenswrapper[4810]: I0219 15:55:49.538685 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6287fea84a36b36e84360e8242ff92d8a387529e6d7bde8d2a6410fbcb262896"} pod="openshift-machine-config-operator/machine-config-daemon-t499d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 15:55:49 crc kubenswrapper[4810]: I0219 15:55:49.538748 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" containerID="cri-o://6287fea84a36b36e84360e8242ff92d8a387529e6d7bde8d2a6410fbcb262896" gracePeriod=600 Feb 19 15:55:49 crc kubenswrapper[4810]: I0219 15:55:49.878797 4810 generic.go:334] "Generic (PLEG): container finished" podID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerID="6287fea84a36b36e84360e8242ff92d8a387529e6d7bde8d2a6410fbcb262896" exitCode=0 Feb 19 15:55:49 crc kubenswrapper[4810]: I0219 15:55:49.878882 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerDied","Data":"6287fea84a36b36e84360e8242ff92d8a387529e6d7bde8d2a6410fbcb262896"} Feb 19 15:55:49 crc kubenswrapper[4810]: I0219 15:55:49.879192 4810 scope.go:117] "RemoveContainer" containerID="75401b3edfa84b40903edbf5e5d73ff2da026a9f8214b6a2404ce809eb32bd1f" Feb 19 
15:55:50 crc kubenswrapper[4810]: I0219 15:55:50.892910 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerStarted","Data":"c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09"} Feb 19 15:56:43 crc kubenswrapper[4810]: I0219 15:56:43.156890 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 15:56:43 crc kubenswrapper[4810]: I0219 15:56:43.158077 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="5c213a3a-78fd-4b42-bc1c-e09837eae684" containerName="prometheus" containerID="cri-o://09171322701d01cc2b65792ae9294bf2facdbb2647bec3473b5a2f2c7595769d" gracePeriod=600 Feb 19 15:56:43 crc kubenswrapper[4810]: I0219 15:56:43.158154 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="5c213a3a-78fd-4b42-bc1c-e09837eae684" containerName="thanos-sidecar" containerID="cri-o://ab34c252f979f018755a4fd37fb43b24f2dc32b972b84fea55031898aade6962" gracePeriod=600 Feb 19 15:56:43 crc kubenswrapper[4810]: I0219 15:56:43.158201 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="5c213a3a-78fd-4b42-bc1c-e09837eae684" containerName="config-reloader" containerID="cri-o://1cf6b1e26b593f87dbae05831261b5448c34285d54ffc513dcc21ce3c137d57f" gracePeriod=600 Feb 19 15:56:43 crc kubenswrapper[4810]: I0219 15:56:43.532312 4810 generic.go:334] "Generic (PLEG): container finished" podID="5c213a3a-78fd-4b42-bc1c-e09837eae684" containerID="ab34c252f979f018755a4fd37fb43b24f2dc32b972b84fea55031898aade6962" exitCode=0 Feb 19 15:56:43 crc kubenswrapper[4810]: I0219 15:56:43.532590 4810 generic.go:334] "Generic (PLEG): container finished" podID="5c213a3a-78fd-4b42-bc1c-e09837eae684" containerID="09171322701d01cc2b65792ae9294bf2facdbb2647bec3473b5a2f2c7595769d" exitCode=0 Feb 19 15:56:43 crc kubenswrapper[4810]: I0219 15:56:43.532353 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5c213a3a-78fd-4b42-bc1c-e09837eae684","Type":"ContainerDied","Data":"ab34c252f979f018755a4fd37fb43b24f2dc32b972b84fea55031898aade6962"} Feb 19 15:56:43 crc kubenswrapper[4810]: I0219 15:56:43.532620 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5c213a3a-78fd-4b42-bc1c-e09837eae684","Type":"ContainerDied","Data":"09171322701d01cc2b65792ae9294bf2facdbb2647bec3473b5a2f2c7595769d"} Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.240634 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.344703 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5c213a3a-78fd-4b42-bc1c-e09837eae684-config-out\") pod \"5c213a3a-78fd-4b42-bc1c-e09837eae684\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.344783 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/5c213a3a-78fd-4b42-bc1c-e09837eae684-prometheus-metric-storage-rulefiles-2\") pod \"5c213a3a-78fd-4b42-bc1c-e09837eae684\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.344861 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"5c213a3a-78fd-4b42-bc1c-e09837eae684\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.344921 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/5c213a3a-78fd-4b42-bc1c-e09837eae684-prometheus-metric-storage-rulefiles-1\") pod \"5c213a3a-78fd-4b42-bc1c-e09837eae684\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.344944 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-web-config\") pod \"5c213a3a-78fd-4b42-bc1c-e09837eae684\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.344994 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5c213a3a-78fd-4b42-bc1c-e09837eae684-prometheus-metric-storage-rulefiles-0\") pod \"5c213a3a-78fd-4b42-bc1c-e09837eae684\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.345041 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5c213a3a-78fd-4b42-bc1c-e09837eae684-tls-assets\") pod \"5c213a3a-78fd-4b42-bc1c-e09837eae684\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.345069 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmlm6\" (UniqueName: \"kubernetes.io/projected/5c213a3a-78fd-4b42-bc1c-e09837eae684-kube-api-access-zmlm6\") pod \"5c213a3a-78fd-4b42-bc1c-e09837eae684\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.345128 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-thanos-prometheus-http-client-file\") pod \"5c213a3a-78fd-4b42-bc1c-e09837eae684\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " Feb 19 15:56:44 crc kubenswrapper[4810]: 
I0219 15:56:44.345176 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"5c213a3a-78fd-4b42-bc1c-e09837eae684\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.345218 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-secret-combined-ca-bundle\") pod \"5c213a3a-78fd-4b42-bc1c-e09837eae684\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.345250 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-config\") pod \"5c213a3a-78fd-4b42-bc1c-e09837eae684\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.345283 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c213a3a-78fd-4b42-bc1c-e09837eae684-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "5c213a3a-78fd-4b42-bc1c-e09837eae684" (UID: "5c213a3a-78fd-4b42-bc1c-e09837eae684"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.345410 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3b143f1-488b-49bf-8792-af0d760f341e\") pod \"5c213a3a-78fd-4b42-bc1c-e09837eae684\" (UID: \"5c213a3a-78fd-4b42-bc1c-e09837eae684\") " Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.345838 4810 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/5c213a3a-78fd-4b42-bc1c-e09837eae684-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.348859 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c213a3a-78fd-4b42-bc1c-e09837eae684-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "5c213a3a-78fd-4b42-bc1c-e09837eae684" (UID: "5c213a3a-78fd-4b42-bc1c-e09837eae684"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.349261 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c213a3a-78fd-4b42-bc1c-e09837eae684-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "5c213a3a-78fd-4b42-bc1c-e09837eae684" (UID: "5c213a3a-78fd-4b42-bc1c-e09837eae684"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.355431 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c213a3a-78fd-4b42-bc1c-e09837eae684-config-out" (OuterVolumeSpecName: "config-out") pod "5c213a3a-78fd-4b42-bc1c-e09837eae684" (UID: "5c213a3a-78fd-4b42-bc1c-e09837eae684"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.356197 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "5c213a3a-78fd-4b42-bc1c-e09837eae684" (UID: "5c213a3a-78fd-4b42-bc1c-e09837eae684"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.357503 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c213a3a-78fd-4b42-bc1c-e09837eae684-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "5c213a3a-78fd-4b42-bc1c-e09837eae684" (UID: "5c213a3a-78fd-4b42-bc1c-e09837eae684"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.358204 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "5c213a3a-78fd-4b42-bc1c-e09837eae684" (UID: "5c213a3a-78fd-4b42-bc1c-e09837eae684"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.358848 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "5c213a3a-78fd-4b42-bc1c-e09837eae684" (UID: "5c213a3a-78fd-4b42-bc1c-e09837eae684"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.361626 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c213a3a-78fd-4b42-bc1c-e09837eae684-kube-api-access-zmlm6" (OuterVolumeSpecName: "kube-api-access-zmlm6") pod "5c213a3a-78fd-4b42-bc1c-e09837eae684" (UID: "5c213a3a-78fd-4b42-bc1c-e09837eae684"). InnerVolumeSpecName "kube-api-access-zmlm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.368359 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "5c213a3a-78fd-4b42-bc1c-e09837eae684" (UID: "5c213a3a-78fd-4b42-bc1c-e09837eae684"). InnerVolumeSpecName "secret-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.369827 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-config" (OuterVolumeSpecName: "config") pod "5c213a3a-78fd-4b42-bc1c-e09837eae684" (UID: "5c213a3a-78fd-4b42-bc1c-e09837eae684"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.394749 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3b143f1-488b-49bf-8792-af0d760f341e" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "5c213a3a-78fd-4b42-bc1c-e09837eae684" (UID: "5c213a3a-78fd-4b42-bc1c-e09837eae684"). InnerVolumeSpecName "pvc-b3b143f1-488b-49bf-8792-af0d760f341e". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.441192 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-web-config" (OuterVolumeSpecName: "web-config") pod "5c213a3a-78fd-4b42-bc1c-e09837eae684" (UID: "5c213a3a-78fd-4b42-bc1c-e09837eae684"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.448226 4810 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.448256 4810 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.448270 4810 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.448280 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.448315 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-b3b143f1-488b-49bf-8792-af0d760f341e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3b143f1-488b-49bf-8792-af0d760f341e\") on node \"crc\" " Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.448338 4810 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5c213a3a-78fd-4b42-bc1c-e09837eae684-config-out\") on node \"crc\" DevicePath \"\"" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.448348 4810 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Feb 19 15:56:44 crc 
kubenswrapper[4810]: I0219 15:56:44.448358 4810 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/5c213a3a-78fd-4b42-bc1c-e09837eae684-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.448367 4810 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5c213a3a-78fd-4b42-bc1c-e09837eae684-web-config\") on node \"crc\" DevicePath \"\"" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.448376 4810 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5c213a3a-78fd-4b42-bc1c-e09837eae684-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.448385 4810 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5c213a3a-78fd-4b42-bc1c-e09837eae684-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.448395 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmlm6\" (UniqueName: \"kubernetes.io/projected/5c213a3a-78fd-4b42-bc1c-e09837eae684-kube-api-access-zmlm6\") on node \"crc\" DevicePath \"\"" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.477350 4810 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.477535 4810 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-b3b143f1-488b-49bf-8792-af0d760f341e" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3b143f1-488b-49bf-8792-af0d760f341e") on node "crc" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.543417 4810 generic.go:334] "Generic (PLEG): container finished" podID="5c213a3a-78fd-4b42-bc1c-e09837eae684" containerID="1cf6b1e26b593f87dbae05831261b5448c34285d54ffc513dcc21ce3c137d57f" exitCode=0 Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.543464 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5c213a3a-78fd-4b42-bc1c-e09837eae684","Type":"ContainerDied","Data":"1cf6b1e26b593f87dbae05831261b5448c34285d54ffc513dcc21ce3c137d57f"} Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.543493 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5c213a3a-78fd-4b42-bc1c-e09837eae684","Type":"ContainerDied","Data":"971051d9ba9c03f03793d16fa10f74ed254c90779c474f368dd54279d38af75a"} Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.543514 4810 scope.go:117] "RemoveContainer" containerID="ab34c252f979f018755a4fd37fb43b24f2dc32b972b84fea55031898aade6962" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.543649 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.550088 4810 reconciler_common.go:293] "Volume detached for volume \"pvc-b3b143f1-488b-49bf-8792-af0d760f341e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3b143f1-488b-49bf-8792-af0d760f341e\") on node \"crc\" DevicePath \"\"" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.584035 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.592181 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.615093 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 15:56:44 crc kubenswrapper[4810]: E0219 15:56:44.615476 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c213a3a-78fd-4b42-bc1c-e09837eae684" containerName="prometheus" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.615493 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c213a3a-78fd-4b42-bc1c-e09837eae684" containerName="prometheus" Feb 19 15:56:44 crc kubenswrapper[4810]: E0219 15:56:44.615503 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c213a3a-78fd-4b42-bc1c-e09837eae684" containerName="thanos-sidecar" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.615511 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c213a3a-78fd-4b42-bc1c-e09837eae684" containerName="thanos-sidecar" Feb 19 15:56:44 crc kubenswrapper[4810]: E0219 15:56:44.615529 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c213a3a-78fd-4b42-bc1c-e09837eae684" containerName="init-config-reloader" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.615537 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c213a3a-78fd-4b42-bc1c-e09837eae684" containerName="init-config-reloader" Feb 19 15:56:44 crc kubenswrapper[4810]: E0219 15:56:44.615557 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c213a3a-78fd-4b42-bc1c-e09837eae684" containerName="config-reloader" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.615563 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c213a3a-78fd-4b42-bc1c-e09837eae684" containerName="config-reloader" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.615761 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c213a3a-78fd-4b42-bc1c-e09837eae684" containerName="prometheus" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.615776 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c213a3a-78fd-4b42-bc1c-e09837eae684" containerName="thanos-sidecar" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.615791 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c213a3a-78fd-4b42-bc1c-e09837eae684" containerName="config-reloader" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.617529 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.624738 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.629244 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.629344 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.629395 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.629476 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.629563 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.629614 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-x7hn6" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.632663 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.650943 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.675853 4810 scope.go:117] "RemoveContainer" containerID="1cf6b1e26b593f87dbae05831261b5448c34285d54ffc513dcc21ce3c137d57f" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.712831 4810 scope.go:117] "RemoveContainer" containerID="09171322701d01cc2b65792ae9294bf2facdbb2647bec3473b5a2f2c7595769d" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.735500 4810 scope.go:117] "RemoveContainer" containerID="f004bfd72f8d26f3aaf22a2c6561639f20d7d6d9b961893f534ded590f3c40ec" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.754389 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bf65af35-1e80-49a0-ada2-3bd027193193-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.754447 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf65af35-1e80-49a0-ada2-3bd027193193-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.754501 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b3b143f1-488b-49bf-8792-af0d760f341e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3b143f1-488b-49bf-8792-af0d760f341e\") pod \"prometheus-metric-storage-0\" (UID: 
\"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.754538 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bf65af35-1e80-49a0-ada2-3bd027193193-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.754555 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/bf65af35-1e80-49a0-ada2-3bd027193193-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.754590 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bf65af35-1e80-49a0-ada2-3bd027193193-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.754611 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bf65af35-1e80-49a0-ada2-3bd027193193-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.754627 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bf65af35-1e80-49a0-ada2-3bd027193193-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.754674 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bf65af35-1e80-49a0-ada2-3bd027193193-config\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.754695 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mkgk\" (UniqueName: \"kubernetes.io/projected/bf65af35-1e80-49a0-ada2-3bd027193193-kube-api-access-7mkgk\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.754711 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bf65af35-1e80-49a0-ada2-3bd027193193-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.754732 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/bf65af35-1e80-49a0-ada2-3bd027193193-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.754767 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bf65af35-1e80-49a0-ada2-3bd027193193-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.758128 4810 scope.go:117] "RemoveContainer" containerID="ab34c252f979f018755a4fd37fb43b24f2dc32b972b84fea55031898aade6962" Feb 19 15:56:44 crc kubenswrapper[4810]: E0219 15:56:44.762781 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab34c252f979f018755a4fd37fb43b24f2dc32b972b84fea55031898aade6962\": container with ID starting with ab34c252f979f018755a4fd37fb43b24f2dc32b972b84fea55031898aade6962 not found: ID does not exist" containerID="ab34c252f979f018755a4fd37fb43b24f2dc32b972b84fea55031898aade6962" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.762810 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab34c252f979f018755a4fd37fb43b24f2dc32b972b84fea55031898aade6962"} err="failed to get container status \"ab34c252f979f018755a4fd37fb43b24f2dc32b972b84fea55031898aade6962\": rpc error: code = NotFound desc = could not find container \"ab34c252f979f018755a4fd37fb43b24f2dc32b972b84fea55031898aade6962\": container with ID starting with ab34c252f979f018755a4fd37fb43b24f2dc32b972b84fea55031898aade6962 not found: ID does not exist" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.762830 4810 scope.go:117] "RemoveContainer" containerID="1cf6b1e26b593f87dbae05831261b5448c34285d54ffc513dcc21ce3c137d57f" Feb 19 15:56:44 crc kubenswrapper[4810]: E0219 15:56:44.763297 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cf6b1e26b593f87dbae05831261b5448c34285d54ffc513dcc21ce3c137d57f\": container with ID starting with 1cf6b1e26b593f87dbae05831261b5448c34285d54ffc513dcc21ce3c137d57f not found: ID does not exist" containerID="1cf6b1e26b593f87dbae05831261b5448c34285d54ffc513dcc21ce3c137d57f" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.763357 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cf6b1e26b593f87dbae05831261b5448c34285d54ffc513dcc21ce3c137d57f"} err="failed to get container status \"1cf6b1e26b593f87dbae05831261b5448c34285d54ffc513dcc21ce3c137d57f\": rpc error: code = NotFound desc = could not find container \"1cf6b1e26b593f87dbae05831261b5448c34285d54ffc513dcc21ce3c137d57f\": container with ID starting with 1cf6b1e26b593f87dbae05831261b5448c34285d54ffc513dcc21ce3c137d57f not found: ID does not exist" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.763385 4810 scope.go:117] "RemoveContainer" containerID="09171322701d01cc2b65792ae9294bf2facdbb2647bec3473b5a2f2c7595769d" Feb 19 15:56:44 crc kubenswrapper[4810]: 
E0219 15:56:44.764011 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09171322701d01cc2b65792ae9294bf2facdbb2647bec3473b5a2f2c7595769d\": container with ID starting with 09171322701d01cc2b65792ae9294bf2facdbb2647bec3473b5a2f2c7595769d not found: ID does not exist" containerID="09171322701d01cc2b65792ae9294bf2facdbb2647bec3473b5a2f2c7595769d" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.764043 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09171322701d01cc2b65792ae9294bf2facdbb2647bec3473b5a2f2c7595769d"} err="failed to get container status \"09171322701d01cc2b65792ae9294bf2facdbb2647bec3473b5a2f2c7595769d\": rpc error: code = NotFound desc = could not find container \"09171322701d01cc2b65792ae9294bf2facdbb2647bec3473b5a2f2c7595769d\": container with ID starting with 09171322701d01cc2b65792ae9294bf2facdbb2647bec3473b5a2f2c7595769d not found: ID does not exist" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.764059 4810 scope.go:117] "RemoveContainer" containerID="f004bfd72f8d26f3aaf22a2c6561639f20d7d6d9b961893f534ded590f3c40ec" Feb 19 15:56:44 crc kubenswrapper[4810]: E0219 15:56:44.767470 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f004bfd72f8d26f3aaf22a2c6561639f20d7d6d9b961893f534ded590f3c40ec\": container with ID starting with f004bfd72f8d26f3aaf22a2c6561639f20d7d6d9b961893f534ded590f3c40ec not found: ID does not exist" containerID="f004bfd72f8d26f3aaf22a2c6561639f20d7d6d9b961893f534ded590f3c40ec" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.767519 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f004bfd72f8d26f3aaf22a2c6561639f20d7d6d9b961893f534ded590f3c40ec"} err="failed to get container status \"f004bfd72f8d26f3aaf22a2c6561639f20d7d6d9b961893f534ded590f3c40ec\": rpc error: code = NotFound desc = could not find container \"f004bfd72f8d26f3aaf22a2c6561639f20d7d6d9b961893f534ded590f3c40ec\": container with ID starting with f004bfd72f8d26f3aaf22a2c6561639f20d7d6d9b961893f534ded590f3c40ec not found: ID does not exist" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.856241 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/bf65af35-1e80-49a0-ada2-3bd027193193-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.856522 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bf65af35-1e80-49a0-ada2-3bd027193193-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.856549 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bf65af35-1e80-49a0-ada2-3bd027193193-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod 
\"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.856576 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf65af35-1e80-49a0-ada2-3bd027193193-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.856631 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b3b143f1-488b-49bf-8792-af0d760f341e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3b143f1-488b-49bf-8792-af0d760f341e\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.856670 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bf65af35-1e80-49a0-ada2-3bd027193193-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.856689 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/bf65af35-1e80-49a0-ada2-3bd027193193-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.856728 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bf65af35-1e80-49a0-ada2-3bd027193193-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.856750 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bf65af35-1e80-49a0-ada2-3bd027193193-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.856769 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bf65af35-1e80-49a0-ada2-3bd027193193-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.856821 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bf65af35-1e80-49a0-ada2-3bd027193193-config\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.856843 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mkgk\" 
(UniqueName: \"kubernetes.io/projected/bf65af35-1e80-49a0-ada2-3bd027193193-kube-api-access-7mkgk\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.856860 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bf65af35-1e80-49a0-ada2-3bd027193193-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.857245 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/bf65af35-1e80-49a0-ada2-3bd027193193-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.857824 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bf65af35-1e80-49a0-ada2-3bd027193193-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.859115 4810 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.859160 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b3b143f1-488b-49bf-8792-af0d760f341e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3b143f1-488b-49bf-8792-af0d760f341e\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e63bc62ea909687cb5abb0c5cf8da7008d795f1441aaff1987b707a42a388027/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.859195 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/bf65af35-1e80-49a0-ada2-3bd027193193-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.862124 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bf65af35-1e80-49a0-ada2-3bd027193193-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.862465 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bf65af35-1e80-49a0-ada2-3bd027193193-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.862700 4810 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bf65af35-1e80-49a0-ada2-3bd027193193-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.863426 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf65af35-1e80-49a0-ada2-3bd027193193-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.863485 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bf65af35-1e80-49a0-ada2-3bd027193193-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.863809 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bf65af35-1e80-49a0-ada2-3bd027193193-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.867759 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bf65af35-1e80-49a0-ada2-3bd027193193-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.867850 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bf65af35-1e80-49a0-ada2-3bd027193193-config\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.889173 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mkgk\" (UniqueName: \"kubernetes.io/projected/bf65af35-1e80-49a0-ada2-3bd027193193-kube-api-access-7mkgk\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.906194 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b3b143f1-488b-49bf-8792-af0d760f341e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3b143f1-488b-49bf-8792-af0d760f341e\") pod \"prometheus-metric-storage-0\" (UID: \"bf65af35-1e80-49a0-ada2-3bd027193193\") " pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:44 crc kubenswrapper[4810]: I0219 15:56:44.934315 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 15:56:45 crc kubenswrapper[4810]: I0219 15:56:45.430708 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 15:56:45 crc kubenswrapper[4810]: W0219 15:56:45.442879 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf65af35_1e80_49a0_ada2_3bd027193193.slice/crio-5c2643d70b93f742d55230bda1d4d99a26796b1faec3f2d8cd40094f4cdf3174 WatchSource:0}: Error finding container 5c2643d70b93f742d55230bda1d4d99a26796b1faec3f2d8cd40094f4cdf3174: Status 404 returned error can't find the container with id 5c2643d70b93f742d55230bda1d4d99a26796b1faec3f2d8cd40094f4cdf3174 Feb 19 15:56:45 crc kubenswrapper[4810]: I0219 15:56:45.451175 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c213a3a-78fd-4b42-bc1c-e09837eae684" path="/var/lib/kubelet/pods/5c213a3a-78fd-4b42-bc1c-e09837eae684/volumes" Feb 19 15:56:45 crc kubenswrapper[4810]: I0219 15:56:45.554341 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bf65af35-1e80-49a0-ada2-3bd027193193","Type":"ContainerStarted","Data":"5c2643d70b93f742d55230bda1d4d99a26796b1faec3f2d8cd40094f4cdf3174"} Feb 19 15:56:50 crc kubenswrapper[4810]: I0219 15:56:50.615866 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bf65af35-1e80-49a0-ada2-3bd027193193","Type":"ContainerStarted","Data":"1beb7acac6bd1bedd6051f55579bb2b30e8e57731225978bffc636370ec9ae34"} Feb 19 15:56:59 crc kubenswrapper[4810]: I0219 15:56:59.736576 4810 generic.go:334] "Generic (PLEG): container finished" podID="bf65af35-1e80-49a0-ada2-3bd027193193" containerID="1beb7acac6bd1bedd6051f55579bb2b30e8e57731225978bffc636370ec9ae34" exitCode=0 Feb 19 15:56:59 crc kubenswrapper[4810]: I0219 15:56:59.736692 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bf65af35-1e80-49a0-ada2-3bd027193193","Type":"ContainerDied","Data":"1beb7acac6bd1bedd6051f55579bb2b30e8e57731225978bffc636370ec9ae34"} Feb 19 15:57:00 crc kubenswrapper[4810]: I0219 15:57:00.749303 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bf65af35-1e80-49a0-ada2-3bd027193193","Type":"ContainerStarted","Data":"e2bf6727a3974e363277c9433eaf103264295c66ded7f90846b6564632582353"} Feb 19 15:57:05 crc kubenswrapper[4810]: I0219 15:57:05.811630 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bf65af35-1e80-49a0-ada2-3bd027193193","Type":"ContainerStarted","Data":"e50299ea42b7be305f4cb7128ef81136eaa54995403b6f0bb84bd2027eedad79"} Feb 19 15:57:05 crc kubenswrapper[4810]: I0219 15:57:05.812411 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bf65af35-1e80-49a0-ada2-3bd027193193","Type":"ContainerStarted","Data":"f28b0f870796240cb37e08bd8fd4ba28ddff6279c418df9a1a6cdb0e1602bde6"} Feb 19 15:57:05 crc kubenswrapper[4810]: I0219 15:57:05.863267 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=21.863234206 podStartE2EDuration="21.863234206s" podCreationTimestamp="2026-02-19 15:56:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-02-19 15:57:05.845222278 +0000 UTC m=+2855.327252472" watchObservedRunningTime="2026-02-19 15:57:05.863234206 +0000 UTC m=+2855.345264370" Feb 19 15:57:09 crc kubenswrapper[4810]: I0219 15:57:09.936010 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 19 15:57:14 crc kubenswrapper[4810]: I0219 15:57:14.935973 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 19 15:57:14 crc kubenswrapper[4810]: I0219 15:57:14.946854 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 19 15:57:15 crc kubenswrapper[4810]: I0219 15:57:15.934478 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 19 15:57:32 crc kubenswrapper[4810]: I0219 15:57:32.970508 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Feb 19 15:57:32 crc kubenswrapper[4810]: I0219 15:57:32.974492 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:32.991357 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.009386 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.009711 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.009959 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.010095 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-hvstp" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.047400 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a4c017a9-c049-4baa-acc0-e08a25437c90-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.047472 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.047570 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a4c017a9-c049-4baa-acc0-e08a25437c90-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.047738 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/a4c017a9-c049-4baa-acc0-e08a25437c90-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.047806 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4c017a9-c049-4baa-acc0-e08a25437c90-config-data\") pod \"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.047895 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a4c017a9-c049-4baa-acc0-e08a25437c90-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.047973 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4c017a9-c049-4baa-acc0-e08a25437c90-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.048110 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a4c017a9-c049-4baa-acc0-e08a25437c90-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.048259 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv5x5\" (UniqueName: \"kubernetes.io/projected/a4c017a9-c049-4baa-acc0-e08a25437c90-kube-api-access-tv5x5\") pod \"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.150714 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv5x5\" (UniqueName: \"kubernetes.io/projected/a4c017a9-c049-4baa-acc0-e08a25437c90-kube-api-access-tv5x5\") pod \"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.150872 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a4c017a9-c049-4baa-acc0-e08a25437c90-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.150931 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.151024 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/a4c017a9-c049-4baa-acc0-e08a25437c90-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.151236 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a4c017a9-c049-4baa-acc0-e08a25437c90-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.151365 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4c017a9-c049-4baa-acc0-e08a25437c90-config-data\") pod \"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.151431 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.151461 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a4c017a9-c049-4baa-acc0-e08a25437c90-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.151534 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4c017a9-c049-4baa-acc0-e08a25437c90-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.151610 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a4c017a9-c049-4baa-acc0-e08a25437c90-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.152973 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a4c017a9-c049-4baa-acc0-e08a25437c90-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.153315 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a4c017a9-c049-4baa-acc0-e08a25437c90-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.153781 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4c017a9-c049-4baa-acc0-e08a25437c90-config-data\") pod 
\"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.154306 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a4c017a9-c049-4baa-acc0-e08a25437c90-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.163240 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a4c017a9-c049-4baa-acc0-e08a25437c90-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.168909 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4c017a9-c049-4baa-acc0-e08a25437c90-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.170132 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a4c017a9-c049-4baa-acc0-e08a25437c90-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.179540 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv5x5\" (UniqueName: \"kubernetes.io/projected/a4c017a9-c049-4baa-acc0-e08a25437c90-kube-api-access-tv5x5\") pod \"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.201586 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.334176 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 19 15:57:33 crc kubenswrapper[4810]: I0219 15:57:33.808614 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 19 15:57:33 crc kubenswrapper[4810]: W0219 15:57:33.817059 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4c017a9_c049_4baa_acc0_e08a25437c90.slice/crio-bc53d602c1ff8998f515238b989c9bf62ab660f688576459213d5fed2a632fb2 WatchSource:0}: Error finding container bc53d602c1ff8998f515238b989c9bf62ab660f688576459213d5fed2a632fb2: Status 404 returned error can't find the container with id bc53d602c1ff8998f515238b989c9bf62ab660f688576459213d5fed2a632fb2 Feb 19 15:57:34 crc kubenswrapper[4810]: I0219 15:57:34.130763 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"a4c017a9-c049-4baa-acc0-e08a25437c90","Type":"ContainerStarted","Data":"bc53d602c1ff8998f515238b989c9bf62ab660f688576459213d5fed2a632fb2"} Feb 19 15:57:44 crc kubenswrapper[4810]: I0219 15:57:44.253871 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"a4c017a9-c049-4baa-acc0-e08a25437c90","Type":"ContainerStarted","Data":"7c1119346fc25e4a8eb25c191fe4eed4ab3589389debf84e048e3e376479d897"} Feb 19 15:57:44 crc kubenswrapper[4810]: I0219 15:57:44.286535 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.497704982 podStartE2EDuration="13.286507051s" podCreationTimestamp="2026-02-19 15:57:31 +0000 UTC" firstStartedPulling="2026-02-19 15:57:33.820892829 +0000 UTC m=+2883.302922963" lastFinishedPulling="2026-02-19 15:57:42.609694898 +0000 UTC m=+2892.091725032" observedRunningTime="2026-02-19 15:57:44.275482827 +0000 UTC m=+2893.757512991" watchObservedRunningTime="2026-02-19 15:57:44.286507051 +0000 UTC m=+2893.768537215" Feb 19 15:57:49 crc kubenswrapper[4810]: I0219 15:57:49.538039 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:57:49 crc kubenswrapper[4810]: I0219 15:57:49.538635 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:58:19 crc kubenswrapper[4810]: I0219 15:58:19.538000 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:58:19 crc kubenswrapper[4810]: I0219 15:58:19.538598 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:58:25 crc kubenswrapper[4810]: E0219 15:58:25.035168 
4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Feb 19 15:58:49 crc kubenswrapper[4810]: I0219 15:58:49.538269 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 15:58:49 crc kubenswrapper[4810]: I0219 15:58:49.538780 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 15:58:49 crc kubenswrapper[4810]: I0219 15:58:49.538838 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 15:58:49 crc kubenswrapper[4810]: I0219 15:58:49.539806 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09"} pod="openshift-machine-config-operator/machine-config-daemon-t499d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 15:58:49 crc kubenswrapper[4810]: I0219 15:58:49.539875 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" containerID="cri-o://c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09" gracePeriod=600 Feb 19 15:58:49 crc kubenswrapper[4810]: E0219 15:58:49.673318 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:58:50 crc kubenswrapper[4810]: I0219 15:58:50.586589 4810 generic.go:334] "Generic (PLEG): container finished" podID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerID="c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09" exitCode=0 Feb 19 15:58:50 crc kubenswrapper[4810]: I0219 15:58:50.586636 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerDied","Data":"c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09"} Feb 19 15:58:50 crc kubenswrapper[4810]: I0219 15:58:50.586888 4810 scope.go:117] "RemoveContainer" containerID="6287fea84a36b36e84360e8242ff92d8a387529e6d7bde8d2a6410fbcb262896" Feb 19 15:58:50 crc kubenswrapper[4810]: I0219 15:58:50.587844 4810 scope.go:117] "RemoveContainer" containerID="c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09" Feb 19 15:58:50 crc kubenswrapper[4810]: E0219 15:58:50.588137 4810 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:59:06 crc kubenswrapper[4810]: I0219 15:59:06.442402 4810 scope.go:117] "RemoveContainer" containerID="c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09" Feb 19 15:59:06 crc kubenswrapper[4810]: E0219 15:59:06.443952 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:59:21 crc kubenswrapper[4810]: I0219 15:59:21.448783 4810 scope.go:117] "RemoveContainer" containerID="c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09" Feb 19 15:59:21 crc kubenswrapper[4810]: E0219 15:59:21.449909 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:59:35 crc kubenswrapper[4810]: I0219 15:59:35.439542 4810 scope.go:117] "RemoveContainer" containerID="c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09" Feb 19 15:59:35 crc kubenswrapper[4810]: E0219 15:59:35.440787 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 15:59:50 crc kubenswrapper[4810]: I0219 15:59:50.439901 4810 scope.go:117] "RemoveContainer" containerID="c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09" Feb 19 15:59:50 crc kubenswrapper[4810]: E0219 15:59:50.440676 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:00:00 crc kubenswrapper[4810]: I0219 16:00:00.152679 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525280-gjtpd"] Feb 19 16:00:00 crc kubenswrapper[4810]: I0219 16:00:00.154281 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525280-gjtpd" Feb 19 16:00:00 crc kubenswrapper[4810]: I0219 16:00:00.157512 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 16:00:00 crc kubenswrapper[4810]: I0219 16:00:00.158753 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 16:00:00 crc kubenswrapper[4810]: I0219 16:00:00.168556 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525280-gjtpd"] Feb 19 16:00:00 crc kubenswrapper[4810]: I0219 16:00:00.281402 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb7fb\" (UniqueName: \"kubernetes.io/projected/1327b7dc-e5ad-463c-8ca9-89b735b1fec2-kube-api-access-gb7fb\") pod \"collect-profiles-29525280-gjtpd\" (UID: \"1327b7dc-e5ad-463c-8ca9-89b735b1fec2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525280-gjtpd" Feb 19 16:00:00 crc kubenswrapper[4810]: I0219 16:00:00.281471 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1327b7dc-e5ad-463c-8ca9-89b735b1fec2-config-volume\") pod \"collect-profiles-29525280-gjtpd\" (UID: \"1327b7dc-e5ad-463c-8ca9-89b735b1fec2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525280-gjtpd" Feb 19 16:00:00 crc kubenswrapper[4810]: I0219 16:00:00.281531 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1327b7dc-e5ad-463c-8ca9-89b735b1fec2-secret-volume\") pod \"collect-profiles-29525280-gjtpd\" (UID: \"1327b7dc-e5ad-463c-8ca9-89b735b1fec2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525280-gjtpd" Feb 19 16:00:00 crc kubenswrapper[4810]: I0219 16:00:00.384986 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1327b7dc-e5ad-463c-8ca9-89b735b1fec2-secret-volume\") pod \"collect-profiles-29525280-gjtpd\" (UID: \"1327b7dc-e5ad-463c-8ca9-89b735b1fec2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525280-gjtpd" Feb 19 16:00:00 crc kubenswrapper[4810]: I0219 16:00:00.385234 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb7fb\" (UniqueName: \"kubernetes.io/projected/1327b7dc-e5ad-463c-8ca9-89b735b1fec2-kube-api-access-gb7fb\") pod \"collect-profiles-29525280-gjtpd\" (UID: \"1327b7dc-e5ad-463c-8ca9-89b735b1fec2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525280-gjtpd" Feb 19 16:00:00 crc kubenswrapper[4810]: I0219 16:00:00.385315 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1327b7dc-e5ad-463c-8ca9-89b735b1fec2-config-volume\") pod \"collect-profiles-29525280-gjtpd\" (UID: \"1327b7dc-e5ad-463c-8ca9-89b735b1fec2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525280-gjtpd" Feb 19 16:00:00 crc kubenswrapper[4810]: I0219 16:00:00.386741 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1327b7dc-e5ad-463c-8ca9-89b735b1fec2-config-volume\") pod 
\"collect-profiles-29525280-gjtpd\" (UID: \"1327b7dc-e5ad-463c-8ca9-89b735b1fec2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525280-gjtpd" Feb 19 16:00:00 crc kubenswrapper[4810]: I0219 16:00:00.396192 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1327b7dc-e5ad-463c-8ca9-89b735b1fec2-secret-volume\") pod \"collect-profiles-29525280-gjtpd\" (UID: \"1327b7dc-e5ad-463c-8ca9-89b735b1fec2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525280-gjtpd" Feb 19 16:00:00 crc kubenswrapper[4810]: I0219 16:00:00.408472 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb7fb\" (UniqueName: \"kubernetes.io/projected/1327b7dc-e5ad-463c-8ca9-89b735b1fec2-kube-api-access-gb7fb\") pod \"collect-profiles-29525280-gjtpd\" (UID: \"1327b7dc-e5ad-463c-8ca9-89b735b1fec2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525280-gjtpd" Feb 19 16:00:00 crc kubenswrapper[4810]: I0219 16:00:00.495660 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525280-gjtpd" Feb 19 16:00:00 crc kubenswrapper[4810]: I0219 16:00:00.993839 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525280-gjtpd"] Feb 19 16:00:00 crc kubenswrapper[4810]: W0219 16:00:00.997945 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1327b7dc_e5ad_463c_8ca9_89b735b1fec2.slice/crio-f0260a0d3be5bef0c92553d74849a54156c06eacc6e7908dad8600c08065b3d3 WatchSource:0}: Error finding container f0260a0d3be5bef0c92553d74849a54156c06eacc6e7908dad8600c08065b3d3: Status 404 returned error can't find the container with id f0260a0d3be5bef0c92553d74849a54156c06eacc6e7908dad8600c08065b3d3 Feb 19 16:00:01 crc kubenswrapper[4810]: I0219 16:00:01.470395 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525280-gjtpd" event={"ID":"1327b7dc-e5ad-463c-8ca9-89b735b1fec2","Type":"ContainerStarted","Data":"4e99f6d1c84e426443c4e13972b553f91aa0857b582f33dd75b9fc978d8acc56"} Feb 19 16:00:01 crc kubenswrapper[4810]: I0219 16:00:01.470752 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525280-gjtpd" event={"ID":"1327b7dc-e5ad-463c-8ca9-89b735b1fec2","Type":"ContainerStarted","Data":"f0260a0d3be5bef0c92553d74849a54156c06eacc6e7908dad8600c08065b3d3"} Feb 19 16:00:01 crc kubenswrapper[4810]: I0219 16:00:01.542964 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525280-gjtpd" podStartSLOduration=1.542937405 podStartE2EDuration="1.542937405s" podCreationTimestamp="2026-02-19 16:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 16:00:01.514763644 +0000 UTC m=+3030.996793798" watchObservedRunningTime="2026-02-19 16:00:01.542937405 +0000 UTC m=+3031.024967569" Feb 19 16:00:02 crc kubenswrapper[4810]: I0219 16:00:02.460169 4810 generic.go:334] "Generic (PLEG): container finished" podID="1327b7dc-e5ad-463c-8ca9-89b735b1fec2" containerID="4e99f6d1c84e426443c4e13972b553f91aa0857b582f33dd75b9fc978d8acc56" exitCode=0 Feb 19 16:00:02 crc kubenswrapper[4810]: I0219 16:00:02.460247 
4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525280-gjtpd" event={"ID":"1327b7dc-e5ad-463c-8ca9-89b735b1fec2","Type":"ContainerDied","Data":"4e99f6d1c84e426443c4e13972b553f91aa0857b582f33dd75b9fc978d8acc56"} Feb 19 16:00:03 crc kubenswrapper[4810]: I0219 16:00:03.892567 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525280-gjtpd" Feb 19 16:00:03 crc kubenswrapper[4810]: I0219 16:00:03.970313 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gb7fb\" (UniqueName: \"kubernetes.io/projected/1327b7dc-e5ad-463c-8ca9-89b735b1fec2-kube-api-access-gb7fb\") pod \"1327b7dc-e5ad-463c-8ca9-89b735b1fec2\" (UID: \"1327b7dc-e5ad-463c-8ca9-89b735b1fec2\") " Feb 19 16:00:03 crc kubenswrapper[4810]: I0219 16:00:03.970431 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1327b7dc-e5ad-463c-8ca9-89b735b1fec2-secret-volume\") pod \"1327b7dc-e5ad-463c-8ca9-89b735b1fec2\" (UID: \"1327b7dc-e5ad-463c-8ca9-89b735b1fec2\") " Feb 19 16:00:03 crc kubenswrapper[4810]: I0219 16:00:03.970673 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1327b7dc-e5ad-463c-8ca9-89b735b1fec2-config-volume\") pod \"1327b7dc-e5ad-463c-8ca9-89b735b1fec2\" (UID: \"1327b7dc-e5ad-463c-8ca9-89b735b1fec2\") " Feb 19 16:00:03 crc kubenswrapper[4810]: I0219 16:00:03.972308 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1327b7dc-e5ad-463c-8ca9-89b735b1fec2-config-volume" (OuterVolumeSpecName: "config-volume") pod "1327b7dc-e5ad-463c-8ca9-89b735b1fec2" (UID: "1327b7dc-e5ad-463c-8ca9-89b735b1fec2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 16:00:03 crc kubenswrapper[4810]: I0219 16:00:03.979596 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1327b7dc-e5ad-463c-8ca9-89b735b1fec2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1327b7dc-e5ad-463c-8ca9-89b735b1fec2" (UID: "1327b7dc-e5ad-463c-8ca9-89b735b1fec2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 16:00:03 crc kubenswrapper[4810]: I0219 16:00:03.980293 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1327b7dc-e5ad-463c-8ca9-89b735b1fec2-kube-api-access-gb7fb" (OuterVolumeSpecName: "kube-api-access-gb7fb") pod "1327b7dc-e5ad-463c-8ca9-89b735b1fec2" (UID: "1327b7dc-e5ad-463c-8ca9-89b735b1fec2"). InnerVolumeSpecName "kube-api-access-gb7fb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:00:04 crc kubenswrapper[4810]: I0219 16:00:04.073506 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gb7fb\" (UniqueName: \"kubernetes.io/projected/1327b7dc-e5ad-463c-8ca9-89b735b1fec2-kube-api-access-gb7fb\") on node \"crc\" DevicePath \"\"" Feb 19 16:00:04 crc kubenswrapper[4810]: I0219 16:00:04.074153 4810 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1327b7dc-e5ad-463c-8ca9-89b735b1fec2-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 16:00:04 crc kubenswrapper[4810]: I0219 16:00:04.074179 4810 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1327b7dc-e5ad-463c-8ca9-89b735b1fec2-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 16:00:04 crc kubenswrapper[4810]: I0219 16:00:04.488901 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525280-gjtpd" event={"ID":"1327b7dc-e5ad-463c-8ca9-89b735b1fec2","Type":"ContainerDied","Data":"f0260a0d3be5bef0c92553d74849a54156c06eacc6e7908dad8600c08065b3d3"} Feb 19 16:00:04 crc kubenswrapper[4810]: I0219 16:00:04.488938 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0260a0d3be5bef0c92553d74849a54156c06eacc6e7908dad8600c08065b3d3" Feb 19 16:00:04 crc kubenswrapper[4810]: I0219 16:00:04.488997 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525280-gjtpd" Feb 19 16:00:04 crc kubenswrapper[4810]: I0219 16:00:04.990173 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525235-vk2jg"] Feb 19 16:00:05 crc kubenswrapper[4810]: I0219 16:00:05.002633 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525235-vk2jg"] Feb 19 16:00:05 crc kubenswrapper[4810]: I0219 16:00:05.440046 4810 scope.go:117] "RemoveContainer" containerID="c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09" Feb 19 16:00:05 crc kubenswrapper[4810]: E0219 16:00:05.440530 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:00:05 crc kubenswrapper[4810]: I0219 16:00:05.462603 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f3fa539-f490-4b25-b592-d199cc757b8a" path="/var/lib/kubelet/pods/2f3fa539-f490-4b25-b592-d199cc757b8a/volumes" Feb 19 16:00:19 crc kubenswrapper[4810]: I0219 16:00:19.441416 4810 scope.go:117] "RemoveContainer" containerID="c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09" Feb 19 16:00:19 crc kubenswrapper[4810]: E0219 16:00:19.442753 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:00:32 crc kubenswrapper[4810]: I0219 16:00:32.440647 4810 scope.go:117] "RemoveContainer" containerID="c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09" Feb 19 16:00:32 crc kubenswrapper[4810]: E0219 16:00:32.441466 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:00:43 crc kubenswrapper[4810]: I0219 16:00:43.150698 4810 scope.go:117] "RemoveContainer" containerID="88bb50e9f73c36f14be470f12c634b9e24a853c43e328fa253f758c7a485a40a" Feb 19 16:00:44 crc kubenswrapper[4810]: I0219 16:00:44.439926 4810 scope.go:117] "RemoveContainer" containerID="c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09" Feb 19 16:00:44 crc kubenswrapper[4810]: E0219 16:00:44.441285 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:00:55 crc kubenswrapper[4810]: I0219 16:00:55.439923 4810 scope.go:117] "RemoveContainer" containerID="c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09" Feb 19 16:00:55 crc kubenswrapper[4810]: E0219 16:00:55.441827 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:01:00 crc kubenswrapper[4810]: I0219 16:01:00.159409 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29525281-26qqv"] Feb 19 16:01:00 crc kubenswrapper[4810]: E0219 16:01:00.163018 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1327b7dc-e5ad-463c-8ca9-89b735b1fec2" containerName="collect-profiles" Feb 19 16:01:00 crc kubenswrapper[4810]: I0219 16:01:00.163070 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1327b7dc-e5ad-463c-8ca9-89b735b1fec2" containerName="collect-profiles" Feb 19 16:01:00 crc kubenswrapper[4810]: I0219 16:01:00.165093 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="1327b7dc-e5ad-463c-8ca9-89b735b1fec2" containerName="collect-profiles" Feb 19 16:01:00 crc kubenswrapper[4810]: I0219 16:01:00.167854 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29525281-26qqv" Feb 19 16:01:00 crc kubenswrapper[4810]: I0219 16:01:00.195539 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8984eff3-6c82-4e2f-8bd6-1e820a450874-config-data\") pod \"keystone-cron-29525281-26qqv\" (UID: \"8984eff3-6c82-4e2f-8bd6-1e820a450874\") " pod="openstack/keystone-cron-29525281-26qqv" Feb 19 16:01:00 crc kubenswrapper[4810]: I0219 16:01:00.195724 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smlzq\" (UniqueName: \"kubernetes.io/projected/8984eff3-6c82-4e2f-8bd6-1e820a450874-kube-api-access-smlzq\") pod \"keystone-cron-29525281-26qqv\" (UID: \"8984eff3-6c82-4e2f-8bd6-1e820a450874\") " pod="openstack/keystone-cron-29525281-26qqv" Feb 19 16:01:00 crc kubenswrapper[4810]: I0219 16:01:00.195823 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8984eff3-6c82-4e2f-8bd6-1e820a450874-combined-ca-bundle\") pod \"keystone-cron-29525281-26qqv\" (UID: \"8984eff3-6c82-4e2f-8bd6-1e820a450874\") " pod="openstack/keystone-cron-29525281-26qqv" Feb 19 16:01:00 crc kubenswrapper[4810]: I0219 16:01:00.196018 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8984eff3-6c82-4e2f-8bd6-1e820a450874-fernet-keys\") pod \"keystone-cron-29525281-26qqv\" (UID: \"8984eff3-6c82-4e2f-8bd6-1e820a450874\") " pod="openstack/keystone-cron-29525281-26qqv" Feb 19 16:01:00 crc kubenswrapper[4810]: I0219 16:01:00.197069 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29525281-26qqv"] Feb 19 16:01:00 crc kubenswrapper[4810]: I0219 16:01:00.298203 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8984eff3-6c82-4e2f-8bd6-1e820a450874-fernet-keys\") pod \"keystone-cron-29525281-26qqv\" (UID: \"8984eff3-6c82-4e2f-8bd6-1e820a450874\") " pod="openstack/keystone-cron-29525281-26qqv" Feb 19 16:01:00 crc kubenswrapper[4810]: I0219 16:01:00.298427 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8984eff3-6c82-4e2f-8bd6-1e820a450874-config-data\") pod \"keystone-cron-29525281-26qqv\" (UID: \"8984eff3-6c82-4e2f-8bd6-1e820a450874\") " pod="openstack/keystone-cron-29525281-26qqv" Feb 19 16:01:00 crc kubenswrapper[4810]: I0219 16:01:00.298529 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smlzq\" (UniqueName: \"kubernetes.io/projected/8984eff3-6c82-4e2f-8bd6-1e820a450874-kube-api-access-smlzq\") pod \"keystone-cron-29525281-26qqv\" (UID: \"8984eff3-6c82-4e2f-8bd6-1e820a450874\") " pod="openstack/keystone-cron-29525281-26qqv" Feb 19 16:01:00 crc kubenswrapper[4810]: I0219 16:01:00.298623 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8984eff3-6c82-4e2f-8bd6-1e820a450874-combined-ca-bundle\") pod \"keystone-cron-29525281-26qqv\" (UID: \"8984eff3-6c82-4e2f-8bd6-1e820a450874\") " pod="openstack/keystone-cron-29525281-26qqv" Feb 19 16:01:00 crc kubenswrapper[4810]: I0219 16:01:00.307367 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8984eff3-6c82-4e2f-8bd6-1e820a450874-fernet-keys\") pod \"keystone-cron-29525281-26qqv\" (UID: \"8984eff3-6c82-4e2f-8bd6-1e820a450874\") " pod="openstack/keystone-cron-29525281-26qqv" Feb 19 16:01:00 crc kubenswrapper[4810]: I0219 16:01:00.308591 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8984eff3-6c82-4e2f-8bd6-1e820a450874-config-data\") pod \"keystone-cron-29525281-26qqv\" (UID: \"8984eff3-6c82-4e2f-8bd6-1e820a450874\") " pod="openstack/keystone-cron-29525281-26qqv" Feb 19 16:01:00 crc kubenswrapper[4810]: I0219 16:01:00.309729 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8984eff3-6c82-4e2f-8bd6-1e820a450874-combined-ca-bundle\") pod \"keystone-cron-29525281-26qqv\" (UID: \"8984eff3-6c82-4e2f-8bd6-1e820a450874\") " pod="openstack/keystone-cron-29525281-26qqv" Feb 19 16:01:00 crc kubenswrapper[4810]: I0219 16:01:00.319193 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smlzq\" (UniqueName: \"kubernetes.io/projected/8984eff3-6c82-4e2f-8bd6-1e820a450874-kube-api-access-smlzq\") pod \"keystone-cron-29525281-26qqv\" (UID: \"8984eff3-6c82-4e2f-8bd6-1e820a450874\") " pod="openstack/keystone-cron-29525281-26qqv" Feb 19 16:01:00 crc kubenswrapper[4810]: I0219 16:01:00.510546 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29525281-26qqv" Feb 19 16:01:00 crc kubenswrapper[4810]: I0219 16:01:00.792289 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29525281-26qqv"] Feb 19 16:01:01 crc kubenswrapper[4810]: I0219 16:01:01.124509 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525281-26qqv" event={"ID":"8984eff3-6c82-4e2f-8bd6-1e820a450874","Type":"ContainerStarted","Data":"917515301fc786ad918811d2b3e221a682d1b36e815ac4c50924d29590c00e46"} Feb 19 16:01:01 crc kubenswrapper[4810]: I0219 16:01:01.124583 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525281-26qqv" event={"ID":"8984eff3-6c82-4e2f-8bd6-1e820a450874","Type":"ContainerStarted","Data":"7f756605f4dd1141e04daa9451f347481858b07b73d74e10611d02aa8b166bdd"} Feb 19 16:01:01 crc kubenswrapper[4810]: I0219 16:01:01.156166 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29525281-26qqv" podStartSLOduration=1.156139913 podStartE2EDuration="1.156139913s" podCreationTimestamp="2026-02-19 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 16:01:01.15481363 +0000 UTC m=+3090.636843754" watchObservedRunningTime="2026-02-19 16:01:01.156139913 +0000 UTC m=+3090.638170077" Feb 19 16:01:04 crc kubenswrapper[4810]: I0219 16:01:04.173723 4810 generic.go:334] "Generic (PLEG): container finished" podID="8984eff3-6c82-4e2f-8bd6-1e820a450874" containerID="917515301fc786ad918811d2b3e221a682d1b36e815ac4c50924d29590c00e46" exitCode=0 Feb 19 16:01:04 crc kubenswrapper[4810]: I0219 16:01:04.173811 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525281-26qqv" event={"ID":"8984eff3-6c82-4e2f-8bd6-1e820a450874","Type":"ContainerDied","Data":"917515301fc786ad918811d2b3e221a682d1b36e815ac4c50924d29590c00e46"} Feb 19 16:01:05 crc kubenswrapper[4810]: 
I0219 16:01:05.634166 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29525281-26qqv" Feb 19 16:01:05 crc kubenswrapper[4810]: I0219 16:01:05.818507 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8984eff3-6c82-4e2f-8bd6-1e820a450874-combined-ca-bundle\") pod \"8984eff3-6c82-4e2f-8bd6-1e820a450874\" (UID: \"8984eff3-6c82-4e2f-8bd6-1e820a450874\") " Feb 19 16:01:05 crc kubenswrapper[4810]: I0219 16:01:05.818748 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8984eff3-6c82-4e2f-8bd6-1e820a450874-fernet-keys\") pod \"8984eff3-6c82-4e2f-8bd6-1e820a450874\" (UID: \"8984eff3-6c82-4e2f-8bd6-1e820a450874\") " Feb 19 16:01:05 crc kubenswrapper[4810]: I0219 16:01:05.818992 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smlzq\" (UniqueName: \"kubernetes.io/projected/8984eff3-6c82-4e2f-8bd6-1e820a450874-kube-api-access-smlzq\") pod \"8984eff3-6c82-4e2f-8bd6-1e820a450874\" (UID: \"8984eff3-6c82-4e2f-8bd6-1e820a450874\") " Feb 19 16:01:05 crc kubenswrapper[4810]: I0219 16:01:05.819233 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8984eff3-6c82-4e2f-8bd6-1e820a450874-config-data\") pod \"8984eff3-6c82-4e2f-8bd6-1e820a450874\" (UID: \"8984eff3-6c82-4e2f-8bd6-1e820a450874\") " Feb 19 16:01:05 crc kubenswrapper[4810]: I0219 16:01:05.825695 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8984eff3-6c82-4e2f-8bd6-1e820a450874-kube-api-access-smlzq" (OuterVolumeSpecName: "kube-api-access-smlzq") pod "8984eff3-6c82-4e2f-8bd6-1e820a450874" (UID: "8984eff3-6c82-4e2f-8bd6-1e820a450874"). InnerVolumeSpecName "kube-api-access-smlzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:01:05 crc kubenswrapper[4810]: I0219 16:01:05.827381 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8984eff3-6c82-4e2f-8bd6-1e820a450874-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8984eff3-6c82-4e2f-8bd6-1e820a450874" (UID: "8984eff3-6c82-4e2f-8bd6-1e820a450874"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 16:01:05 crc kubenswrapper[4810]: I0219 16:01:05.865059 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8984eff3-6c82-4e2f-8bd6-1e820a450874-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8984eff3-6c82-4e2f-8bd6-1e820a450874" (UID: "8984eff3-6c82-4e2f-8bd6-1e820a450874"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 16:01:05 crc kubenswrapper[4810]: I0219 16:01:05.890179 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8984eff3-6c82-4e2f-8bd6-1e820a450874-config-data" (OuterVolumeSpecName: "config-data") pod "8984eff3-6c82-4e2f-8bd6-1e820a450874" (UID: "8984eff3-6c82-4e2f-8bd6-1e820a450874"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 16:01:05 crc kubenswrapper[4810]: I0219 16:01:05.922135 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8984eff3-6c82-4e2f-8bd6-1e820a450874-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 16:01:05 crc kubenswrapper[4810]: I0219 16:01:05.922169 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8984eff3-6c82-4e2f-8bd6-1e820a450874-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 16:01:05 crc kubenswrapper[4810]: I0219 16:01:05.922181 4810 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8984eff3-6c82-4e2f-8bd6-1e820a450874-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 16:01:05 crc kubenswrapper[4810]: I0219 16:01:05.922190 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smlzq\" (UniqueName: \"kubernetes.io/projected/8984eff3-6c82-4e2f-8bd6-1e820a450874-kube-api-access-smlzq\") on node \"crc\" DevicePath \"\"" Feb 19 16:01:06 crc kubenswrapper[4810]: I0219 16:01:06.195758 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525281-26qqv" event={"ID":"8984eff3-6c82-4e2f-8bd6-1e820a450874","Type":"ContainerDied","Data":"7f756605f4dd1141e04daa9451f347481858b07b73d74e10611d02aa8b166bdd"} Feb 19 16:01:06 crc kubenswrapper[4810]: I0219 16:01:06.195800 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f756605f4dd1141e04daa9451f347481858b07b73d74e10611d02aa8b166bdd" Feb 19 16:01:06 crc kubenswrapper[4810]: I0219 16:01:06.195908 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29525281-26qqv" Feb 19 16:01:07 crc kubenswrapper[4810]: I0219 16:01:07.439357 4810 scope.go:117] "RemoveContainer" containerID="c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09" Feb 19 16:01:07 crc kubenswrapper[4810]: E0219 16:01:07.440235 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:01:20 crc kubenswrapper[4810]: I0219 16:01:20.444283 4810 scope.go:117] "RemoveContainer" containerID="c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09" Feb 19 16:01:20 crc kubenswrapper[4810]: E0219 16:01:20.445094 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:01:32 crc kubenswrapper[4810]: I0219 16:01:32.440666 4810 scope.go:117] "RemoveContainer" containerID="c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09" Feb 19 16:01:32 crc kubenswrapper[4810]: E0219 16:01:32.442089 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:01:46 crc kubenswrapper[4810]: I0219 16:01:46.441004 4810 scope.go:117] "RemoveContainer" containerID="c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09" Feb 19 16:01:46 crc kubenswrapper[4810]: E0219 16:01:46.441999 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:01:59 crc kubenswrapper[4810]: I0219 16:01:59.439980 4810 scope.go:117] "RemoveContainer" containerID="c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09" Feb 19 16:01:59 crc kubenswrapper[4810]: E0219 16:01:59.441108 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:02:13 crc kubenswrapper[4810]: I0219 16:02:13.439005 4810 scope.go:117] "RemoveContainer" containerID="c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09" Feb 19 16:02:13 crc kubenswrapper[4810]: E0219 16:02:13.439837 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:02:25 crc kubenswrapper[4810]: I0219 16:02:25.440526 4810 scope.go:117] "RemoveContainer" containerID="c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09" Feb 19 16:02:25 crc kubenswrapper[4810]: E0219 16:02:25.441367 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:02:37 crc kubenswrapper[4810]: I0219 16:02:37.440293 4810 scope.go:117] "RemoveContainer" containerID="c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09" Feb 19 16:02:37 crc kubenswrapper[4810]: E0219 16:02:37.442185 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
Feb 19 16:02:50 crc kubenswrapper[4810]: I0219 16:02:50.439309 4810 scope.go:117] "RemoveContainer" containerID="c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09"
Feb 19 16:02:50 crc kubenswrapper[4810]: E0219 16:02:50.440297 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
Feb 19 16:03:02 crc kubenswrapper[4810]: I0219 16:03:02.439943 4810 scope.go:117] "RemoveContainer" containerID="c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09"
Feb 19 16:03:02 crc kubenswrapper[4810]: E0219 16:03:02.441099 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
Feb 19 16:03:14 crc kubenswrapper[4810]: I0219 16:03:14.439523 4810 scope.go:117] "RemoveContainer" containerID="c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09"
Feb 19 16:03:14 crc kubenswrapper[4810]: E0219 16:03:14.440633 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
Feb 19 16:03:26 crc kubenswrapper[4810]: I0219 16:03:26.440365 4810 scope.go:117] "RemoveContainer" containerID="c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09"
Feb 19 16:03:26 crc kubenswrapper[4810]: E0219 16:03:26.441089 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
Feb 19 16:03:40 crc kubenswrapper[4810]: I0219 16:03:40.440878 4810 scope.go:117] "RemoveContainer" containerID="c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09"
Feb 19 16:03:40 crc kubenswrapper[4810]: E0219 16:03:40.442410 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
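
The "RemoveContainer" / "Error syncing pod" pairs above are pod-worker sync attempts that the kubelet rejects because the machine-config-daemon container is still inside its CrashLoopBackOff window; the window grows exponentially up to the 5m0s cap quoted in the error text, so the same pair can repeat for minutes before a real restart happens (here at 16:03:53 below). A minimal Go sketch of that back-off ladder, assuming the upstream kubelet defaults of a 10s initial delay and a doubling factor; only the 5m cap is actually visible in this log:

package main

import (
	"fmt"
	"time"
)

func main() {
	const (
		base     = 10 * time.Second // assumed kubelet default, not shown in the log
		maxDelay = 5 * time.Minute  // matches the "back-off 5m0s" in the messages above
	)
	// Walk the delay ladder: 10s, 20s, 40s, ... until the cap takes over.
	for delay := base; ; delay *= 2 {
		if delay >= maxDelay {
			fmt.Printf("all further restarts: wait %v (capped)\n", maxDelay)
			break
		}
		fmt.Printf("next restart: wait %v\n", delay)
	}
}
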
podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:03:53 crc kubenswrapper[4810]: I0219 16:03:53.439954 4810 scope.go:117] "RemoveContainer" containerID="c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09" Feb 19 16:03:54 crc kubenswrapper[4810]: I0219 16:03:54.616974 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerStarted","Data":"2578d4284c44e38a7496c83ee59e7a00d386a9b2aebd20063f610b39b9a8d15a"} Feb 19 16:06:19 crc kubenswrapper[4810]: I0219 16:06:19.537014 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 16:06:19 crc kubenswrapper[4810]: I0219 16:06:19.537547 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 16:06:44 crc kubenswrapper[4810]: I0219 16:06:44.189203 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-78p9d"] Feb 19 16:06:44 crc kubenswrapper[4810]: E0219 16:06:44.191319 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8984eff3-6c82-4e2f-8bd6-1e820a450874" containerName="keystone-cron" Feb 19 16:06:44 crc kubenswrapper[4810]: I0219 16:06:44.192598 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8984eff3-6c82-4e2f-8bd6-1e820a450874" containerName="keystone-cron" Feb 19 16:06:44 crc kubenswrapper[4810]: I0219 16:06:44.194686 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8984eff3-6c82-4e2f-8bd6-1e820a450874" containerName="keystone-cron" Feb 19 16:06:44 crc kubenswrapper[4810]: I0219 16:06:44.206038 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-78p9d" Feb 19 16:06:44 crc kubenswrapper[4810]: I0219 16:06:44.241357 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-78p9d"] Feb 19 16:06:44 crc kubenswrapper[4810]: I0219 16:06:44.272148 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4070bcdc-bd83-4c82-920b-8cd10671c498-utilities\") pod \"community-operators-78p9d\" (UID: \"4070bcdc-bd83-4c82-920b-8cd10671c498\") " pod="openshift-marketplace/community-operators-78p9d" Feb 19 16:06:44 crc kubenswrapper[4810]: I0219 16:06:44.272657 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4070bcdc-bd83-4c82-920b-8cd10671c498-catalog-content\") pod \"community-operators-78p9d\" (UID: \"4070bcdc-bd83-4c82-920b-8cd10671c498\") " pod="openshift-marketplace/community-operators-78p9d" Feb 19 16:06:44 crc kubenswrapper[4810]: I0219 16:06:44.272762 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88slz\" (UniqueName: \"kubernetes.io/projected/4070bcdc-bd83-4c82-920b-8cd10671c498-kube-api-access-88slz\") pod \"community-operators-78p9d\" (UID: \"4070bcdc-bd83-4c82-920b-8cd10671c498\") " pod="openshift-marketplace/community-operators-78p9d" Feb 19 16:06:44 crc kubenswrapper[4810]: I0219 16:06:44.374304 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4070bcdc-bd83-4c82-920b-8cd10671c498-catalog-content\") pod \"community-operators-78p9d\" (UID: \"4070bcdc-bd83-4c82-920b-8cd10671c498\") " pod="openshift-marketplace/community-operators-78p9d" Feb 19 16:06:44 crc kubenswrapper[4810]: I0219 16:06:44.374430 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88slz\" (UniqueName: \"kubernetes.io/projected/4070bcdc-bd83-4c82-920b-8cd10671c498-kube-api-access-88slz\") pod \"community-operators-78p9d\" (UID: \"4070bcdc-bd83-4c82-920b-8cd10671c498\") " pod="openshift-marketplace/community-operators-78p9d" Feb 19 16:06:44 crc kubenswrapper[4810]: I0219 16:06:44.374501 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4070bcdc-bd83-4c82-920b-8cd10671c498-utilities\") pod \"community-operators-78p9d\" (UID: \"4070bcdc-bd83-4c82-920b-8cd10671c498\") " pod="openshift-marketplace/community-operators-78p9d" Feb 19 16:06:44 crc kubenswrapper[4810]: I0219 16:06:44.374961 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4070bcdc-bd83-4c82-920b-8cd10671c498-catalog-content\") pod \"community-operators-78p9d\" (UID: \"4070bcdc-bd83-4c82-920b-8cd10671c498\") " pod="openshift-marketplace/community-operators-78p9d" Feb 19 16:06:44 crc kubenswrapper[4810]: I0219 16:06:44.374981 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4070bcdc-bd83-4c82-920b-8cd10671c498-utilities\") pod \"community-operators-78p9d\" (UID: \"4070bcdc-bd83-4c82-920b-8cd10671c498\") " pod="openshift-marketplace/community-operators-78p9d" Feb 19 16:06:44 crc kubenswrapper[4810]: I0219 16:06:44.403719 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-88slz\" (UniqueName: \"kubernetes.io/projected/4070bcdc-bd83-4c82-920b-8cd10671c498-kube-api-access-88slz\") pod \"community-operators-78p9d\" (UID: \"4070bcdc-bd83-4c82-920b-8cd10671c498\") " pod="openshift-marketplace/community-operators-78p9d" Feb 19 16:06:44 crc kubenswrapper[4810]: I0219 16:06:44.547109 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-78p9d" Feb 19 16:06:45 crc kubenswrapper[4810]: I0219 16:06:45.144385 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-78p9d"] Feb 19 16:06:45 crc kubenswrapper[4810]: I0219 16:06:45.499038 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78p9d" event={"ID":"4070bcdc-bd83-4c82-920b-8cd10671c498","Type":"ContainerStarted","Data":"0c95ccc80153fc2aa4312b67f7d6c9a87f9473946f133fded513db482a269be4"} Feb 19 16:06:46 crc kubenswrapper[4810]: I0219 16:06:46.517185 4810 generic.go:334] "Generic (PLEG): container finished" podID="4070bcdc-bd83-4c82-920b-8cd10671c498" containerID="ab74f7f85dc91d920ea16df4b6ea4adb1452fa6de4e5f98251385b261184b652" exitCode=0 Feb 19 16:06:46 crc kubenswrapper[4810]: I0219 16:06:46.517281 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78p9d" event={"ID":"4070bcdc-bd83-4c82-920b-8cd10671c498","Type":"ContainerDied","Data":"ab74f7f85dc91d920ea16df4b6ea4adb1452fa6de4e5f98251385b261184b652"} Feb 19 16:06:46 crc kubenswrapper[4810]: I0219 16:06:46.523102 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 16:06:49 crc kubenswrapper[4810]: I0219 16:06:49.537894 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 16:06:49 crc kubenswrapper[4810]: I0219 16:06:49.538514 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 16:06:53 crc kubenswrapper[4810]: I0219 16:06:53.586297 4810 generic.go:334] "Generic (PLEG): container finished" podID="4070bcdc-bd83-4c82-920b-8cd10671c498" containerID="702b9f8e53344f3c21f1cd920ff8a37a750c3a8370aa348e1d772d912bfa2ac0" exitCode=0 Feb 19 16:06:53 crc kubenswrapper[4810]: I0219 16:06:53.586655 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78p9d" event={"ID":"4070bcdc-bd83-4c82-920b-8cd10671c498","Type":"ContainerDied","Data":"702b9f8e53344f3c21f1cd920ff8a37a750c3a8370aa348e1d772d912bfa2ac0"} Feb 19 16:06:55 crc kubenswrapper[4810]: I0219 16:06:55.617615 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78p9d" event={"ID":"4070bcdc-bd83-4c82-920b-8cd10671c498","Type":"ContainerStarted","Data":"2d183229f511b7d5a8cb0fb396e7514676855b732318632722a576d32921f68f"} Feb 19 16:06:55 crc kubenswrapper[4810]: I0219 16:06:55.645205 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-78p9d" podStartSLOduration=3.890360568 podStartE2EDuration="11.645186831s" podCreationTimestamp="2026-02-19 16:06:44 +0000 UTC" firstStartedPulling="2026-02-19 16:06:46.522700395 +0000 UTC m=+3436.004730549" lastFinishedPulling="2026-02-19 16:06:54.277526648 +0000 UTC m=+3443.759556812" observedRunningTime="2026-02-19 16:06:55.638526356 +0000 UTC m=+3445.120556480" watchObservedRunningTime="2026-02-19 16:06:55.645186831 +0000 UTC m=+3445.127216955" Feb 19 16:07:04 crc kubenswrapper[4810]: I0219 16:07:04.548196 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-78p9d" Feb 19 16:07:04 crc kubenswrapper[4810]: I0219 16:07:04.549026 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-78p9d" Feb 19 16:07:04 crc kubenswrapper[4810]: I0219 16:07:04.623826 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-78p9d" Feb 19 16:07:04 crc kubenswrapper[4810]: I0219 16:07:04.792797 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-78p9d" Feb 19 16:07:04 crc kubenswrapper[4810]: I0219 16:07:04.886756 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-78p9d"] Feb 19 16:07:04 crc kubenswrapper[4810]: I0219 16:07:04.937001 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v68v6"] Feb 19 16:07:04 crc kubenswrapper[4810]: I0219 16:07:04.937221 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v68v6" podUID="935efdc2-5596-4207-a27b-68a8a39b6529" containerName="registry-server" containerID="cri-o://fdb2805a108d53dbae6af5f6571743f61b51164e75890b28023d46aa9db3310a" gracePeriod=2 Feb 19 16:07:05 crc kubenswrapper[4810]: I0219 16:07:05.496445 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v68v6" Feb 19 16:07:05 crc kubenswrapper[4810]: I0219 16:07:05.612033 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/935efdc2-5596-4207-a27b-68a8a39b6529-utilities\") pod \"935efdc2-5596-4207-a27b-68a8a39b6529\" (UID: \"935efdc2-5596-4207-a27b-68a8a39b6529\") " Feb 19 16:07:05 crc kubenswrapper[4810]: I0219 16:07:05.612215 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlxt4\" (UniqueName: \"kubernetes.io/projected/935efdc2-5596-4207-a27b-68a8a39b6529-kube-api-access-dlxt4\") pod \"935efdc2-5596-4207-a27b-68a8a39b6529\" (UID: \"935efdc2-5596-4207-a27b-68a8a39b6529\") " Feb 19 16:07:05 crc kubenswrapper[4810]: I0219 16:07:05.612272 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/935efdc2-5596-4207-a27b-68a8a39b6529-catalog-content\") pod \"935efdc2-5596-4207-a27b-68a8a39b6529\" (UID: \"935efdc2-5596-4207-a27b-68a8a39b6529\") " Feb 19 16:07:05 crc kubenswrapper[4810]: I0219 16:07:05.612701 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/935efdc2-5596-4207-a27b-68a8a39b6529-utilities" (OuterVolumeSpecName: "utilities") pod "935efdc2-5596-4207-a27b-68a8a39b6529" (UID: "935efdc2-5596-4207-a27b-68a8a39b6529"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:07:05 crc kubenswrapper[4810]: I0219 16:07:05.625054 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/935efdc2-5596-4207-a27b-68a8a39b6529-kube-api-access-dlxt4" (OuterVolumeSpecName: "kube-api-access-dlxt4") pod "935efdc2-5596-4207-a27b-68a8a39b6529" (UID: "935efdc2-5596-4207-a27b-68a8a39b6529"). InnerVolumeSpecName "kube-api-access-dlxt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:07:05 crc kubenswrapper[4810]: I0219 16:07:05.692679 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/935efdc2-5596-4207-a27b-68a8a39b6529-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "935efdc2-5596-4207-a27b-68a8a39b6529" (UID: "935efdc2-5596-4207-a27b-68a8a39b6529"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:07:05 crc kubenswrapper[4810]: I0219 16:07:05.714143 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/935efdc2-5596-4207-a27b-68a8a39b6529-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 16:07:05 crc kubenswrapper[4810]: I0219 16:07:05.714169 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlxt4\" (UniqueName: \"kubernetes.io/projected/935efdc2-5596-4207-a27b-68a8a39b6529-kube-api-access-dlxt4\") on node \"crc\" DevicePath \"\"" Feb 19 16:07:05 crc kubenswrapper[4810]: I0219 16:07:05.714180 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/935efdc2-5596-4207-a27b-68a8a39b6529-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 16:07:05 crc kubenswrapper[4810]: I0219 16:07:05.732139 4810 generic.go:334] "Generic (PLEG): container finished" podID="935efdc2-5596-4207-a27b-68a8a39b6529" containerID="fdb2805a108d53dbae6af5f6571743f61b51164e75890b28023d46aa9db3310a" exitCode=0 Feb 19 16:07:05 crc kubenswrapper[4810]: I0219 16:07:05.733088 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v68v6" Feb 19 16:07:05 crc kubenswrapper[4810]: I0219 16:07:05.735862 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v68v6" event={"ID":"935efdc2-5596-4207-a27b-68a8a39b6529","Type":"ContainerDied","Data":"fdb2805a108d53dbae6af5f6571743f61b51164e75890b28023d46aa9db3310a"} Feb 19 16:07:05 crc kubenswrapper[4810]: I0219 16:07:05.735941 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v68v6" event={"ID":"935efdc2-5596-4207-a27b-68a8a39b6529","Type":"ContainerDied","Data":"514f59846afe5a035371037e15fe72e417c83d08b819931a1c6ac5a2603476da"} Feb 19 16:07:05 crc kubenswrapper[4810]: I0219 16:07:05.735966 4810 scope.go:117] "RemoveContainer" containerID="fdb2805a108d53dbae6af5f6571743f61b51164e75890b28023d46aa9db3310a" Feb 19 16:07:05 crc kubenswrapper[4810]: I0219 16:07:05.766920 4810 scope.go:117] "RemoveContainer" containerID="454b485efdf0b632d0bf68a7c004625d4bec4676c25fb0c84c862bf5d9832936" Feb 19 16:07:05 crc kubenswrapper[4810]: I0219 16:07:05.792944 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v68v6"] Feb 19 16:07:05 crc kubenswrapper[4810]: I0219 16:07:05.796184 4810 scope.go:117] "RemoveContainer" containerID="1bf8e18430d52e1bdffbf7e432c09ccbb227c1161c0321cdc60f7b17956e0999" Feb 19 16:07:05 crc kubenswrapper[4810]: I0219 16:07:05.806347 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v68v6"] Feb 19 16:07:05 crc kubenswrapper[4810]: I0219 16:07:05.846447 4810 scope.go:117] "RemoveContainer" containerID="fdb2805a108d53dbae6af5f6571743f61b51164e75890b28023d46aa9db3310a" Feb 19 16:07:05 crc kubenswrapper[4810]: E0219 16:07:05.847007 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdb2805a108d53dbae6af5f6571743f61b51164e75890b28023d46aa9db3310a\": container with ID starting with fdb2805a108d53dbae6af5f6571743f61b51164e75890b28023d46aa9db3310a not found: ID does not exist" containerID="fdb2805a108d53dbae6af5f6571743f61b51164e75890b28023d46aa9db3310a" Feb 19 16:07:05 crc kubenswrapper[4810]: I0219 16:07:05.847046 
4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdb2805a108d53dbae6af5f6571743f61b51164e75890b28023d46aa9db3310a"} err="failed to get container status \"fdb2805a108d53dbae6af5f6571743f61b51164e75890b28023d46aa9db3310a\": rpc error: code = NotFound desc = could not find container \"fdb2805a108d53dbae6af5f6571743f61b51164e75890b28023d46aa9db3310a\": container with ID starting with fdb2805a108d53dbae6af5f6571743f61b51164e75890b28023d46aa9db3310a not found: ID does not exist" Feb 19 16:07:05 crc kubenswrapper[4810]: I0219 16:07:05.847071 4810 scope.go:117] "RemoveContainer" containerID="454b485efdf0b632d0bf68a7c004625d4bec4676c25fb0c84c862bf5d9832936" Feb 19 16:07:05 crc kubenswrapper[4810]: E0219 16:07:05.847406 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"454b485efdf0b632d0bf68a7c004625d4bec4676c25fb0c84c862bf5d9832936\": container with ID starting with 454b485efdf0b632d0bf68a7c004625d4bec4676c25fb0c84c862bf5d9832936 not found: ID does not exist" containerID="454b485efdf0b632d0bf68a7c004625d4bec4676c25fb0c84c862bf5d9832936" Feb 19 16:07:05 crc kubenswrapper[4810]: I0219 16:07:05.847435 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"454b485efdf0b632d0bf68a7c004625d4bec4676c25fb0c84c862bf5d9832936"} err="failed to get container status \"454b485efdf0b632d0bf68a7c004625d4bec4676c25fb0c84c862bf5d9832936\": rpc error: code = NotFound desc = could not find container \"454b485efdf0b632d0bf68a7c004625d4bec4676c25fb0c84c862bf5d9832936\": container with ID starting with 454b485efdf0b632d0bf68a7c004625d4bec4676c25fb0c84c862bf5d9832936 not found: ID does not exist" Feb 19 16:07:05 crc kubenswrapper[4810]: I0219 16:07:05.847472 4810 scope.go:117] "RemoveContainer" containerID="1bf8e18430d52e1bdffbf7e432c09ccbb227c1161c0321cdc60f7b17956e0999" Feb 19 16:07:05 crc kubenswrapper[4810]: E0219 16:07:05.847845 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bf8e18430d52e1bdffbf7e432c09ccbb227c1161c0321cdc60f7b17956e0999\": container with ID starting with 1bf8e18430d52e1bdffbf7e432c09ccbb227c1161c0321cdc60f7b17956e0999 not found: ID does not exist" containerID="1bf8e18430d52e1bdffbf7e432c09ccbb227c1161c0321cdc60f7b17956e0999" Feb 19 16:07:05 crc kubenswrapper[4810]: I0219 16:07:05.847875 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bf8e18430d52e1bdffbf7e432c09ccbb227c1161c0321cdc60f7b17956e0999"} err="failed to get container status \"1bf8e18430d52e1bdffbf7e432c09ccbb227c1161c0321cdc60f7b17956e0999\": rpc error: code = NotFound desc = could not find container \"1bf8e18430d52e1bdffbf7e432c09ccbb227c1161c0321cdc60f7b17956e0999\": container with ID starting with 1bf8e18430d52e1bdffbf7e432c09ccbb227c1161c0321cdc60f7b17956e0999 not found: ID does not exist" Feb 19 16:07:07 crc kubenswrapper[4810]: I0219 16:07:07.455971 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="935efdc2-5596-4207-a27b-68a8a39b6529" path="/var/lib/kubelet/pods/935efdc2-5596-4207-a27b-68a8a39b6529/volumes" Feb 19 16:07:19 crc kubenswrapper[4810]: I0219 16:07:19.537720 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 16:07:19 crc kubenswrapper[4810]: I0219 16:07:19.538300 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 16:07:19 crc kubenswrapper[4810]: I0219 16:07:19.538364 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t499d"
Feb 19 16:07:19 crc kubenswrapper[4810]: I0219 16:07:19.539128 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2578d4284c44e38a7496c83ee59e7a00d386a9b2aebd20063f610b39b9a8d15a"} pod="openshift-machine-config-operator/machine-config-daemon-t499d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 16:07:19 crc kubenswrapper[4810]: I0219 16:07:19.539189 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" containerID="cri-o://2578d4284c44e38a7496c83ee59e7a00d386a9b2aebd20063f610b39b9a8d15a" gracePeriod=600
Feb 19 16:07:19 crc kubenswrapper[4810]: I0219 16:07:19.895196 4810 generic.go:334] "Generic (PLEG): container finished" podID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerID="2578d4284c44e38a7496c83ee59e7a00d386a9b2aebd20063f610b39b9a8d15a" exitCode=0
Feb 19 16:07:19 crc kubenswrapper[4810]: I0219 16:07:19.895245 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerDied","Data":"2578d4284c44e38a7496c83ee59e7a00d386a9b2aebd20063f610b39b9a8d15a"}
Feb 19 16:07:19 crc kubenswrapper[4810]: I0219 16:07:19.895626 4810 scope.go:117] "RemoveContainer" containerID="c35ee8c1523e361fb9a2d8ec76f05b9bc5a2cc5f563a8013cd8c34860471cd09"
Feb 19 16:07:20 crc kubenswrapper[4810]: I0219 16:07:20.914244 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerStarted","Data":"8520d3639f4b062a60a4e8d48690a9ddf9c8f0411d765a4be05ac76b8b0faa8e"}
Feb 19 16:07:29 crc kubenswrapper[4810]: I0219 16:07:29.054050 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2nf72"]
Feb 19 16:07:29 crc kubenswrapper[4810]: E0219 16:07:29.055468 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="935efdc2-5596-4207-a27b-68a8a39b6529" containerName="extract-utilities"
Feb 19 16:07:29 crc kubenswrapper[4810]: I0219 16:07:29.055490 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="935efdc2-5596-4207-a27b-68a8a39b6529" containerName="extract-utilities"
Feb 19 16:07:29 crc kubenswrapper[4810]: E0219 16:07:29.055521 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="935efdc2-5596-4207-a27b-68a8a39b6529" containerName="registry-server"
Feb 19 16:07:29 crc kubenswrapper[4810]: I0219 16:07:29.055532 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="935efdc2-5596-4207-a27b-68a8a39b6529" containerName="registry-server"
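
The sequence above is the kubelet's standard liveness remediation: the probe against http://127.0.0.1:8798/health failed at 16:06:19, 16:06:49 and 16:07:19 (30s apart), and after the third failure the container is killed with its termination grace period (gracePeriod=600 here) and recreated on the next sync, producing the ContainerDied/ContainerStarted pair at 16:07:19-16:07:20. A probe spec consistent with that behaviour, as a sketch built from the k8s.io/api types; only the path and port come from the log, while the 30s period and failure threshold of 3 are inferred from the probe timing, not read from the actual machine-config-daemon manifest:

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

func main() {
	liveness := &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{ // named Handler in k8s.io/api releases before v0.23
			HTTPGet: &corev1.HTTPGetAction{
				Path: "/health",            // from the probe URL in the log
				Port: intstr.FromInt(8798), // from the probe URL in the log
			},
		},
		PeriodSeconds:    30, // inferred: failures land 30s apart in the log
		FailureThreshold: 3,  // inferred: the restart follows the third logged failure
	}
	fmt.Printf("%+v\n", liveness)
}
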
containerName="registry-server" Feb 19 16:07:29 crc kubenswrapper[4810]: E0219 16:07:29.055562 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="935efdc2-5596-4207-a27b-68a8a39b6529" containerName="extract-content" Feb 19 16:07:29 crc kubenswrapper[4810]: I0219 16:07:29.055572 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="935efdc2-5596-4207-a27b-68a8a39b6529" containerName="extract-content" Feb 19 16:07:29 crc kubenswrapper[4810]: I0219 16:07:29.055903 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="935efdc2-5596-4207-a27b-68a8a39b6529" containerName="registry-server" Feb 19 16:07:29 crc kubenswrapper[4810]: I0219 16:07:29.058396 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2nf72" Feb 19 16:07:29 crc kubenswrapper[4810]: I0219 16:07:29.084549 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2nf72"] Feb 19 16:07:29 crc kubenswrapper[4810]: I0219 16:07:29.178659 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6de6280-d32b-4c86-b93b-4c0a06ee631a-catalog-content\") pod \"redhat-marketplace-2nf72\" (UID: \"f6de6280-d32b-4c86-b93b-4c0a06ee631a\") " pod="openshift-marketplace/redhat-marketplace-2nf72" Feb 19 16:07:29 crc kubenswrapper[4810]: I0219 16:07:29.179108 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6de6280-d32b-4c86-b93b-4c0a06ee631a-utilities\") pod \"redhat-marketplace-2nf72\" (UID: \"f6de6280-d32b-4c86-b93b-4c0a06ee631a\") " pod="openshift-marketplace/redhat-marketplace-2nf72" Feb 19 16:07:29 crc kubenswrapper[4810]: I0219 16:07:29.179498 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk984\" (UniqueName: \"kubernetes.io/projected/f6de6280-d32b-4c86-b93b-4c0a06ee631a-kube-api-access-rk984\") pod \"redhat-marketplace-2nf72\" (UID: \"f6de6280-d32b-4c86-b93b-4c0a06ee631a\") " pod="openshift-marketplace/redhat-marketplace-2nf72" Feb 19 16:07:29 crc kubenswrapper[4810]: I0219 16:07:29.281700 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6de6280-d32b-4c86-b93b-4c0a06ee631a-utilities\") pod \"redhat-marketplace-2nf72\" (UID: \"f6de6280-d32b-4c86-b93b-4c0a06ee631a\") " pod="openshift-marketplace/redhat-marketplace-2nf72" Feb 19 16:07:29 crc kubenswrapper[4810]: I0219 16:07:29.282113 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk984\" (UniqueName: \"kubernetes.io/projected/f6de6280-d32b-4c86-b93b-4c0a06ee631a-kube-api-access-rk984\") pod \"redhat-marketplace-2nf72\" (UID: \"f6de6280-d32b-4c86-b93b-4c0a06ee631a\") " pod="openshift-marketplace/redhat-marketplace-2nf72" Feb 19 16:07:29 crc kubenswrapper[4810]: I0219 16:07:29.282246 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6de6280-d32b-4c86-b93b-4c0a06ee631a-catalog-content\") pod \"redhat-marketplace-2nf72\" (UID: \"f6de6280-d32b-4c86-b93b-4c0a06ee631a\") " pod="openshift-marketplace/redhat-marketplace-2nf72" Feb 19 16:07:29 crc kubenswrapper[4810]: I0219 16:07:29.282528 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6de6280-d32b-4c86-b93b-4c0a06ee631a-utilities\") pod \"redhat-marketplace-2nf72\" (UID: \"f6de6280-d32b-4c86-b93b-4c0a06ee631a\") " pod="openshift-marketplace/redhat-marketplace-2nf72" Feb 19 16:07:29 crc kubenswrapper[4810]: I0219 16:07:29.282794 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6de6280-d32b-4c86-b93b-4c0a06ee631a-catalog-content\") pod \"redhat-marketplace-2nf72\" (UID: \"f6de6280-d32b-4c86-b93b-4c0a06ee631a\") " pod="openshift-marketplace/redhat-marketplace-2nf72" Feb 19 16:07:29 crc kubenswrapper[4810]: I0219 16:07:29.314298 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk984\" (UniqueName: \"kubernetes.io/projected/f6de6280-d32b-4c86-b93b-4c0a06ee631a-kube-api-access-rk984\") pod \"redhat-marketplace-2nf72\" (UID: \"f6de6280-d32b-4c86-b93b-4c0a06ee631a\") " pod="openshift-marketplace/redhat-marketplace-2nf72" Feb 19 16:07:29 crc kubenswrapper[4810]: I0219 16:07:29.430582 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2nf72" Feb 19 16:07:29 crc kubenswrapper[4810]: I0219 16:07:29.908109 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2nf72"] Feb 19 16:07:30 crc kubenswrapper[4810]: I0219 16:07:30.026406 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2nf72" event={"ID":"f6de6280-d32b-4c86-b93b-4c0a06ee631a","Type":"ContainerStarted","Data":"8253aabd5e0eccfe864a20e447838bea8c93e171ca8b4c9d578faed4739fe2a9"} Feb 19 16:07:31 crc kubenswrapper[4810]: I0219 16:07:31.039560 4810 generic.go:334] "Generic (PLEG): container finished" podID="f6de6280-d32b-4c86-b93b-4c0a06ee631a" containerID="b46f8abcb3e1715f798a9e99f09aa1eda0b33b52fe5bccf58044aa7a87c06e1c" exitCode=0 Feb 19 16:07:31 crc kubenswrapper[4810]: I0219 16:07:31.039655 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2nf72" event={"ID":"f6de6280-d32b-4c86-b93b-4c0a06ee631a","Type":"ContainerDied","Data":"b46f8abcb3e1715f798a9e99f09aa1eda0b33b52fe5bccf58044aa7a87c06e1c"} Feb 19 16:07:32 crc kubenswrapper[4810]: I0219 16:07:32.056000 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2nf72" event={"ID":"f6de6280-d32b-4c86-b93b-4c0a06ee631a","Type":"ContainerStarted","Data":"05cc9a216a60a1d1ff50d8970ba8fe384e3037ab972f04c4b82084af60c6196b"} Feb 19 16:07:32 crc kubenswrapper[4810]: I0219 16:07:32.460451 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jljh4"] Feb 19 16:07:32 crc kubenswrapper[4810]: I0219 16:07:32.464445 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jljh4" Feb 19 16:07:32 crc kubenswrapper[4810]: I0219 16:07:32.479417 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jljh4"] Feb 19 16:07:32 crc kubenswrapper[4810]: I0219 16:07:32.564352 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd-utilities\") pod \"certified-operators-jljh4\" (UID: \"c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd\") " pod="openshift-marketplace/certified-operators-jljh4" Feb 19 16:07:32 crc kubenswrapper[4810]: I0219 16:07:32.564396 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd-catalog-content\") pod \"certified-operators-jljh4\" (UID: \"c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd\") " pod="openshift-marketplace/certified-operators-jljh4" Feb 19 16:07:32 crc kubenswrapper[4810]: I0219 16:07:32.564424 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jss82\" (UniqueName: \"kubernetes.io/projected/c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd-kube-api-access-jss82\") pod \"certified-operators-jljh4\" (UID: \"c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd\") " pod="openshift-marketplace/certified-operators-jljh4" Feb 19 16:07:32 crc kubenswrapper[4810]: I0219 16:07:32.666494 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd-utilities\") pod \"certified-operators-jljh4\" (UID: \"c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd\") " pod="openshift-marketplace/certified-operators-jljh4" Feb 19 16:07:32 crc kubenswrapper[4810]: I0219 16:07:32.666553 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd-catalog-content\") pod \"certified-operators-jljh4\" (UID: \"c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd\") " pod="openshift-marketplace/certified-operators-jljh4" Feb 19 16:07:32 crc kubenswrapper[4810]: I0219 16:07:32.666594 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jss82\" (UniqueName: \"kubernetes.io/projected/c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd-kube-api-access-jss82\") pod \"certified-operators-jljh4\" (UID: \"c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd\") " pod="openshift-marketplace/certified-operators-jljh4" Feb 19 16:07:32 crc kubenswrapper[4810]: I0219 16:07:32.667092 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd-utilities\") pod \"certified-operators-jljh4\" (UID: \"c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd\") " pod="openshift-marketplace/certified-operators-jljh4" Feb 19 16:07:32 crc kubenswrapper[4810]: I0219 16:07:32.667152 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd-catalog-content\") pod \"certified-operators-jljh4\" (UID: \"c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd\") " pod="openshift-marketplace/certified-operators-jljh4" Feb 19 16:07:32 crc kubenswrapper[4810]: I0219 16:07:32.693231 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jss82\" (UniqueName: \"kubernetes.io/projected/c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd-kube-api-access-jss82\") pod \"certified-operators-jljh4\" (UID: \"c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd\") " pod="openshift-marketplace/certified-operators-jljh4" Feb 19 16:07:32 crc kubenswrapper[4810]: I0219 16:07:32.789556 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jljh4" Feb 19 16:07:32 crc kubenswrapper[4810]: I0219 16:07:32.867276 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-64j86"] Feb 19 16:07:32 crc kubenswrapper[4810]: I0219 16:07:32.869522 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-64j86" Feb 19 16:07:32 crc kubenswrapper[4810]: I0219 16:07:32.879260 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-64j86"] Feb 19 16:07:32 crc kubenswrapper[4810]: I0219 16:07:32.971503 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w7q7\" (UniqueName: \"kubernetes.io/projected/577659f8-9dbb-46d0-b2cb-80951550957f-kube-api-access-4w7q7\") pod \"redhat-operators-64j86\" (UID: \"577659f8-9dbb-46d0-b2cb-80951550957f\") " pod="openshift-marketplace/redhat-operators-64j86" Feb 19 16:07:32 crc kubenswrapper[4810]: I0219 16:07:32.971592 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/577659f8-9dbb-46d0-b2cb-80951550957f-catalog-content\") pod \"redhat-operators-64j86\" (UID: \"577659f8-9dbb-46d0-b2cb-80951550957f\") " pod="openshift-marketplace/redhat-operators-64j86" Feb 19 16:07:32 crc kubenswrapper[4810]: I0219 16:07:32.971633 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/577659f8-9dbb-46d0-b2cb-80951550957f-utilities\") pod \"redhat-operators-64j86\" (UID: \"577659f8-9dbb-46d0-b2cb-80951550957f\") " pod="openshift-marketplace/redhat-operators-64j86" Feb 19 16:07:33 crc kubenswrapper[4810]: I0219 16:07:33.073914 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w7q7\" (UniqueName: \"kubernetes.io/projected/577659f8-9dbb-46d0-b2cb-80951550957f-kube-api-access-4w7q7\") pod \"redhat-operators-64j86\" (UID: \"577659f8-9dbb-46d0-b2cb-80951550957f\") " pod="openshift-marketplace/redhat-operators-64j86" Feb 19 16:07:33 crc kubenswrapper[4810]: I0219 16:07:33.074018 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/577659f8-9dbb-46d0-b2cb-80951550957f-catalog-content\") pod \"redhat-operators-64j86\" (UID: \"577659f8-9dbb-46d0-b2cb-80951550957f\") " pod="openshift-marketplace/redhat-operators-64j86" Feb 19 16:07:33 crc kubenswrapper[4810]: I0219 16:07:33.074068 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/577659f8-9dbb-46d0-b2cb-80951550957f-utilities\") pod \"redhat-operators-64j86\" (UID: \"577659f8-9dbb-46d0-b2cb-80951550957f\") " pod="openshift-marketplace/redhat-operators-64j86" Feb 19 16:07:33 crc kubenswrapper[4810]: I0219 16:07:33.074681 4810 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/577659f8-9dbb-46d0-b2cb-80951550957f-utilities\") pod \"redhat-operators-64j86\" (UID: \"577659f8-9dbb-46d0-b2cb-80951550957f\") " pod="openshift-marketplace/redhat-operators-64j86" Feb 19 16:07:33 crc kubenswrapper[4810]: I0219 16:07:33.075615 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/577659f8-9dbb-46d0-b2cb-80951550957f-catalog-content\") pod \"redhat-operators-64j86\" (UID: \"577659f8-9dbb-46d0-b2cb-80951550957f\") " pod="openshift-marketplace/redhat-operators-64j86" Feb 19 16:07:33 crc kubenswrapper[4810]: I0219 16:07:33.101475 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w7q7\" (UniqueName: \"kubernetes.io/projected/577659f8-9dbb-46d0-b2cb-80951550957f-kube-api-access-4w7q7\") pod \"redhat-operators-64j86\" (UID: \"577659f8-9dbb-46d0-b2cb-80951550957f\") " pod="openshift-marketplace/redhat-operators-64j86" Feb 19 16:07:33 crc kubenswrapper[4810]: I0219 16:07:33.283430 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-64j86" Feb 19 16:07:34 crc kubenswrapper[4810]: I0219 16:07:34.058633 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jljh4"] Feb 19 16:07:34 crc kubenswrapper[4810]: I0219 16:07:34.091725 4810 generic.go:334] "Generic (PLEG): container finished" podID="f6de6280-d32b-4c86-b93b-4c0a06ee631a" containerID="05cc9a216a60a1d1ff50d8970ba8fe384e3037ab972f04c4b82084af60c6196b" exitCode=0 Feb 19 16:07:34 crc kubenswrapper[4810]: I0219 16:07:34.091800 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2nf72" event={"ID":"f6de6280-d32b-4c86-b93b-4c0a06ee631a","Type":"ContainerDied","Data":"05cc9a216a60a1d1ff50d8970ba8fe384e3037ab972f04c4b82084af60c6196b"} Feb 19 16:07:34 crc kubenswrapper[4810]: I0219 16:07:34.094836 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jljh4" event={"ID":"c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd","Type":"ContainerStarted","Data":"3f2a36ddbb2854ba8d1bc8b3161a45126a4855b2d5a76816b0ab55f69aa7490f"} Feb 19 16:07:34 crc kubenswrapper[4810]: I0219 16:07:34.347666 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-64j86"] Feb 19 16:07:35 crc kubenswrapper[4810]: I0219 16:07:35.106138 4810 generic.go:334] "Generic (PLEG): container finished" podID="577659f8-9dbb-46d0-b2cb-80951550957f" containerID="0a997d370511dea75f129a7681651a204854d63f35705e130fcf606397cfec6f" exitCode=0 Feb 19 16:07:35 crc kubenswrapper[4810]: I0219 16:07:35.106306 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64j86" event={"ID":"577659f8-9dbb-46d0-b2cb-80951550957f","Type":"ContainerDied","Data":"0a997d370511dea75f129a7681651a204854d63f35705e130fcf606397cfec6f"} Feb 19 16:07:35 crc kubenswrapper[4810]: I0219 16:07:35.106586 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64j86" event={"ID":"577659f8-9dbb-46d0-b2cb-80951550957f","Type":"ContainerStarted","Data":"96a616bff386690da5f589c057f363366e36ad19ad36fdff70cbdf3bb4704f10"} Feb 19 16:07:35 crc kubenswrapper[4810]: I0219 16:07:35.108941 4810 generic.go:334] "Generic (PLEG): container finished" podID="c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd" 
containerID="6fb2edb1a24f496d9ddd2804e3d51a61772791f54decbda879e1317be0fa6cc6" exitCode=0 Feb 19 16:07:35 crc kubenswrapper[4810]: I0219 16:07:35.109510 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jljh4" event={"ID":"c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd","Type":"ContainerDied","Data":"6fb2edb1a24f496d9ddd2804e3d51a61772791f54decbda879e1317be0fa6cc6"} Feb 19 16:07:35 crc kubenswrapper[4810]: I0219 16:07:35.115550 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2nf72" event={"ID":"f6de6280-d32b-4c86-b93b-4c0a06ee631a","Type":"ContainerStarted","Data":"d7bd439d8bc0d4dd1a6b8ad202c10491abd329175439774b81dd46530d917993"} Feb 19 16:07:35 crc kubenswrapper[4810]: I0219 16:07:35.175307 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2nf72" podStartSLOduration=2.44099654 podStartE2EDuration="6.175286507s" podCreationTimestamp="2026-02-19 16:07:29 +0000 UTC" firstStartedPulling="2026-02-19 16:07:31.042684499 +0000 UTC m=+3480.524714633" lastFinishedPulling="2026-02-19 16:07:34.776974476 +0000 UTC m=+3484.259004600" observedRunningTime="2026-02-19 16:07:35.169224207 +0000 UTC m=+3484.651254351" watchObservedRunningTime="2026-02-19 16:07:35.175286507 +0000 UTC m=+3484.657316631" Feb 19 16:07:38 crc kubenswrapper[4810]: I0219 16:07:38.150727 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jljh4" event={"ID":"c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd","Type":"ContainerStarted","Data":"0436f9add9060712a8eadd4b9553ee23df2b08b125e5848fa1234b303b4f8c12"} Feb 19 16:07:38 crc kubenswrapper[4810]: I0219 16:07:38.154073 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64j86" event={"ID":"577659f8-9dbb-46d0-b2cb-80951550957f","Type":"ContainerStarted","Data":"b8577964a2dfe54d0a59674fdaf01d96e87490efbc3e6f7eab9d40ccbe5b4994"} Feb 19 16:07:39 crc kubenswrapper[4810]: I0219 16:07:39.431424 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2nf72" Feb 19 16:07:39 crc kubenswrapper[4810]: I0219 16:07:39.433942 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2nf72" Feb 19 16:07:39 crc kubenswrapper[4810]: I0219 16:07:39.505001 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2nf72" Feb 19 16:07:40 crc kubenswrapper[4810]: I0219 16:07:40.239319 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2nf72" Feb 19 16:07:44 crc kubenswrapper[4810]: I0219 16:07:44.225221 4810 generic.go:334] "Generic (PLEG): container finished" podID="c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd" containerID="0436f9add9060712a8eadd4b9553ee23df2b08b125e5848fa1234b303b4f8c12" exitCode=0 Feb 19 16:07:44 crc kubenswrapper[4810]: I0219 16:07:44.225306 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jljh4" event={"ID":"c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd","Type":"ContainerDied","Data":"0436f9add9060712a8eadd4b9553ee23df2b08b125e5848fa1234b303b4f8c12"} Feb 19 16:07:44 crc kubenswrapper[4810]: I0219 16:07:44.232597 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2nf72"] Feb 19 16:07:44 crc 
kubenswrapper[4810]: I0219 16:07:44.232809 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2nf72" podUID="f6de6280-d32b-4c86-b93b-4c0a06ee631a" containerName="registry-server" containerID="cri-o://d7bd439d8bc0d4dd1a6b8ad202c10491abd329175439774b81dd46530d917993" gracePeriod=2 Feb 19 16:07:44 crc kubenswrapper[4810]: I0219 16:07:44.737046 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2nf72" Feb 19 16:07:44 crc kubenswrapper[4810]: I0219 16:07:44.900424 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6de6280-d32b-4c86-b93b-4c0a06ee631a-utilities\") pod \"f6de6280-d32b-4c86-b93b-4c0a06ee631a\" (UID: \"f6de6280-d32b-4c86-b93b-4c0a06ee631a\") " Feb 19 16:07:44 crc kubenswrapper[4810]: I0219 16:07:44.900546 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6de6280-d32b-4c86-b93b-4c0a06ee631a-catalog-content\") pod \"f6de6280-d32b-4c86-b93b-4c0a06ee631a\" (UID: \"f6de6280-d32b-4c86-b93b-4c0a06ee631a\") " Feb 19 16:07:44 crc kubenswrapper[4810]: I0219 16:07:44.900612 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk984\" (UniqueName: \"kubernetes.io/projected/f6de6280-d32b-4c86-b93b-4c0a06ee631a-kube-api-access-rk984\") pod \"f6de6280-d32b-4c86-b93b-4c0a06ee631a\" (UID: \"f6de6280-d32b-4c86-b93b-4c0a06ee631a\") " Feb 19 16:07:44 crc kubenswrapper[4810]: I0219 16:07:44.904769 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6de6280-d32b-4c86-b93b-4c0a06ee631a-utilities" (OuterVolumeSpecName: "utilities") pod "f6de6280-d32b-4c86-b93b-4c0a06ee631a" (UID: "f6de6280-d32b-4c86-b93b-4c0a06ee631a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:07:44 crc kubenswrapper[4810]: I0219 16:07:44.914078 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6de6280-d32b-4c86-b93b-4c0a06ee631a-kube-api-access-rk984" (OuterVolumeSpecName: "kube-api-access-rk984") pod "f6de6280-d32b-4c86-b93b-4c0a06ee631a" (UID: "f6de6280-d32b-4c86-b93b-4c0a06ee631a"). InnerVolumeSpecName "kube-api-access-rk984". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:07:44 crc kubenswrapper[4810]: I0219 16:07:44.936347 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6de6280-d32b-4c86-b93b-4c0a06ee631a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6de6280-d32b-4c86-b93b-4c0a06ee631a" (UID: "f6de6280-d32b-4c86-b93b-4c0a06ee631a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:07:45 crc kubenswrapper[4810]: I0219 16:07:45.003147 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk984\" (UniqueName: \"kubernetes.io/projected/f6de6280-d32b-4c86-b93b-4c0a06ee631a-kube-api-access-rk984\") on node \"crc\" DevicePath \"\"" Feb 19 16:07:45 crc kubenswrapper[4810]: I0219 16:07:45.003232 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6de6280-d32b-4c86-b93b-4c0a06ee631a-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 16:07:45 crc kubenswrapper[4810]: I0219 16:07:45.003242 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6de6280-d32b-4c86-b93b-4c0a06ee631a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 16:07:45 crc kubenswrapper[4810]: I0219 16:07:45.239828 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jljh4" event={"ID":"c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd","Type":"ContainerStarted","Data":"a1b65af471e174bbe1242694aadff9dbcea59298cf527d53fbdffc0d91120e42"} Feb 19 16:07:45 crc kubenswrapper[4810]: I0219 16:07:45.243991 4810 generic.go:334] "Generic (PLEG): container finished" podID="f6de6280-d32b-4c86-b93b-4c0a06ee631a" containerID="d7bd439d8bc0d4dd1a6b8ad202c10491abd329175439774b81dd46530d917993" exitCode=0 Feb 19 16:07:45 crc kubenswrapper[4810]: I0219 16:07:45.244046 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2nf72" Feb 19 16:07:45 crc kubenswrapper[4810]: I0219 16:07:45.244045 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2nf72" event={"ID":"f6de6280-d32b-4c86-b93b-4c0a06ee631a","Type":"ContainerDied","Data":"d7bd439d8bc0d4dd1a6b8ad202c10491abd329175439774b81dd46530d917993"} Feb 19 16:07:45 crc kubenswrapper[4810]: I0219 16:07:45.244165 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2nf72" event={"ID":"f6de6280-d32b-4c86-b93b-4c0a06ee631a","Type":"ContainerDied","Data":"8253aabd5e0eccfe864a20e447838bea8c93e171ca8b4c9d578faed4739fe2a9"} Feb 19 16:07:45 crc kubenswrapper[4810]: I0219 16:07:45.244194 4810 scope.go:117] "RemoveContainer" containerID="d7bd439d8bc0d4dd1a6b8ad202c10491abd329175439774b81dd46530d917993" Feb 19 16:07:45 crc kubenswrapper[4810]: I0219 16:07:45.269619 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jljh4" podStartSLOduration=3.535936123 podStartE2EDuration="13.267643211s" podCreationTimestamp="2026-02-19 16:07:32 +0000 UTC" firstStartedPulling="2026-02-19 16:07:35.11257298 +0000 UTC m=+3484.594603104" lastFinishedPulling="2026-02-19 16:07:44.844280058 +0000 UTC m=+3494.326310192" observedRunningTime="2026-02-19 16:07:45.262852522 +0000 UTC m=+3494.744882656" watchObservedRunningTime="2026-02-19 16:07:45.267643211 +0000 UTC m=+3494.749673335" Feb 19 16:07:45 crc kubenswrapper[4810]: I0219 16:07:45.295301 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2nf72"] Feb 19 16:07:45 crc kubenswrapper[4810]: I0219 16:07:45.302250 4810 scope.go:117] "RemoveContainer" containerID="05cc9a216a60a1d1ff50d8970ba8fe384e3037ab972f04c4b82084af60c6196b" Feb 19 16:07:45 crc kubenswrapper[4810]: I0219 16:07:45.311507 4810 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-2nf72"] Feb 19 16:07:45 crc kubenswrapper[4810]: I0219 16:07:45.608176 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6de6280-d32b-4c86-b93b-4c0a06ee631a" path="/var/lib/kubelet/pods/f6de6280-d32b-4c86-b93b-4c0a06ee631a/volumes" Feb 19 16:07:45 crc kubenswrapper[4810]: I0219 16:07:45.693003 4810 scope.go:117] "RemoveContainer" containerID="b46f8abcb3e1715f798a9e99f09aa1eda0b33b52fe5bccf58044aa7a87c06e1c" Feb 19 16:07:45 crc kubenswrapper[4810]: I0219 16:07:45.725501 4810 scope.go:117] "RemoveContainer" containerID="d7bd439d8bc0d4dd1a6b8ad202c10491abd329175439774b81dd46530d917993" Feb 19 16:07:45 crc kubenswrapper[4810]: E0219 16:07:45.725975 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7bd439d8bc0d4dd1a6b8ad202c10491abd329175439774b81dd46530d917993\": container with ID starting with d7bd439d8bc0d4dd1a6b8ad202c10491abd329175439774b81dd46530d917993 not found: ID does not exist" containerID="d7bd439d8bc0d4dd1a6b8ad202c10491abd329175439774b81dd46530d917993" Feb 19 16:07:45 crc kubenswrapper[4810]: I0219 16:07:45.726026 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7bd439d8bc0d4dd1a6b8ad202c10491abd329175439774b81dd46530d917993"} err="failed to get container status \"d7bd439d8bc0d4dd1a6b8ad202c10491abd329175439774b81dd46530d917993\": rpc error: code = NotFound desc = could not find container \"d7bd439d8bc0d4dd1a6b8ad202c10491abd329175439774b81dd46530d917993\": container with ID starting with d7bd439d8bc0d4dd1a6b8ad202c10491abd329175439774b81dd46530d917993 not found: ID does not exist" Feb 19 16:07:45 crc kubenswrapper[4810]: I0219 16:07:45.726058 4810 scope.go:117] "RemoveContainer" containerID="05cc9a216a60a1d1ff50d8970ba8fe384e3037ab972f04c4b82084af60c6196b" Feb 19 16:07:45 crc kubenswrapper[4810]: E0219 16:07:45.726549 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05cc9a216a60a1d1ff50d8970ba8fe384e3037ab972f04c4b82084af60c6196b\": container with ID starting with 05cc9a216a60a1d1ff50d8970ba8fe384e3037ab972f04c4b82084af60c6196b not found: ID does not exist" containerID="05cc9a216a60a1d1ff50d8970ba8fe384e3037ab972f04c4b82084af60c6196b" Feb 19 16:07:45 crc kubenswrapper[4810]: I0219 16:07:45.726581 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05cc9a216a60a1d1ff50d8970ba8fe384e3037ab972f04c4b82084af60c6196b"} err="failed to get container status \"05cc9a216a60a1d1ff50d8970ba8fe384e3037ab972f04c4b82084af60c6196b\": rpc error: code = NotFound desc = could not find container \"05cc9a216a60a1d1ff50d8970ba8fe384e3037ab972f04c4b82084af60c6196b\": container with ID starting with 05cc9a216a60a1d1ff50d8970ba8fe384e3037ab972f04c4b82084af60c6196b not found: ID does not exist" Feb 19 16:07:45 crc kubenswrapper[4810]: I0219 16:07:45.726602 4810 scope.go:117] "RemoveContainer" containerID="b46f8abcb3e1715f798a9e99f09aa1eda0b33b52fe5bccf58044aa7a87c06e1c" Feb 19 16:07:45 crc kubenswrapper[4810]: E0219 16:07:45.726885 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b46f8abcb3e1715f798a9e99f09aa1eda0b33b52fe5bccf58044aa7a87c06e1c\": container with ID starting with b46f8abcb3e1715f798a9e99f09aa1eda0b33b52fe5bccf58044aa7a87c06e1c not found: ID does not exist" 
containerID="b46f8abcb3e1715f798a9e99f09aa1eda0b33b52fe5bccf58044aa7a87c06e1c" Feb 19 16:07:45 crc kubenswrapper[4810]: I0219 16:07:45.726913 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b46f8abcb3e1715f798a9e99f09aa1eda0b33b52fe5bccf58044aa7a87c06e1c"} err="failed to get container status \"b46f8abcb3e1715f798a9e99f09aa1eda0b33b52fe5bccf58044aa7a87c06e1c\": rpc error: code = NotFound desc = could not find container \"b46f8abcb3e1715f798a9e99f09aa1eda0b33b52fe5bccf58044aa7a87c06e1c\": container with ID starting with b46f8abcb3e1715f798a9e99f09aa1eda0b33b52fe5bccf58044aa7a87c06e1c not found: ID does not exist" Feb 19 16:07:52 crc kubenswrapper[4810]: I0219 16:07:52.790695 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jljh4" Feb 19 16:07:52 crc kubenswrapper[4810]: I0219 16:07:52.791493 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jljh4" Feb 19 16:07:53 crc kubenswrapper[4810]: I0219 16:07:53.866688 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-jljh4" podUID="c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd" containerName="registry-server" probeResult="failure" output=< Feb 19 16:07:53 crc kubenswrapper[4810]: timeout: failed to connect service ":50051" within 1s Feb 19 16:07:53 crc kubenswrapper[4810]: > Feb 19 16:07:56 crc kubenswrapper[4810]: I0219 16:07:56.404234 4810 generic.go:334] "Generic (PLEG): container finished" podID="577659f8-9dbb-46d0-b2cb-80951550957f" containerID="b8577964a2dfe54d0a59674fdaf01d96e87490efbc3e6f7eab9d40ccbe5b4994" exitCode=0 Feb 19 16:07:56 crc kubenswrapper[4810]: I0219 16:07:56.405043 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64j86" event={"ID":"577659f8-9dbb-46d0-b2cb-80951550957f","Type":"ContainerDied","Data":"b8577964a2dfe54d0a59674fdaf01d96e87490efbc3e6f7eab9d40ccbe5b4994"} Feb 19 16:07:58 crc kubenswrapper[4810]: I0219 16:07:58.429417 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64j86" event={"ID":"577659f8-9dbb-46d0-b2cb-80951550957f","Type":"ContainerStarted","Data":"129527e774d59e3246c9d516c24ebaf0a21d8e8e0c1747f09b62f3640f4d8e2d"} Feb 19 16:07:58 crc kubenswrapper[4810]: I0219 16:07:58.454880 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-64j86" podStartSLOduration=4.307370601 podStartE2EDuration="26.454858511s" podCreationTimestamp="2026-02-19 16:07:32 +0000 UTC" firstStartedPulling="2026-02-19 16:07:35.107802831 +0000 UTC m=+3484.589832965" lastFinishedPulling="2026-02-19 16:07:57.255290741 +0000 UTC m=+3506.737320875" observedRunningTime="2026-02-19 16:07:58.452542083 +0000 UTC m=+3507.934572207" watchObservedRunningTime="2026-02-19 16:07:58.454858511 +0000 UTC m=+3507.936888655" Feb 19 16:08:02 crc kubenswrapper[4810]: I0219 16:08:02.850902 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jljh4" Feb 19 16:08:02 crc kubenswrapper[4810]: I0219 16:08:02.928431 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jljh4" Feb 19 16:08:03 crc kubenswrapper[4810]: I0219 16:08:03.283745 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-64j86" Feb 19 16:08:03 crc kubenswrapper[4810]: I0219 16:08:03.284732 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-64j86" Feb 19 16:08:03 crc kubenswrapper[4810]: I0219 16:08:03.665608 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jljh4"] Feb 19 16:08:04 crc kubenswrapper[4810]: I0219 16:08:04.337520 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-64j86" podUID="577659f8-9dbb-46d0-b2cb-80951550957f" containerName="registry-server" probeResult="failure" output=< Feb 19 16:08:04 crc kubenswrapper[4810]: timeout: failed to connect service ":50051" within 1s Feb 19 16:08:04 crc kubenswrapper[4810]: > Feb 19 16:08:04 crc kubenswrapper[4810]: I0219 16:08:04.494619 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jljh4" podUID="c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd" containerName="registry-server" containerID="cri-o://a1b65af471e174bbe1242694aadff9dbcea59298cf527d53fbdffc0d91120e42" gracePeriod=2 Feb 19 16:08:05 crc kubenswrapper[4810]: I0219 16:08:05.478978 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jljh4" Feb 19 16:08:05 crc kubenswrapper[4810]: I0219 16:08:05.505779 4810 generic.go:334] "Generic (PLEG): container finished" podID="c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd" containerID="a1b65af471e174bbe1242694aadff9dbcea59298cf527d53fbdffc0d91120e42" exitCode=0 Feb 19 16:08:05 crc kubenswrapper[4810]: I0219 16:08:05.505828 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jljh4" event={"ID":"c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd","Type":"ContainerDied","Data":"a1b65af471e174bbe1242694aadff9dbcea59298cf527d53fbdffc0d91120e42"} Feb 19 16:08:05 crc kubenswrapper[4810]: I0219 16:08:05.505856 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jljh4" event={"ID":"c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd","Type":"ContainerDied","Data":"3f2a36ddbb2854ba8d1bc8b3161a45126a4855b2d5a76816b0ab55f69aa7490f"} Feb 19 16:08:05 crc kubenswrapper[4810]: I0219 16:08:05.505875 4810 scope.go:117] "RemoveContainer" containerID="a1b65af471e174bbe1242694aadff9dbcea59298cf527d53fbdffc0d91120e42" Feb 19 16:08:05 crc kubenswrapper[4810]: I0219 16:08:05.506020 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jljh4" Feb 19 16:08:05 crc kubenswrapper[4810]: I0219 16:08:05.574295 4810 scope.go:117] "RemoveContainer" containerID="0436f9add9060712a8eadd4b9553ee23df2b08b125e5848fa1234b303b4f8c12" Feb 19 16:08:05 crc kubenswrapper[4810]: I0219 16:08:05.651979 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd-catalog-content\") pod \"c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd\" (UID: \"c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd\") " Feb 19 16:08:05 crc kubenswrapper[4810]: I0219 16:08:05.652417 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jss82\" (UniqueName: \"kubernetes.io/projected/c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd-kube-api-access-jss82\") pod \"c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd\" (UID: \"c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd\") " Feb 19 16:08:05 crc kubenswrapper[4810]: I0219 16:08:05.652502 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd-utilities\") pod \"c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd\" (UID: \"c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd\") " Feb 19 16:08:05 crc kubenswrapper[4810]: I0219 16:08:05.654119 4810 scope.go:117] "RemoveContainer" containerID="6fb2edb1a24f496d9ddd2804e3d51a61772791f54decbda879e1317be0fa6cc6" Feb 19 16:08:05 crc kubenswrapper[4810]: I0219 16:08:05.654238 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd-utilities" (OuterVolumeSpecName: "utilities") pod "c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd" (UID: "c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:08:05 crc kubenswrapper[4810]: I0219 16:08:05.698520 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd-kube-api-access-jss82" (OuterVolumeSpecName: "kube-api-access-jss82") pod "c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd" (UID: "c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd"). InnerVolumeSpecName "kube-api-access-jss82". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:08:05 crc kubenswrapper[4810]: I0219 16:08:05.755056 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 16:08:05 crc kubenswrapper[4810]: I0219 16:08:05.755090 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jss82\" (UniqueName: \"kubernetes.io/projected/c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd-kube-api-access-jss82\") on node \"crc\" DevicePath \"\"" Feb 19 16:08:05 crc kubenswrapper[4810]: I0219 16:08:05.761366 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd" (UID: "c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:08:05 crc kubenswrapper[4810]: I0219 16:08:05.768759 4810 scope.go:117] "RemoveContainer" containerID="a1b65af471e174bbe1242694aadff9dbcea59298cf527d53fbdffc0d91120e42" Feb 19 16:08:05 crc kubenswrapper[4810]: E0219 16:08:05.769468 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1b65af471e174bbe1242694aadff9dbcea59298cf527d53fbdffc0d91120e42\": container with ID starting with a1b65af471e174bbe1242694aadff9dbcea59298cf527d53fbdffc0d91120e42 not found: ID does not exist" containerID="a1b65af471e174bbe1242694aadff9dbcea59298cf527d53fbdffc0d91120e42" Feb 19 16:08:05 crc kubenswrapper[4810]: I0219 16:08:05.769527 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1b65af471e174bbe1242694aadff9dbcea59298cf527d53fbdffc0d91120e42"} err="failed to get container status \"a1b65af471e174bbe1242694aadff9dbcea59298cf527d53fbdffc0d91120e42\": rpc error: code = NotFound desc = could not find container \"a1b65af471e174bbe1242694aadff9dbcea59298cf527d53fbdffc0d91120e42\": container with ID starting with a1b65af471e174bbe1242694aadff9dbcea59298cf527d53fbdffc0d91120e42 not found: ID does not exist" Feb 19 16:08:05 crc kubenswrapper[4810]: I0219 16:08:05.769556 4810 scope.go:117] "RemoveContainer" containerID="0436f9add9060712a8eadd4b9553ee23df2b08b125e5848fa1234b303b4f8c12" Feb 19 16:08:05 crc kubenswrapper[4810]: E0219 16:08:05.769865 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0436f9add9060712a8eadd4b9553ee23df2b08b125e5848fa1234b303b4f8c12\": container with ID starting with 0436f9add9060712a8eadd4b9553ee23df2b08b125e5848fa1234b303b4f8c12 not found: ID does not exist" containerID="0436f9add9060712a8eadd4b9553ee23df2b08b125e5848fa1234b303b4f8c12" Feb 19 16:08:05 crc kubenswrapper[4810]: I0219 16:08:05.769910 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0436f9add9060712a8eadd4b9553ee23df2b08b125e5848fa1234b303b4f8c12"} err="failed to get container status \"0436f9add9060712a8eadd4b9553ee23df2b08b125e5848fa1234b303b4f8c12\": rpc error: code = NotFound desc = could not find container \"0436f9add9060712a8eadd4b9553ee23df2b08b125e5848fa1234b303b4f8c12\": container with ID starting with 0436f9add9060712a8eadd4b9553ee23df2b08b125e5848fa1234b303b4f8c12 not found: ID does not exist" Feb 19 16:08:05 crc kubenswrapper[4810]: I0219 16:08:05.769938 4810 scope.go:117] "RemoveContainer" containerID="6fb2edb1a24f496d9ddd2804e3d51a61772791f54decbda879e1317be0fa6cc6" Feb 19 16:08:05 crc kubenswrapper[4810]: E0219 16:08:05.770259 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fb2edb1a24f496d9ddd2804e3d51a61772791f54decbda879e1317be0fa6cc6\": container with ID starting with 6fb2edb1a24f496d9ddd2804e3d51a61772791f54decbda879e1317be0fa6cc6 not found: ID does not exist" containerID="6fb2edb1a24f496d9ddd2804e3d51a61772791f54decbda879e1317be0fa6cc6" Feb 19 16:08:05 crc kubenswrapper[4810]: I0219 16:08:05.770290 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fb2edb1a24f496d9ddd2804e3d51a61772791f54decbda879e1317be0fa6cc6"} err="failed to get container status \"6fb2edb1a24f496d9ddd2804e3d51a61772791f54decbda879e1317be0fa6cc6\": rpc error: code = NotFound desc = could not 
find container \"6fb2edb1a24f496d9ddd2804e3d51a61772791f54decbda879e1317be0fa6cc6\": container with ID starting with 6fb2edb1a24f496d9ddd2804e3d51a61772791f54decbda879e1317be0fa6cc6 not found: ID does not exist" Feb 19 16:08:05 crc kubenswrapper[4810]: I0219 16:08:05.840528 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jljh4"] Feb 19 16:08:05 crc kubenswrapper[4810]: I0219 16:08:05.850784 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jljh4"] Feb 19 16:08:05 crc kubenswrapper[4810]: I0219 16:08:05.857046 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 16:08:07 crc kubenswrapper[4810]: I0219 16:08:07.456123 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd" path="/var/lib/kubelet/pods/c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd/volumes" Feb 19 16:08:14 crc kubenswrapper[4810]: I0219 16:08:14.350358 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-64j86" podUID="577659f8-9dbb-46d0-b2cb-80951550957f" containerName="registry-server" probeResult="failure" output=< Feb 19 16:08:14 crc kubenswrapper[4810]: timeout: failed to connect service ":50051" within 1s Feb 19 16:08:14 crc kubenswrapper[4810]: > Feb 19 16:08:23 crc kubenswrapper[4810]: I0219 16:08:23.368763 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-64j86" Feb 19 16:08:23 crc kubenswrapper[4810]: I0219 16:08:23.411945 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-64j86" Feb 19 16:08:26 crc kubenswrapper[4810]: I0219 16:08:26.013042 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-64j86"] Feb 19 16:08:26 crc kubenswrapper[4810]: I0219 16:08:26.013757 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-64j86" podUID="577659f8-9dbb-46d0-b2cb-80951550957f" containerName="registry-server" containerID="cri-o://129527e774d59e3246c9d516c24ebaf0a21d8e8e0c1747f09b62f3640f4d8e2d" gracePeriod=2 Feb 19 16:08:26 crc kubenswrapper[4810]: I0219 16:08:26.516449 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-64j86" Feb 19 16:08:26 crc kubenswrapper[4810]: I0219 16:08:26.544302 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/577659f8-9dbb-46d0-b2cb-80951550957f-catalog-content\") pod \"577659f8-9dbb-46d0-b2cb-80951550957f\" (UID: \"577659f8-9dbb-46d0-b2cb-80951550957f\") " Feb 19 16:08:26 crc kubenswrapper[4810]: I0219 16:08:26.544747 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w7q7\" (UniqueName: \"kubernetes.io/projected/577659f8-9dbb-46d0-b2cb-80951550957f-kube-api-access-4w7q7\") pod \"577659f8-9dbb-46d0-b2cb-80951550957f\" (UID: \"577659f8-9dbb-46d0-b2cb-80951550957f\") " Feb 19 16:08:26 crc kubenswrapper[4810]: I0219 16:08:26.544784 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/577659f8-9dbb-46d0-b2cb-80951550957f-utilities\") pod \"577659f8-9dbb-46d0-b2cb-80951550957f\" (UID: \"577659f8-9dbb-46d0-b2cb-80951550957f\") " Feb 19 16:08:26 crc kubenswrapper[4810]: I0219 16:08:26.545693 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/577659f8-9dbb-46d0-b2cb-80951550957f-utilities" (OuterVolumeSpecName: "utilities") pod "577659f8-9dbb-46d0-b2cb-80951550957f" (UID: "577659f8-9dbb-46d0-b2cb-80951550957f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:08:26 crc kubenswrapper[4810]: I0219 16:08:26.566955 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/577659f8-9dbb-46d0-b2cb-80951550957f-kube-api-access-4w7q7" (OuterVolumeSpecName: "kube-api-access-4w7q7") pod "577659f8-9dbb-46d0-b2cb-80951550957f" (UID: "577659f8-9dbb-46d0-b2cb-80951550957f"). InnerVolumeSpecName "kube-api-access-4w7q7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:08:26 crc kubenswrapper[4810]: I0219 16:08:26.651283 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w7q7\" (UniqueName: \"kubernetes.io/projected/577659f8-9dbb-46d0-b2cb-80951550957f-kube-api-access-4w7q7\") on node \"crc\" DevicePath \"\"" Feb 19 16:08:26 crc kubenswrapper[4810]: I0219 16:08:26.651343 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/577659f8-9dbb-46d0-b2cb-80951550957f-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 16:08:26 crc kubenswrapper[4810]: I0219 16:08:26.691825 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/577659f8-9dbb-46d0-b2cb-80951550957f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "577659f8-9dbb-46d0-b2cb-80951550957f" (UID: "577659f8-9dbb-46d0-b2cb-80951550957f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:08:26 crc kubenswrapper[4810]: I0219 16:08:26.752081 4810 generic.go:334] "Generic (PLEG): container finished" podID="577659f8-9dbb-46d0-b2cb-80951550957f" containerID="129527e774d59e3246c9d516c24ebaf0a21d8e8e0c1747f09b62f3640f4d8e2d" exitCode=0 Feb 19 16:08:26 crc kubenswrapper[4810]: I0219 16:08:26.752159 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64j86" event={"ID":"577659f8-9dbb-46d0-b2cb-80951550957f","Type":"ContainerDied","Data":"129527e774d59e3246c9d516c24ebaf0a21d8e8e0c1747f09b62f3640f4d8e2d"} Feb 19 16:08:26 crc kubenswrapper[4810]: I0219 16:08:26.752205 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-64j86" Feb 19 16:08:26 crc kubenswrapper[4810]: I0219 16:08:26.752243 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64j86" event={"ID":"577659f8-9dbb-46d0-b2cb-80951550957f","Type":"ContainerDied","Data":"96a616bff386690da5f589c057f363366e36ad19ad36fdff70cbdf3bb4704f10"} Feb 19 16:08:26 crc kubenswrapper[4810]: I0219 16:08:26.752278 4810 scope.go:117] "RemoveContainer" containerID="129527e774d59e3246c9d516c24ebaf0a21d8e8e0c1747f09b62f3640f4d8e2d" Feb 19 16:08:26 crc kubenswrapper[4810]: I0219 16:08:26.753552 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/577659f8-9dbb-46d0-b2cb-80951550957f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 16:08:26 crc kubenswrapper[4810]: I0219 16:08:26.794016 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-64j86"] Feb 19 16:08:26 crc kubenswrapper[4810]: I0219 16:08:26.797731 4810 scope.go:117] "RemoveContainer" containerID="b8577964a2dfe54d0a59674fdaf01d96e87490efbc3e6f7eab9d40ccbe5b4994" Feb 19 16:08:26 crc kubenswrapper[4810]: I0219 16:08:26.803852 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-64j86"] Feb 19 16:08:26 crc kubenswrapper[4810]: I0219 16:08:26.831829 4810 scope.go:117] "RemoveContainer" containerID="0a997d370511dea75f129a7681651a204854d63f35705e130fcf606397cfec6f" Feb 19 16:08:26 crc kubenswrapper[4810]: I0219 16:08:26.909833 4810 scope.go:117] "RemoveContainer" containerID="129527e774d59e3246c9d516c24ebaf0a21d8e8e0c1747f09b62f3640f4d8e2d" Feb 19 16:08:26 crc kubenswrapper[4810]: E0219 16:08:26.910356 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"129527e774d59e3246c9d516c24ebaf0a21d8e8e0c1747f09b62f3640f4d8e2d\": container with ID starting with 129527e774d59e3246c9d516c24ebaf0a21d8e8e0c1747f09b62f3640f4d8e2d not found: ID does not exist" containerID="129527e774d59e3246c9d516c24ebaf0a21d8e8e0c1747f09b62f3640f4d8e2d" Feb 19 16:08:26 crc kubenswrapper[4810]: I0219 16:08:26.910395 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"129527e774d59e3246c9d516c24ebaf0a21d8e8e0c1747f09b62f3640f4d8e2d"} err="failed to get container status \"129527e774d59e3246c9d516c24ebaf0a21d8e8e0c1747f09b62f3640f4d8e2d\": rpc error: code = NotFound desc = could not find container \"129527e774d59e3246c9d516c24ebaf0a21d8e8e0c1747f09b62f3640f4d8e2d\": container with ID starting with 129527e774d59e3246c9d516c24ebaf0a21d8e8e0c1747f09b62f3640f4d8e2d not found: ID does not exist" Feb 19 16:08:26 crc 
kubenswrapper[4810]: I0219 16:08:26.910425 4810 scope.go:117] "RemoveContainer" containerID="b8577964a2dfe54d0a59674fdaf01d96e87490efbc3e6f7eab9d40ccbe5b4994" Feb 19 16:08:26 crc kubenswrapper[4810]: E0219 16:08:26.911012 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8577964a2dfe54d0a59674fdaf01d96e87490efbc3e6f7eab9d40ccbe5b4994\": container with ID starting with b8577964a2dfe54d0a59674fdaf01d96e87490efbc3e6f7eab9d40ccbe5b4994 not found: ID does not exist" containerID="b8577964a2dfe54d0a59674fdaf01d96e87490efbc3e6f7eab9d40ccbe5b4994" Feb 19 16:08:26 crc kubenswrapper[4810]: I0219 16:08:26.911085 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8577964a2dfe54d0a59674fdaf01d96e87490efbc3e6f7eab9d40ccbe5b4994"} err="failed to get container status \"b8577964a2dfe54d0a59674fdaf01d96e87490efbc3e6f7eab9d40ccbe5b4994\": rpc error: code = NotFound desc = could not find container \"b8577964a2dfe54d0a59674fdaf01d96e87490efbc3e6f7eab9d40ccbe5b4994\": container with ID starting with b8577964a2dfe54d0a59674fdaf01d96e87490efbc3e6f7eab9d40ccbe5b4994 not found: ID does not exist" Feb 19 16:08:26 crc kubenswrapper[4810]: I0219 16:08:26.911122 4810 scope.go:117] "RemoveContainer" containerID="0a997d370511dea75f129a7681651a204854d63f35705e130fcf606397cfec6f" Feb 19 16:08:26 crc kubenswrapper[4810]: E0219 16:08:26.911721 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a997d370511dea75f129a7681651a204854d63f35705e130fcf606397cfec6f\": container with ID starting with 0a997d370511dea75f129a7681651a204854d63f35705e130fcf606397cfec6f not found: ID does not exist" containerID="0a997d370511dea75f129a7681651a204854d63f35705e130fcf606397cfec6f" Feb 19 16:08:26 crc kubenswrapper[4810]: I0219 16:08:26.911773 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a997d370511dea75f129a7681651a204854d63f35705e130fcf606397cfec6f"} err="failed to get container status \"0a997d370511dea75f129a7681651a204854d63f35705e130fcf606397cfec6f\": rpc error: code = NotFound desc = could not find container \"0a997d370511dea75f129a7681651a204854d63f35705e130fcf606397cfec6f\": container with ID starting with 0a997d370511dea75f129a7681651a204854d63f35705e130fcf606397cfec6f not found: ID does not exist" Feb 19 16:08:27 crc kubenswrapper[4810]: I0219 16:08:27.460755 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="577659f8-9dbb-46d0-b2cb-80951550957f" path="/var/lib/kubelet/pods/577659f8-9dbb-46d0-b2cb-80951550957f/volumes" Feb 19 16:09:19 crc kubenswrapper[4810]: I0219 16:09:19.537226 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 16:09:19 crc kubenswrapper[4810]: I0219 16:09:19.537936 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 16:09:49 crc kubenswrapper[4810]: I0219 16:09:49.537570 4810 patch_prober.go:28] interesting 
pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 16:09:49 crc kubenswrapper[4810]: I0219 16:09:49.538186 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 16:10:19 crc kubenswrapper[4810]: I0219 16:10:19.537435 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 16:10:19 crc kubenswrapper[4810]: I0219 16:10:19.538047 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 16:10:19 crc kubenswrapper[4810]: I0219 16:10:19.538105 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 16:10:19 crc kubenswrapper[4810]: I0219 16:10:19.539077 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8520d3639f4b062a60a4e8d48690a9ddf9c8f0411d765a4be05ac76b8b0faa8e"} pod="openshift-machine-config-operator/machine-config-daemon-t499d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 16:10:19 crc kubenswrapper[4810]: I0219 16:10:19.539173 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" containerID="cri-o://8520d3639f4b062a60a4e8d48690a9ddf9c8f0411d765a4be05ac76b8b0faa8e" gracePeriod=600 Feb 19 16:10:19 crc kubenswrapper[4810]: E0219 16:10:19.674192 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:10:20 crc kubenswrapper[4810]: I0219 16:10:20.304234 4810 generic.go:334] "Generic (PLEG): container finished" podID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerID="8520d3639f4b062a60a4e8d48690a9ddf9c8f0411d765a4be05ac76b8b0faa8e" exitCode=0 Feb 19 16:10:20 crc kubenswrapper[4810]: I0219 16:10:20.304309 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerDied","Data":"8520d3639f4b062a60a4e8d48690a9ddf9c8f0411d765a4be05ac76b8b0faa8e"} Feb 19 16:10:20 crc kubenswrapper[4810]: 
Feb 19 16:10:20 crc kubenswrapper[4810]: I0219 16:10:20.304608 4810 scope.go:117] "RemoveContainer" containerID="2578d4284c44e38a7496c83ee59e7a00d386a9b2aebd20063f610b39b9a8d15a"
Feb 19 16:10:20 crc kubenswrapper[4810]: I0219 16:10:20.305268 4810 scope.go:117] "RemoveContainer" containerID="8520d3639f4b062a60a4e8d48690a9ddf9c8f0411d765a4be05ac76b8b0faa8e"
Feb 19 16:10:20 crc kubenswrapper[4810]: E0219 16:10:20.305543 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
Feb 19 16:10:34 crc kubenswrapper[4810]: I0219 16:10:34.440469 4810 scope.go:117] "RemoveContainer" containerID="8520d3639f4b062a60a4e8d48690a9ddf9c8f0411d765a4be05ac76b8b0faa8e"
Feb 19 16:10:34 crc kubenswrapper[4810]: E0219 16:10:34.441252 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
[the preceding "RemoveContainer"/"Error syncing pod, skipping" pair repeats with identical content, timestamps aside, at 16:10:48, 16:11:02, 16:11:16, 16:11:29, 16:11:44, 16:11:55, 16:12:09, 16:12:24, 16:12:36, 16:12:47, 16:12:58, 16:13:11, 16:13:26, 16:13:40, 16:13:54, 16:14:09, 16:14:23, and 16:14:36]
Feb 19 16:14:49 crc kubenswrapper[4810]: I0219 16:14:49.439443 4810 scope.go:117] "RemoveContainer" containerID="8520d3639f4b062a60a4e8d48690a9ddf9c8f0411d765a4be05ac76b8b0faa8e"
Feb 19 16:14:49 crc kubenswrapper[4810]: E0219 16:14:49.440031 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
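The identical pairs above are the kubelet re-queueing the pod on each sync while the crash-loop back-off window is open: every StartContainer attempt is rejected with the quoted "back-off 5m0s" until the window expires, and the container is in fact only restarted at 16:15:30-16:15:31 further down, roughly five minutes after the 16:10:19 kill. A sketch of the doubling-with-cap schedule that produces the 5m0s figure; the 5m cap is quoted from the log message, while the 10s base and per-crash doubling are commonly documented kubelet defaults assumed here rather than read from this log:

    package main

    import (
        "fmt"
        "time"
    )

    // Sketch of the doubling back-off behind "back-off 5m0s restarting
    // failed container". Assumed defaults: 10s base, doubled per crash,
    // capped at 5 minutes (the cap is the figure quoted in the log).
    func main() {
        const base = 10 * time.Second
        const maxBackoff = 5 * time.Minute

        delay := base
        for crash := 1; crash <= 7; crash++ {
            fmt.Printf("after crash %d: wait %v before restarting\n", crash, delay)
            delay *= 2
            if delay > maxBackoff {
                delay = maxBackoff
            }
        }
        // The schedule reaches the 5m0s cap by the sixth crash, which is
        // consistent with the roughly five-minute gap between the
        // 16:10:19 kill and the 16:15:30 restart recorded below.
    }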
pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:15:00 crc kubenswrapper[4810]: I0219 16:15:00.222582 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525295-b9xtq"] Feb 19 16:15:00 crc kubenswrapper[4810]: E0219 16:15:00.223864 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd" containerName="registry-server" Feb 19 16:15:00 crc kubenswrapper[4810]: I0219 16:15:00.223879 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd" containerName="registry-server" Feb 19 16:15:00 crc kubenswrapper[4810]: E0219 16:15:00.223909 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="577659f8-9dbb-46d0-b2cb-80951550957f" containerName="extract-utilities" Feb 19 16:15:00 crc kubenswrapper[4810]: I0219 16:15:00.223915 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="577659f8-9dbb-46d0-b2cb-80951550957f" containerName="extract-utilities" Feb 19 16:15:00 crc kubenswrapper[4810]: E0219 16:15:00.223930 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6de6280-d32b-4c86-b93b-4c0a06ee631a" containerName="extract-content" Feb 19 16:15:00 crc kubenswrapper[4810]: I0219 16:15:00.223936 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6de6280-d32b-4c86-b93b-4c0a06ee631a" containerName="extract-content" Feb 19 16:15:00 crc kubenswrapper[4810]: E0219 16:15:00.223949 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd" containerName="extract-content" Feb 19 16:15:00 crc kubenswrapper[4810]: I0219 16:15:00.223955 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd" containerName="extract-content" Feb 19 16:15:00 crc kubenswrapper[4810]: E0219 16:15:00.223973 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6de6280-d32b-4c86-b93b-4c0a06ee631a" containerName="extract-utilities" Feb 19 16:15:00 crc kubenswrapper[4810]: I0219 16:15:00.223981 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6de6280-d32b-4c86-b93b-4c0a06ee631a" containerName="extract-utilities" Feb 19 16:15:00 crc kubenswrapper[4810]: E0219 16:15:00.224008 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="577659f8-9dbb-46d0-b2cb-80951550957f" containerName="extract-content" Feb 19 16:15:00 crc kubenswrapper[4810]: I0219 16:15:00.224014 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="577659f8-9dbb-46d0-b2cb-80951550957f" containerName="extract-content" Feb 19 16:15:00 crc kubenswrapper[4810]: E0219 16:15:00.224026 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="577659f8-9dbb-46d0-b2cb-80951550957f" containerName="registry-server" Feb 19 16:15:00 crc kubenswrapper[4810]: I0219 16:15:00.224032 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="577659f8-9dbb-46d0-b2cb-80951550957f" containerName="registry-server" Feb 19 16:15:00 crc kubenswrapper[4810]: E0219 16:15:00.224048 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd" containerName="extract-utilities" Feb 19 16:15:00 crc kubenswrapper[4810]: I0219 16:15:00.224054 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd" containerName="extract-utilities" Feb 19 16:15:00 crc kubenswrapper[4810]: E0219 
16:15:00.224078 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6de6280-d32b-4c86-b93b-4c0a06ee631a" containerName="registry-server" Feb 19 16:15:00 crc kubenswrapper[4810]: I0219 16:15:00.224084 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6de6280-d32b-4c86-b93b-4c0a06ee631a" containerName="registry-server" Feb 19 16:15:00 crc kubenswrapper[4810]: I0219 16:15:00.224446 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3d8a679-ba7e-40bb-a22e-a1ae751fe6bd" containerName="registry-server" Feb 19 16:15:00 crc kubenswrapper[4810]: I0219 16:15:00.224467 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="577659f8-9dbb-46d0-b2cb-80951550957f" containerName="registry-server" Feb 19 16:15:00 crc kubenswrapper[4810]: I0219 16:15:00.224491 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6de6280-d32b-4c86-b93b-4c0a06ee631a" containerName="registry-server" Feb 19 16:15:00 crc kubenswrapper[4810]: I0219 16:15:00.225396 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525295-b9xtq" Feb 19 16:15:00 crc kubenswrapper[4810]: I0219 16:15:00.228425 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 16:15:00 crc kubenswrapper[4810]: I0219 16:15:00.230932 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 16:15:00 crc kubenswrapper[4810]: I0219 16:15:00.234966 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525295-b9xtq"] Feb 19 16:15:00 crc kubenswrapper[4810]: I0219 16:15:00.278812 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99fb536c-bd62-47d3-87d6-9f56d3e51f72-config-volume\") pod \"collect-profiles-29525295-b9xtq\" (UID: \"99fb536c-bd62-47d3-87d6-9f56d3e51f72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525295-b9xtq" Feb 19 16:15:00 crc kubenswrapper[4810]: I0219 16:15:00.279149 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j28zk\" (UniqueName: \"kubernetes.io/projected/99fb536c-bd62-47d3-87d6-9f56d3e51f72-kube-api-access-j28zk\") pod \"collect-profiles-29525295-b9xtq\" (UID: \"99fb536c-bd62-47d3-87d6-9f56d3e51f72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525295-b9xtq" Feb 19 16:15:00 crc kubenswrapper[4810]: I0219 16:15:00.279220 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/99fb536c-bd62-47d3-87d6-9f56d3e51f72-secret-volume\") pod \"collect-profiles-29525295-b9xtq\" (UID: \"99fb536c-bd62-47d3-87d6-9f56d3e51f72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525295-b9xtq" Feb 19 16:15:00 crc kubenswrapper[4810]: I0219 16:15:00.380971 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j28zk\" (UniqueName: \"kubernetes.io/projected/99fb536c-bd62-47d3-87d6-9f56d3e51f72-kube-api-access-j28zk\") pod \"collect-profiles-29525295-b9xtq\" (UID: \"99fb536c-bd62-47d3-87d6-9f56d3e51f72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525295-b9xtq" Feb 19 16:15:00 crc 
kubenswrapper[4810]: I0219 16:15:00.381378 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/99fb536c-bd62-47d3-87d6-9f56d3e51f72-secret-volume\") pod \"collect-profiles-29525295-b9xtq\" (UID: \"99fb536c-bd62-47d3-87d6-9f56d3e51f72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525295-b9xtq"
Feb 19 16:15:00 crc kubenswrapper[4810]: I0219 16:15:00.381469 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99fb536c-bd62-47d3-87d6-9f56d3e51f72-config-volume\") pod \"collect-profiles-29525295-b9xtq\" (UID: \"99fb536c-bd62-47d3-87d6-9f56d3e51f72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525295-b9xtq"
Feb 19 16:15:00 crc kubenswrapper[4810]: I0219 16:15:00.383314 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99fb536c-bd62-47d3-87d6-9f56d3e51f72-config-volume\") pod \"collect-profiles-29525295-b9xtq\" (UID: \"99fb536c-bd62-47d3-87d6-9f56d3e51f72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525295-b9xtq"
Feb 19 16:15:00 crc kubenswrapper[4810]: I0219 16:15:00.387654 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/99fb536c-bd62-47d3-87d6-9f56d3e51f72-secret-volume\") pod \"collect-profiles-29525295-b9xtq\" (UID: \"99fb536c-bd62-47d3-87d6-9f56d3e51f72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525295-b9xtq"
Feb 19 16:15:00 crc kubenswrapper[4810]: I0219 16:15:00.402026 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j28zk\" (UniqueName: \"kubernetes.io/projected/99fb536c-bd62-47d3-87d6-9f56d3e51f72-kube-api-access-j28zk\") pod \"collect-profiles-29525295-b9xtq\" (UID: \"99fb536c-bd62-47d3-87d6-9f56d3e51f72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525295-b9xtq"
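The reconciler_common.go and operation_generator.go entries above are the kubelet volume manager's reconcile loop at work: for each volume in the pod's desired state (config-volume from a ConfigMap, secret-volume from a Secret, and the projected kube-api-access token), it verifies the volume is attached, starts a MountVolume operation, and records SetUp succeeded once the volume reaches the actual state. A toy model of that desired-vs-actual pattern; the volume names are taken from the log, and the loop itself is illustrative, not kubelet source:

    package main

    import "fmt"

    // Toy reconciler in the style of the volume-manager output above:
    // any volume in the desired set but not yet in the actual (mounted)
    // set gets a mount operation, after which it joins the actual set.
    func main() {
        desired := []string{"config-volume", "secret-volume", "kube-api-access-j28zk"}
        mounted := map[string]bool{} // actual state: nothing mounted yet

        for _, v := range desired {
            if !mounted[v] {
                fmt.Printf("operationExecutor.MountVolume started for volume %q\n", v)
                mounted[v] = true // the mount took effect
                fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", v)
            }
        }
    }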
Feb 19 16:15:00 crc kubenswrapper[4810]: I0219 16:15:00.564802 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525295-b9xtq"
Feb 19 16:15:01 crc kubenswrapper[4810]: I0219 16:15:01.071290 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525295-b9xtq"]
Feb 19 16:15:01 crc kubenswrapper[4810]: I0219 16:15:01.274814 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525295-b9xtq" event={"ID":"99fb536c-bd62-47d3-87d6-9f56d3e51f72","Type":"ContainerStarted","Data":"932479730c0a6c125151920519504a29cae36b5a433b12d52524833608c85c05"}
Feb 19 16:15:01 crc kubenswrapper[4810]: I0219 16:15:01.276101 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525295-b9xtq" event={"ID":"99fb536c-bd62-47d3-87d6-9f56d3e51f72","Type":"ContainerStarted","Data":"c36e6a7677ac2c7f43e439596cd29f482f6c5361556fbcc3e64fe5eceac4059c"}
Feb 19 16:15:01 crc kubenswrapper[4810]: I0219 16:15:01.294537 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525295-b9xtq" podStartSLOduration=1.294522117 podStartE2EDuration="1.294522117s" podCreationTimestamp="2026-02-19 16:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 16:15:01.291753948 +0000 UTC m=+3930.773784072" watchObservedRunningTime="2026-02-19 16:15:01.294522117 +0000 UTC m=+3930.776552241"
Feb 19 16:15:02 crc kubenswrapper[4810]: I0219 16:15:02.287128 4810 generic.go:334] "Generic (PLEG): container finished" podID="99fb536c-bd62-47d3-87d6-9f56d3e51f72" containerID="932479730c0a6c125151920519504a29cae36b5a433b12d52524833608c85c05" exitCode=0
Feb 19 16:15:02 crc kubenswrapper[4810]: I0219 16:15:02.287174 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525295-b9xtq" event={"ID":"99fb536c-bd62-47d3-87d6-9f56d3e51f72","Type":"ContainerDied","Data":"932479730c0a6c125151920519504a29cae36b5a433b12d52524833608c85c05"}
Feb 19 16:15:03 crc kubenswrapper[4810]: I0219 16:15:03.754991 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525295-b9xtq" Feb 19 16:15:03 crc kubenswrapper[4810]: I0219 16:15:03.853014 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/99fb536c-bd62-47d3-87d6-9f56d3e51f72-secret-volume\") pod \"99fb536c-bd62-47d3-87d6-9f56d3e51f72\" (UID: \"99fb536c-bd62-47d3-87d6-9f56d3e51f72\") " Feb 19 16:15:03 crc kubenswrapper[4810]: I0219 16:15:03.853508 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99fb536c-bd62-47d3-87d6-9f56d3e51f72-config-volume\") pod \"99fb536c-bd62-47d3-87d6-9f56d3e51f72\" (UID: \"99fb536c-bd62-47d3-87d6-9f56d3e51f72\") " Feb 19 16:15:03 crc kubenswrapper[4810]: I0219 16:15:03.853563 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j28zk\" (UniqueName: \"kubernetes.io/projected/99fb536c-bd62-47d3-87d6-9f56d3e51f72-kube-api-access-j28zk\") pod \"99fb536c-bd62-47d3-87d6-9f56d3e51f72\" (UID: \"99fb536c-bd62-47d3-87d6-9f56d3e51f72\") " Feb 19 16:15:03 crc kubenswrapper[4810]: I0219 16:15:03.854361 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99fb536c-bd62-47d3-87d6-9f56d3e51f72-config-volume" (OuterVolumeSpecName: "config-volume") pod "99fb536c-bd62-47d3-87d6-9f56d3e51f72" (UID: "99fb536c-bd62-47d3-87d6-9f56d3e51f72"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 16:15:03 crc kubenswrapper[4810]: I0219 16:15:03.872665 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99fb536c-bd62-47d3-87d6-9f56d3e51f72-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "99fb536c-bd62-47d3-87d6-9f56d3e51f72" (UID: "99fb536c-bd62-47d3-87d6-9f56d3e51f72"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 16:15:03 crc kubenswrapper[4810]: I0219 16:15:03.882028 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99fb536c-bd62-47d3-87d6-9f56d3e51f72-kube-api-access-j28zk" (OuterVolumeSpecName: "kube-api-access-j28zk") pod "99fb536c-bd62-47d3-87d6-9f56d3e51f72" (UID: "99fb536c-bd62-47d3-87d6-9f56d3e51f72"). InnerVolumeSpecName "kube-api-access-j28zk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:15:03 crc kubenswrapper[4810]: I0219 16:15:03.954701 4810 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/99fb536c-bd62-47d3-87d6-9f56d3e51f72-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 16:15:03 crc kubenswrapper[4810]: I0219 16:15:03.954723 4810 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99fb536c-bd62-47d3-87d6-9f56d3e51f72-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 16:15:03 crc kubenswrapper[4810]: I0219 16:15:03.954732 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j28zk\" (UniqueName: \"kubernetes.io/projected/99fb536c-bd62-47d3-87d6-9f56d3e51f72-kube-api-access-j28zk\") on node \"crc\" DevicePath \"\"" Feb 19 16:15:04 crc kubenswrapper[4810]: I0219 16:15:04.307209 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525295-b9xtq" event={"ID":"99fb536c-bd62-47d3-87d6-9f56d3e51f72","Type":"ContainerDied","Data":"c36e6a7677ac2c7f43e439596cd29f482f6c5361556fbcc3e64fe5eceac4059c"} Feb 19 16:15:04 crc kubenswrapper[4810]: I0219 16:15:04.307258 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c36e6a7677ac2c7f43e439596cd29f482f6c5361556fbcc3e64fe5eceac4059c" Feb 19 16:15:04 crc kubenswrapper[4810]: I0219 16:15:04.307274 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525295-b9xtq" Feb 19 16:15:04 crc kubenswrapper[4810]: I0219 16:15:04.375521 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525250-rls78"] Feb 19 16:15:04 crc kubenswrapper[4810]: I0219 16:15:04.384237 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525250-rls78"] Feb 19 16:15:04 crc kubenswrapper[4810]: I0219 16:15:04.440070 4810 scope.go:117] "RemoveContainer" containerID="8520d3639f4b062a60a4e8d48690a9ddf9c8f0411d765a4be05ac76b8b0faa8e" Feb 19 16:15:04 crc kubenswrapper[4810]: E0219 16:15:04.440393 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:15:05 crc kubenswrapper[4810]: I0219 16:15:05.461514 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40500a46-a16b-4282-86e4-1d99277d7c7a" path="/var/lib/kubelet/pods/40500a46-a16b-4282-86e4-1d99277d7c7a/volumes" Feb 19 16:15:16 crc kubenswrapper[4810]: I0219 16:15:16.159830 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-78bc5d479f-k79xx" podUID="9190a865-226b-487c-b0f9-2573f50f0eab" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Feb 19 16:15:19 crc kubenswrapper[4810]: I0219 16:15:19.439951 4810 scope.go:117] "RemoveContainer" containerID="8520d3639f4b062a60a4e8d48690a9ddf9c8f0411d765a4be05ac76b8b0faa8e" Feb 19 16:15:19 crc kubenswrapper[4810]: E0219 16:15:19.440731 4810 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:15:30 crc kubenswrapper[4810]: I0219 16:15:30.440382 4810 scope.go:117] "RemoveContainer" containerID="8520d3639f4b062a60a4e8d48690a9ddf9c8f0411d765a4be05ac76b8b0faa8e" Feb 19 16:15:31 crc kubenswrapper[4810]: I0219 16:15:31.612456 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerStarted","Data":"dd8031bab4ace844a0a95e2a4ed00eb2a49420bcc364c28753fbe8146e7a2fd5"} Feb 19 16:15:43 crc kubenswrapper[4810]: I0219 16:15:43.900654 4810 scope.go:117] "RemoveContainer" containerID="b50c5c7b6b7301c6c8992f22517d1d5b4a8f3065b75aef28f8faaa61ad94fd27" Feb 19 16:17:07 crc kubenswrapper[4810]: I0219 16:17:07.902046 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kzwcn"] Feb 19 16:17:07 crc kubenswrapper[4810]: E0219 16:17:07.903080 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99fb536c-bd62-47d3-87d6-9f56d3e51f72" containerName="collect-profiles" Feb 19 16:17:07 crc kubenswrapper[4810]: I0219 16:17:07.903096 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="99fb536c-bd62-47d3-87d6-9f56d3e51f72" containerName="collect-profiles" Feb 19 16:17:07 crc kubenswrapper[4810]: I0219 16:17:07.903334 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="99fb536c-bd62-47d3-87d6-9f56d3e51f72" containerName="collect-profiles" Feb 19 16:17:07 crc kubenswrapper[4810]: I0219 16:17:07.905068 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kzwcn" Feb 19 16:17:08 crc kubenswrapper[4810]: I0219 16:17:08.002152 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kzwcn"] Feb 19 16:17:08 crc kubenswrapper[4810]: I0219 16:17:08.022424 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xbc9\" (UniqueName: \"kubernetes.io/projected/29a395e7-cdab-40db-ae2c-1d746a31aeec-kube-api-access-4xbc9\") pod \"community-operators-kzwcn\" (UID: \"29a395e7-cdab-40db-ae2c-1d746a31aeec\") " pod="openshift-marketplace/community-operators-kzwcn" Feb 19 16:17:08 crc kubenswrapper[4810]: I0219 16:17:08.022503 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29a395e7-cdab-40db-ae2c-1d746a31aeec-utilities\") pod \"community-operators-kzwcn\" (UID: \"29a395e7-cdab-40db-ae2c-1d746a31aeec\") " pod="openshift-marketplace/community-operators-kzwcn" Feb 19 16:17:08 crc kubenswrapper[4810]: I0219 16:17:08.022603 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29a395e7-cdab-40db-ae2c-1d746a31aeec-catalog-content\") pod \"community-operators-kzwcn\" (UID: \"29a395e7-cdab-40db-ae2c-1d746a31aeec\") " pod="openshift-marketplace/community-operators-kzwcn" Feb 19 16:17:08 crc kubenswrapper[4810]: I0219 16:17:08.124686 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29a395e7-cdab-40db-ae2c-1d746a31aeec-utilities\") pod \"community-operators-kzwcn\" (UID: \"29a395e7-cdab-40db-ae2c-1d746a31aeec\") " pod="openshift-marketplace/community-operators-kzwcn" Feb 19 16:17:08 crc kubenswrapper[4810]: I0219 16:17:08.124886 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29a395e7-cdab-40db-ae2c-1d746a31aeec-catalog-content\") pod \"community-operators-kzwcn\" (UID: \"29a395e7-cdab-40db-ae2c-1d746a31aeec\") " pod="openshift-marketplace/community-operators-kzwcn" Feb 19 16:17:08 crc kubenswrapper[4810]: I0219 16:17:08.124983 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xbc9\" (UniqueName: \"kubernetes.io/projected/29a395e7-cdab-40db-ae2c-1d746a31aeec-kube-api-access-4xbc9\") pod \"community-operators-kzwcn\" (UID: \"29a395e7-cdab-40db-ae2c-1d746a31aeec\") " pod="openshift-marketplace/community-operators-kzwcn" Feb 19 16:17:08 crc kubenswrapper[4810]: I0219 16:17:08.125163 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29a395e7-cdab-40db-ae2c-1d746a31aeec-utilities\") pod \"community-operators-kzwcn\" (UID: \"29a395e7-cdab-40db-ae2c-1d746a31aeec\") " pod="openshift-marketplace/community-operators-kzwcn" Feb 19 16:17:08 crc kubenswrapper[4810]: I0219 16:17:08.125424 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29a395e7-cdab-40db-ae2c-1d746a31aeec-catalog-content\") pod \"community-operators-kzwcn\" (UID: \"29a395e7-cdab-40db-ae2c-1d746a31aeec\") " pod="openshift-marketplace/community-operators-kzwcn" Feb 19 16:17:08 crc kubenswrapper[4810]: I0219 16:17:08.156588 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4xbc9\" (UniqueName: \"kubernetes.io/projected/29a395e7-cdab-40db-ae2c-1d746a31aeec-kube-api-access-4xbc9\") pod \"community-operators-kzwcn\" (UID: \"29a395e7-cdab-40db-ae2c-1d746a31aeec\") " pod="openshift-marketplace/community-operators-kzwcn" Feb 19 16:17:08 crc kubenswrapper[4810]: I0219 16:17:08.228111 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kzwcn" Feb 19 16:17:08 crc kubenswrapper[4810]: I0219 16:17:08.762003 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kzwcn"] Feb 19 16:17:08 crc kubenswrapper[4810]: I0219 16:17:08.822970 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzwcn" event={"ID":"29a395e7-cdab-40db-ae2c-1d746a31aeec","Type":"ContainerStarted","Data":"d2ce1af4e07d6583d631ee1c7d3aaab708cde8706d5712cba248062381b3897d"} Feb 19 16:17:09 crc kubenswrapper[4810]: I0219 16:17:09.836290 4810 generic.go:334] "Generic (PLEG): container finished" podID="29a395e7-cdab-40db-ae2c-1d746a31aeec" containerID="b829f1abe881ccf1d9a4b3ef3b15225ee5ee8a7301ae1750f3751e75f85d7cb7" exitCode=0 Feb 19 16:17:09 crc kubenswrapper[4810]: I0219 16:17:09.836564 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzwcn" event={"ID":"29a395e7-cdab-40db-ae2c-1d746a31aeec","Type":"ContainerDied","Data":"b829f1abe881ccf1d9a4b3ef3b15225ee5ee8a7301ae1750f3751e75f85d7cb7"} Feb 19 16:17:09 crc kubenswrapper[4810]: I0219 16:17:09.838860 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 16:17:10 crc kubenswrapper[4810]: I0219 16:17:10.856261 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzwcn" event={"ID":"29a395e7-cdab-40db-ae2c-1d746a31aeec","Type":"ContainerStarted","Data":"83153e32fc105bc211e810ffb7fbd7a3511dd3b91cf6cb2f1882279bdb7321fe"} Feb 19 16:17:11 crc kubenswrapper[4810]: I0219 16:17:11.870727 4810 generic.go:334] "Generic (PLEG): container finished" podID="29a395e7-cdab-40db-ae2c-1d746a31aeec" containerID="83153e32fc105bc211e810ffb7fbd7a3511dd3b91cf6cb2f1882279bdb7321fe" exitCode=0 Feb 19 16:17:11 crc kubenswrapper[4810]: I0219 16:17:11.870773 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzwcn" event={"ID":"29a395e7-cdab-40db-ae2c-1d746a31aeec","Type":"ContainerDied","Data":"83153e32fc105bc211e810ffb7fbd7a3511dd3b91cf6cb2f1882279bdb7321fe"} Feb 19 16:17:12 crc kubenswrapper[4810]: I0219 16:17:12.883113 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzwcn" event={"ID":"29a395e7-cdab-40db-ae2c-1d746a31aeec","Type":"ContainerStarted","Data":"d0e3429da6f9749d75a96ff2598af52b5193e6866e807dc84768878ca6720e4a"} Feb 19 16:17:18 crc kubenswrapper[4810]: I0219 16:17:18.228837 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kzwcn" Feb 19 16:17:18 crc kubenswrapper[4810]: I0219 16:17:18.229460 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kzwcn" Feb 19 16:17:18 crc kubenswrapper[4810]: I0219 16:17:18.301218 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kzwcn" Feb 19 16:17:18 crc 
kubenswrapper[4810]: I0219 16:17:18.331261 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kzwcn" podStartSLOduration=8.856918612 podStartE2EDuration="11.331236552s" podCreationTimestamp="2026-02-19 16:17:07 +0000 UTC" firstStartedPulling="2026-02-19 16:17:09.838070034 +0000 UTC m=+4059.320100198" lastFinishedPulling="2026-02-19 16:17:12.312388004 +0000 UTC m=+4061.794418138" observedRunningTime="2026-02-19 16:17:12.911233613 +0000 UTC m=+4062.393263737" watchObservedRunningTime="2026-02-19 16:17:18.331236552 +0000 UTC m=+4067.813266706" Feb 19 16:17:19 crc kubenswrapper[4810]: I0219 16:17:19.018993 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kzwcn" Feb 19 16:17:19 crc kubenswrapper[4810]: I0219 16:17:19.092497 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kzwcn"] Feb 19 16:17:20 crc kubenswrapper[4810]: I0219 16:17:20.961186 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kzwcn" podUID="29a395e7-cdab-40db-ae2c-1d746a31aeec" containerName="registry-server" containerID="cri-o://d0e3429da6f9749d75a96ff2598af52b5193e6866e807dc84768878ca6720e4a" gracePeriod=2 Feb 19 16:17:21 crc kubenswrapper[4810]: I0219 16:17:21.526379 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kzwcn" Feb 19 16:17:21 crc kubenswrapper[4810]: I0219 16:17:21.665694 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29a395e7-cdab-40db-ae2c-1d746a31aeec-utilities\") pod \"29a395e7-cdab-40db-ae2c-1d746a31aeec\" (UID: \"29a395e7-cdab-40db-ae2c-1d746a31aeec\") " Feb 19 16:17:21 crc kubenswrapper[4810]: I0219 16:17:21.666045 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xbc9\" (UniqueName: \"kubernetes.io/projected/29a395e7-cdab-40db-ae2c-1d746a31aeec-kube-api-access-4xbc9\") pod \"29a395e7-cdab-40db-ae2c-1d746a31aeec\" (UID: \"29a395e7-cdab-40db-ae2c-1d746a31aeec\") " Feb 19 16:17:21 crc kubenswrapper[4810]: I0219 16:17:21.666161 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29a395e7-cdab-40db-ae2c-1d746a31aeec-catalog-content\") pod \"29a395e7-cdab-40db-ae2c-1d746a31aeec\" (UID: \"29a395e7-cdab-40db-ae2c-1d746a31aeec\") " Feb 19 16:17:21 crc kubenswrapper[4810]: I0219 16:17:21.666594 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29a395e7-cdab-40db-ae2c-1d746a31aeec-utilities" (OuterVolumeSpecName: "utilities") pod "29a395e7-cdab-40db-ae2c-1d746a31aeec" (UID: "29a395e7-cdab-40db-ae2c-1d746a31aeec"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:17:21 crc kubenswrapper[4810]: I0219 16:17:21.667884 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29a395e7-cdab-40db-ae2c-1d746a31aeec-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 16:17:21 crc kubenswrapper[4810]: I0219 16:17:21.682578 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29a395e7-cdab-40db-ae2c-1d746a31aeec-kube-api-access-4xbc9" (OuterVolumeSpecName: "kube-api-access-4xbc9") pod "29a395e7-cdab-40db-ae2c-1d746a31aeec" (UID: "29a395e7-cdab-40db-ae2c-1d746a31aeec"). InnerVolumeSpecName "kube-api-access-4xbc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:17:21 crc kubenswrapper[4810]: I0219 16:17:21.716495 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29a395e7-cdab-40db-ae2c-1d746a31aeec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29a395e7-cdab-40db-ae2c-1d746a31aeec" (UID: "29a395e7-cdab-40db-ae2c-1d746a31aeec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:17:21 crc kubenswrapper[4810]: I0219 16:17:21.786519 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29a395e7-cdab-40db-ae2c-1d746a31aeec-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 16:17:21 crc kubenswrapper[4810]: I0219 16:17:21.786557 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xbc9\" (UniqueName: \"kubernetes.io/projected/29a395e7-cdab-40db-ae2c-1d746a31aeec-kube-api-access-4xbc9\") on node \"crc\" DevicePath \"\"" Feb 19 16:17:21 crc kubenswrapper[4810]: I0219 16:17:21.979637 4810 generic.go:334] "Generic (PLEG): container finished" podID="29a395e7-cdab-40db-ae2c-1d746a31aeec" containerID="d0e3429da6f9749d75a96ff2598af52b5193e6866e807dc84768878ca6720e4a" exitCode=0 Feb 19 16:17:21 crc kubenswrapper[4810]: I0219 16:17:21.979705 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzwcn" event={"ID":"29a395e7-cdab-40db-ae2c-1d746a31aeec","Type":"ContainerDied","Data":"d0e3429da6f9749d75a96ff2598af52b5193e6866e807dc84768878ca6720e4a"} Feb 19 16:17:21 crc kubenswrapper[4810]: I0219 16:17:21.979730 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kzwcn" Feb 19 16:17:21 crc kubenswrapper[4810]: I0219 16:17:21.979819 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzwcn" event={"ID":"29a395e7-cdab-40db-ae2c-1d746a31aeec","Type":"ContainerDied","Data":"d2ce1af4e07d6583d631ee1c7d3aaab708cde8706d5712cba248062381b3897d"} Feb 19 16:17:21 crc kubenswrapper[4810]: I0219 16:17:21.979856 4810 scope.go:117] "RemoveContainer" containerID="d0e3429da6f9749d75a96ff2598af52b5193e6866e807dc84768878ca6720e4a" Feb 19 16:17:22 crc kubenswrapper[4810]: I0219 16:17:22.036889 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kzwcn"] Feb 19 16:17:22 crc kubenswrapper[4810]: I0219 16:17:22.049561 4810 scope.go:117] "RemoveContainer" containerID="83153e32fc105bc211e810ffb7fbd7a3511dd3b91cf6cb2f1882279bdb7321fe" Feb 19 16:17:22 crc kubenswrapper[4810]: I0219 16:17:22.049678 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kzwcn"] Feb 19 16:17:22 crc kubenswrapper[4810]: I0219 16:17:22.072681 4810 scope.go:117] "RemoveContainer" containerID="b829f1abe881ccf1d9a4b3ef3b15225ee5ee8a7301ae1750f3751e75f85d7cb7" Feb 19 16:17:22 crc kubenswrapper[4810]: I0219 16:17:22.129538 4810 scope.go:117] "RemoveContainer" containerID="d0e3429da6f9749d75a96ff2598af52b5193e6866e807dc84768878ca6720e4a" Feb 19 16:17:22 crc kubenswrapper[4810]: E0219 16:17:22.130103 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0e3429da6f9749d75a96ff2598af52b5193e6866e807dc84768878ca6720e4a\": container with ID starting with d0e3429da6f9749d75a96ff2598af52b5193e6866e807dc84768878ca6720e4a not found: ID does not exist" containerID="d0e3429da6f9749d75a96ff2598af52b5193e6866e807dc84768878ca6720e4a" Feb 19 16:17:22 crc kubenswrapper[4810]: I0219 16:17:22.130142 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0e3429da6f9749d75a96ff2598af52b5193e6866e807dc84768878ca6720e4a"} err="failed to get container status \"d0e3429da6f9749d75a96ff2598af52b5193e6866e807dc84768878ca6720e4a\": rpc error: code = NotFound desc = could not find container \"d0e3429da6f9749d75a96ff2598af52b5193e6866e807dc84768878ca6720e4a\": container with ID starting with d0e3429da6f9749d75a96ff2598af52b5193e6866e807dc84768878ca6720e4a not found: ID does not exist" Feb 19 16:17:22 crc kubenswrapper[4810]: I0219 16:17:22.130169 4810 scope.go:117] "RemoveContainer" containerID="83153e32fc105bc211e810ffb7fbd7a3511dd3b91cf6cb2f1882279bdb7321fe" Feb 19 16:17:22 crc kubenswrapper[4810]: E0219 16:17:22.131119 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83153e32fc105bc211e810ffb7fbd7a3511dd3b91cf6cb2f1882279bdb7321fe\": container with ID starting with 83153e32fc105bc211e810ffb7fbd7a3511dd3b91cf6cb2f1882279bdb7321fe not found: ID does not exist" containerID="83153e32fc105bc211e810ffb7fbd7a3511dd3b91cf6cb2f1882279bdb7321fe" Feb 19 16:17:22 crc kubenswrapper[4810]: I0219 16:17:22.131155 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83153e32fc105bc211e810ffb7fbd7a3511dd3b91cf6cb2f1882279bdb7321fe"} err="failed to get container status \"83153e32fc105bc211e810ffb7fbd7a3511dd3b91cf6cb2f1882279bdb7321fe\": rpc error: code = NotFound desc = could not find 
container \"83153e32fc105bc211e810ffb7fbd7a3511dd3b91cf6cb2f1882279bdb7321fe\": container with ID starting with 83153e32fc105bc211e810ffb7fbd7a3511dd3b91cf6cb2f1882279bdb7321fe not found: ID does not exist" Feb 19 16:17:22 crc kubenswrapper[4810]: I0219 16:17:22.131174 4810 scope.go:117] "RemoveContainer" containerID="b829f1abe881ccf1d9a4b3ef3b15225ee5ee8a7301ae1750f3751e75f85d7cb7" Feb 19 16:17:22 crc kubenswrapper[4810]: E0219 16:17:22.131996 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b829f1abe881ccf1d9a4b3ef3b15225ee5ee8a7301ae1750f3751e75f85d7cb7\": container with ID starting with b829f1abe881ccf1d9a4b3ef3b15225ee5ee8a7301ae1750f3751e75f85d7cb7 not found: ID does not exist" containerID="b829f1abe881ccf1d9a4b3ef3b15225ee5ee8a7301ae1750f3751e75f85d7cb7" Feb 19 16:17:22 crc kubenswrapper[4810]: I0219 16:17:22.132025 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b829f1abe881ccf1d9a4b3ef3b15225ee5ee8a7301ae1750f3751e75f85d7cb7"} err="failed to get container status \"b829f1abe881ccf1d9a4b3ef3b15225ee5ee8a7301ae1750f3751e75f85d7cb7\": rpc error: code = NotFound desc = could not find container \"b829f1abe881ccf1d9a4b3ef3b15225ee5ee8a7301ae1750f3751e75f85d7cb7\": container with ID starting with b829f1abe881ccf1d9a4b3ef3b15225ee5ee8a7301ae1750f3751e75f85d7cb7 not found: ID does not exist" Feb 19 16:17:23 crc kubenswrapper[4810]: I0219 16:17:23.457351 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29a395e7-cdab-40db-ae2c-1d746a31aeec" path="/var/lib/kubelet/pods/29a395e7-cdab-40db-ae2c-1d746a31aeec/volumes" Feb 19 16:17:49 crc kubenswrapper[4810]: I0219 16:17:49.537805 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 16:17:49 crc kubenswrapper[4810]: I0219 16:17:49.538529 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 16:18:02 crc kubenswrapper[4810]: I0219 16:18:02.206630 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9mm6p"] Feb 19 16:18:02 crc kubenswrapper[4810]: E0219 16:18:02.207677 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a395e7-cdab-40db-ae2c-1d746a31aeec" containerName="extract-content" Feb 19 16:18:02 crc kubenswrapper[4810]: I0219 16:18:02.207696 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a395e7-cdab-40db-ae2c-1d746a31aeec" containerName="extract-content" Feb 19 16:18:02 crc kubenswrapper[4810]: E0219 16:18:02.207722 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a395e7-cdab-40db-ae2c-1d746a31aeec" containerName="extract-utilities" Feb 19 16:18:02 crc kubenswrapper[4810]: I0219 16:18:02.207731 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a395e7-cdab-40db-ae2c-1d746a31aeec" containerName="extract-utilities" Feb 19 16:18:02 crc kubenswrapper[4810]: E0219 16:18:02.207759 4810 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="29a395e7-cdab-40db-ae2c-1d746a31aeec" containerName="registry-server" Feb 19 16:18:02 crc kubenswrapper[4810]: I0219 16:18:02.207768 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a395e7-cdab-40db-ae2c-1d746a31aeec" containerName="registry-server" Feb 19 16:18:02 crc kubenswrapper[4810]: I0219 16:18:02.207990 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="29a395e7-cdab-40db-ae2c-1d746a31aeec" containerName="registry-server" Feb 19 16:18:02 crc kubenswrapper[4810]: I0219 16:18:02.209632 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9mm6p" Feb 19 16:18:02 crc kubenswrapper[4810]: I0219 16:18:02.227380 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9mm6p"] Feb 19 16:18:03 crc kubenswrapper[4810]: I0219 16:18:02.933143 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a123ea72-e0fc-43b7-b1d9-af79a382d010-catalog-content\") pod \"redhat-marketplace-9mm6p\" (UID: \"a123ea72-e0fc-43b7-b1d9-af79a382d010\") " pod="openshift-marketplace/redhat-marketplace-9mm6p" Feb 19 16:18:03 crc kubenswrapper[4810]: I0219 16:18:02.933340 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95tfz\" (UniqueName: \"kubernetes.io/projected/a123ea72-e0fc-43b7-b1d9-af79a382d010-kube-api-access-95tfz\") pod \"redhat-marketplace-9mm6p\" (UID: \"a123ea72-e0fc-43b7-b1d9-af79a382d010\") " pod="openshift-marketplace/redhat-marketplace-9mm6p" Feb 19 16:18:03 crc kubenswrapper[4810]: I0219 16:18:02.933366 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a123ea72-e0fc-43b7-b1d9-af79a382d010-utilities\") pod \"redhat-marketplace-9mm6p\" (UID: \"a123ea72-e0fc-43b7-b1d9-af79a382d010\") " pod="openshift-marketplace/redhat-marketplace-9mm6p" Feb 19 16:18:03 crc kubenswrapper[4810]: I0219 16:18:03.092486 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95tfz\" (UniqueName: \"kubernetes.io/projected/a123ea72-e0fc-43b7-b1d9-af79a382d010-kube-api-access-95tfz\") pod \"redhat-marketplace-9mm6p\" (UID: \"a123ea72-e0fc-43b7-b1d9-af79a382d010\") " pod="openshift-marketplace/redhat-marketplace-9mm6p" Feb 19 16:18:03 crc kubenswrapper[4810]: I0219 16:18:03.092527 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a123ea72-e0fc-43b7-b1d9-af79a382d010-utilities\") pod \"redhat-marketplace-9mm6p\" (UID: \"a123ea72-e0fc-43b7-b1d9-af79a382d010\") " pod="openshift-marketplace/redhat-marketplace-9mm6p" Feb 19 16:18:03 crc kubenswrapper[4810]: I0219 16:18:03.092617 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a123ea72-e0fc-43b7-b1d9-af79a382d010-catalog-content\") pod \"redhat-marketplace-9mm6p\" (UID: \"a123ea72-e0fc-43b7-b1d9-af79a382d010\") " pod="openshift-marketplace/redhat-marketplace-9mm6p" Feb 19 16:18:03 crc kubenswrapper[4810]: I0219 16:18:03.093171 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a123ea72-e0fc-43b7-b1d9-af79a382d010-catalog-content\") pod \"redhat-marketplace-9mm6p\" (UID: 
\"a123ea72-e0fc-43b7-b1d9-af79a382d010\") " pod="openshift-marketplace/redhat-marketplace-9mm6p" Feb 19 16:18:03 crc kubenswrapper[4810]: I0219 16:18:03.093441 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a123ea72-e0fc-43b7-b1d9-af79a382d010-utilities\") pod \"redhat-marketplace-9mm6p\" (UID: \"a123ea72-e0fc-43b7-b1d9-af79a382d010\") " pod="openshift-marketplace/redhat-marketplace-9mm6p" Feb 19 16:18:03 crc kubenswrapper[4810]: I0219 16:18:03.136130 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95tfz\" (UniqueName: \"kubernetes.io/projected/a123ea72-e0fc-43b7-b1d9-af79a382d010-kube-api-access-95tfz\") pod \"redhat-marketplace-9mm6p\" (UID: \"a123ea72-e0fc-43b7-b1d9-af79a382d010\") " pod="openshift-marketplace/redhat-marketplace-9mm6p" Feb 19 16:18:03 crc kubenswrapper[4810]: I0219 16:18:03.428509 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9mm6p" Feb 19 16:18:03 crc kubenswrapper[4810]: I0219 16:18:03.927353 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9mm6p"] Feb 19 16:18:04 crc kubenswrapper[4810]: I0219 16:18:04.172307 4810 generic.go:334] "Generic (PLEG): container finished" podID="a123ea72-e0fc-43b7-b1d9-af79a382d010" containerID="727e747b80d7b6f50c3abcbcf2d1a7df66b394bb5d4bed9f6c637270e5269b28" exitCode=0 Feb 19 16:18:04 crc kubenswrapper[4810]: I0219 16:18:04.172381 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mm6p" event={"ID":"a123ea72-e0fc-43b7-b1d9-af79a382d010","Type":"ContainerDied","Data":"727e747b80d7b6f50c3abcbcf2d1a7df66b394bb5d4bed9f6c637270e5269b28"} Feb 19 16:18:04 crc kubenswrapper[4810]: I0219 16:18:04.172413 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mm6p" event={"ID":"a123ea72-e0fc-43b7-b1d9-af79a382d010","Type":"ContainerStarted","Data":"6d3216b043794cb744c98cc73cbb45b1ff615a9067efafa48c652be47cd88065"} Feb 19 16:18:06 crc kubenswrapper[4810]: I0219 16:18:06.197356 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mm6p" event={"ID":"a123ea72-e0fc-43b7-b1d9-af79a382d010","Type":"ContainerStarted","Data":"a0036126afa42137de8816cea32a8f172bdb2f49584406dc4b2b19e61cff06cc"} Feb 19 16:18:07 crc kubenswrapper[4810]: I0219 16:18:07.210980 4810 generic.go:334] "Generic (PLEG): container finished" podID="a123ea72-e0fc-43b7-b1d9-af79a382d010" containerID="a0036126afa42137de8816cea32a8f172bdb2f49584406dc4b2b19e61cff06cc" exitCode=0 Feb 19 16:18:07 crc kubenswrapper[4810]: I0219 16:18:07.211032 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mm6p" event={"ID":"a123ea72-e0fc-43b7-b1d9-af79a382d010","Type":"ContainerDied","Data":"a0036126afa42137de8816cea32a8f172bdb2f49584406dc4b2b19e61cff06cc"} Feb 19 16:18:07 crc kubenswrapper[4810]: I0219 16:18:07.778942 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bwbpv"] Feb 19 16:18:07 crc kubenswrapper[4810]: I0219 16:18:07.782344 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bwbpv" Feb 19 16:18:07 crc kubenswrapper[4810]: I0219 16:18:07.794786 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bwbpv"] Feb 19 16:18:07 crc kubenswrapper[4810]: I0219 16:18:07.894582 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx454\" (UniqueName: \"kubernetes.io/projected/05b0324a-36c1-419b-8bdd-e41ad42a6a3f-kube-api-access-bx454\") pod \"redhat-operators-bwbpv\" (UID: \"05b0324a-36c1-419b-8bdd-e41ad42a6a3f\") " pod="openshift-marketplace/redhat-operators-bwbpv" Feb 19 16:18:07 crc kubenswrapper[4810]: I0219 16:18:07.894715 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05b0324a-36c1-419b-8bdd-e41ad42a6a3f-utilities\") pod \"redhat-operators-bwbpv\" (UID: \"05b0324a-36c1-419b-8bdd-e41ad42a6a3f\") " pod="openshift-marketplace/redhat-operators-bwbpv" Feb 19 16:18:07 crc kubenswrapper[4810]: I0219 16:18:07.894964 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05b0324a-36c1-419b-8bdd-e41ad42a6a3f-catalog-content\") pod \"redhat-operators-bwbpv\" (UID: \"05b0324a-36c1-419b-8bdd-e41ad42a6a3f\") " pod="openshift-marketplace/redhat-operators-bwbpv" Feb 19 16:18:07 crc kubenswrapper[4810]: I0219 16:18:07.996646 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05b0324a-36c1-419b-8bdd-e41ad42a6a3f-catalog-content\") pod \"redhat-operators-bwbpv\" (UID: \"05b0324a-36c1-419b-8bdd-e41ad42a6a3f\") " pod="openshift-marketplace/redhat-operators-bwbpv" Feb 19 16:18:07 crc kubenswrapper[4810]: I0219 16:18:07.997007 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx454\" (UniqueName: \"kubernetes.io/projected/05b0324a-36c1-419b-8bdd-e41ad42a6a3f-kube-api-access-bx454\") pod \"redhat-operators-bwbpv\" (UID: \"05b0324a-36c1-419b-8bdd-e41ad42a6a3f\") " pod="openshift-marketplace/redhat-operators-bwbpv" Feb 19 16:18:07 crc kubenswrapper[4810]: I0219 16:18:07.997047 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05b0324a-36c1-419b-8bdd-e41ad42a6a3f-utilities\") pod \"redhat-operators-bwbpv\" (UID: \"05b0324a-36c1-419b-8bdd-e41ad42a6a3f\") " pod="openshift-marketplace/redhat-operators-bwbpv" Feb 19 16:18:07 crc kubenswrapper[4810]: I0219 16:18:07.997137 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05b0324a-36c1-419b-8bdd-e41ad42a6a3f-catalog-content\") pod \"redhat-operators-bwbpv\" (UID: \"05b0324a-36c1-419b-8bdd-e41ad42a6a3f\") " pod="openshift-marketplace/redhat-operators-bwbpv" Feb 19 16:18:07 crc kubenswrapper[4810]: I0219 16:18:07.997463 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05b0324a-36c1-419b-8bdd-e41ad42a6a3f-utilities\") pod \"redhat-operators-bwbpv\" (UID: \"05b0324a-36c1-419b-8bdd-e41ad42a6a3f\") " pod="openshift-marketplace/redhat-operators-bwbpv" Feb 19 16:18:08 crc kubenswrapper[4810]: I0219 16:18:08.015476 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-bx454\" (UniqueName: \"kubernetes.io/projected/05b0324a-36c1-419b-8bdd-e41ad42a6a3f-kube-api-access-bx454\") pod \"redhat-operators-bwbpv\" (UID: \"05b0324a-36c1-419b-8bdd-e41ad42a6a3f\") " pod="openshift-marketplace/redhat-operators-bwbpv" Feb 19 16:18:08 crc kubenswrapper[4810]: I0219 16:18:08.102253 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bwbpv" Feb 19 16:18:08 crc kubenswrapper[4810]: I0219 16:18:08.238607 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mm6p" event={"ID":"a123ea72-e0fc-43b7-b1d9-af79a382d010","Type":"ContainerStarted","Data":"069b1791957e989d646d3208ea272273d0d112531c277fc7259a952f9cba60cf"} Feb 19 16:18:08 crc kubenswrapper[4810]: I0219 16:18:08.284583 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9mm6p" podStartSLOduration=2.85304976 podStartE2EDuration="6.284562965s" podCreationTimestamp="2026-02-19 16:18:02 +0000 UTC" firstStartedPulling="2026-02-19 16:18:04.174535469 +0000 UTC m=+4113.656565603" lastFinishedPulling="2026-02-19 16:18:07.606048674 +0000 UTC m=+4117.088078808" observedRunningTime="2026-02-19 16:18:08.263105154 +0000 UTC m=+4117.745135308" watchObservedRunningTime="2026-02-19 16:18:08.284562965 +0000 UTC m=+4117.766593089" Feb 19 16:18:08 crc kubenswrapper[4810]: I0219 16:18:08.747659 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bwbpv"] Feb 19 16:18:09 crc kubenswrapper[4810]: I0219 16:18:09.247181 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bwbpv" event={"ID":"05b0324a-36c1-419b-8bdd-e41ad42a6a3f","Type":"ContainerStarted","Data":"93a8135c28be8635f5340efbe9657c055e768c0e32dd5af6f1b72560d483c3e0"} Feb 19 16:18:10 crc kubenswrapper[4810]: I0219 16:18:10.262693 4810 generic.go:334] "Generic (PLEG): container finished" podID="05b0324a-36c1-419b-8bdd-e41ad42a6a3f" containerID="9a684aad47e63c793748f0e63d3e8324617bba9ecfb36e6d1acb733f20e678a2" exitCode=0 Feb 19 16:18:10 crc kubenswrapper[4810]: I0219 16:18:10.262841 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bwbpv" event={"ID":"05b0324a-36c1-419b-8bdd-e41ad42a6a3f","Type":"ContainerDied","Data":"9a684aad47e63c793748f0e63d3e8324617bba9ecfb36e6d1acb733f20e678a2"} Feb 19 16:18:13 crc kubenswrapper[4810]: I0219 16:18:13.429627 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9mm6p" Feb 19 16:18:13 crc kubenswrapper[4810]: I0219 16:18:13.430107 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9mm6p" Feb 19 16:18:13 crc kubenswrapper[4810]: I0219 16:18:13.512265 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9mm6p" Feb 19 16:18:14 crc kubenswrapper[4810]: I0219 16:18:14.349114 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9mm6p" Feb 19 16:18:14 crc kubenswrapper[4810]: I0219 16:18:14.772877 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9mm6p"] Feb 19 16:18:16 crc kubenswrapper[4810]: I0219 16:18:16.326037 4810 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-9mm6p" podUID="a123ea72-e0fc-43b7-b1d9-af79a382d010" containerName="registry-server" containerID="cri-o://069b1791957e989d646d3208ea272273d0d112531c277fc7259a952f9cba60cf" gracePeriod=2 Feb 19 16:18:17 crc kubenswrapper[4810]: I0219 16:18:17.337909 4810 generic.go:334] "Generic (PLEG): container finished" podID="a123ea72-e0fc-43b7-b1d9-af79a382d010" containerID="069b1791957e989d646d3208ea272273d0d112531c277fc7259a952f9cba60cf" exitCode=0 Feb 19 16:18:17 crc kubenswrapper[4810]: I0219 16:18:17.337953 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mm6p" event={"ID":"a123ea72-e0fc-43b7-b1d9-af79a382d010","Type":"ContainerDied","Data":"069b1791957e989d646d3208ea272273d0d112531c277fc7259a952f9cba60cf"} Feb 19 16:18:19 crc kubenswrapper[4810]: I0219 16:18:19.537105 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 16:18:19 crc kubenswrapper[4810]: I0219 16:18:19.537359 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 16:18:19 crc kubenswrapper[4810]: I0219 16:18:19.771589 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9mm6p" Feb 19 16:18:19 crc kubenswrapper[4810]: I0219 16:18:19.844033 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a123ea72-e0fc-43b7-b1d9-af79a382d010-catalog-content\") pod \"a123ea72-e0fc-43b7-b1d9-af79a382d010\" (UID: \"a123ea72-e0fc-43b7-b1d9-af79a382d010\") " Feb 19 16:18:19 crc kubenswrapper[4810]: I0219 16:18:19.844143 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a123ea72-e0fc-43b7-b1d9-af79a382d010-utilities\") pod \"a123ea72-e0fc-43b7-b1d9-af79a382d010\" (UID: \"a123ea72-e0fc-43b7-b1d9-af79a382d010\") " Feb 19 16:18:19 crc kubenswrapper[4810]: I0219 16:18:19.844187 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95tfz\" (UniqueName: \"kubernetes.io/projected/a123ea72-e0fc-43b7-b1d9-af79a382d010-kube-api-access-95tfz\") pod \"a123ea72-e0fc-43b7-b1d9-af79a382d010\" (UID: \"a123ea72-e0fc-43b7-b1d9-af79a382d010\") " Feb 19 16:18:19 crc kubenswrapper[4810]: I0219 16:18:19.849961 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a123ea72-e0fc-43b7-b1d9-af79a382d010-utilities" (OuterVolumeSpecName: "utilities") pod "a123ea72-e0fc-43b7-b1d9-af79a382d010" (UID: "a123ea72-e0fc-43b7-b1d9-af79a382d010"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:18:19 crc kubenswrapper[4810]: I0219 16:18:19.851509 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a123ea72-e0fc-43b7-b1d9-af79a382d010-kube-api-access-95tfz" (OuterVolumeSpecName: "kube-api-access-95tfz") pod "a123ea72-e0fc-43b7-b1d9-af79a382d010" (UID: "a123ea72-e0fc-43b7-b1d9-af79a382d010"). InnerVolumeSpecName "kube-api-access-95tfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:18:19 crc kubenswrapper[4810]: I0219 16:18:19.875101 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a123ea72-e0fc-43b7-b1d9-af79a382d010-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a123ea72-e0fc-43b7-b1d9-af79a382d010" (UID: "a123ea72-e0fc-43b7-b1d9-af79a382d010"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:18:19 crc kubenswrapper[4810]: I0219 16:18:19.951403 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a123ea72-e0fc-43b7-b1d9-af79a382d010-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 16:18:19 crc kubenswrapper[4810]: I0219 16:18:19.951435 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a123ea72-e0fc-43b7-b1d9-af79a382d010-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 16:18:19 crc kubenswrapper[4810]: I0219 16:18:19.951445 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95tfz\" (UniqueName: \"kubernetes.io/projected/a123ea72-e0fc-43b7-b1d9-af79a382d010-kube-api-access-95tfz\") on node \"crc\" DevicePath \"\"" Feb 19 16:18:20 crc kubenswrapper[4810]: I0219 16:18:20.366672 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9mm6p" Feb 19 16:18:20 crc kubenswrapper[4810]: I0219 16:18:20.366670 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mm6p" event={"ID":"a123ea72-e0fc-43b7-b1d9-af79a382d010","Type":"ContainerDied","Data":"6d3216b043794cb744c98cc73cbb45b1ff615a9067efafa48c652be47cd88065"} Feb 19 16:18:20 crc kubenswrapper[4810]: I0219 16:18:20.366822 4810 scope.go:117] "RemoveContainer" containerID="069b1791957e989d646d3208ea272273d0d112531c277fc7259a952f9cba60cf" Feb 19 16:18:20 crc kubenswrapper[4810]: I0219 16:18:20.369386 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bwbpv" event={"ID":"05b0324a-36c1-419b-8bdd-e41ad42a6a3f","Type":"ContainerStarted","Data":"79902e166f5edf35fbc7fb13052676c944ef7dba0af06603968486e0cdfcd6fe"} Feb 19 16:18:20 crc kubenswrapper[4810]: I0219 16:18:20.417656 4810 scope.go:117] "RemoveContainer" containerID="a0036126afa42137de8816cea32a8f172bdb2f49584406dc4b2b19e61cff06cc" Feb 19 16:18:20 crc kubenswrapper[4810]: I0219 16:18:20.429439 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9mm6p"] Feb 19 16:18:20 crc kubenswrapper[4810]: I0219 16:18:20.443670 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9mm6p"] Feb 19 16:18:20 crc kubenswrapper[4810]: I0219 16:18:20.538876 4810 scope.go:117] "RemoveContainer" containerID="727e747b80d7b6f50c3abcbcf2d1a7df66b394bb5d4bed9f6c637270e5269b28" Feb 19 16:18:21 crc kubenswrapper[4810]: I0219 16:18:21.460942 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a123ea72-e0fc-43b7-b1d9-af79a382d010" path="/var/lib/kubelet/pods/a123ea72-e0fc-43b7-b1d9-af79a382d010/volumes" Feb 19 16:18:23 crc kubenswrapper[4810]: I0219 16:18:23.410189 4810 generic.go:334] "Generic (PLEG): container finished" podID="05b0324a-36c1-419b-8bdd-e41ad42a6a3f" containerID="79902e166f5edf35fbc7fb13052676c944ef7dba0af06603968486e0cdfcd6fe" exitCode=0 Feb 19 16:18:23 crc kubenswrapper[4810]: I0219 16:18:23.410353 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bwbpv" event={"ID":"05b0324a-36c1-419b-8bdd-e41ad42a6a3f","Type":"ContainerDied","Data":"79902e166f5edf35fbc7fb13052676c944ef7dba0af06603968486e0cdfcd6fe"} Feb 19 16:18:24 crc kubenswrapper[4810]: I0219 16:18:24.218073 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fsjqs"] Feb 19 16:18:24 crc kubenswrapper[4810]: E0219 16:18:24.219101 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a123ea72-e0fc-43b7-b1d9-af79a382d010" containerName="registry-server" Feb 19 16:18:24 crc kubenswrapper[4810]: I0219 16:18:24.219131 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a123ea72-e0fc-43b7-b1d9-af79a382d010" containerName="registry-server" Feb 19 16:18:24 crc kubenswrapper[4810]: E0219 16:18:24.219181 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a123ea72-e0fc-43b7-b1d9-af79a382d010" containerName="extract-utilities" Feb 19 16:18:24 crc kubenswrapper[4810]: I0219 16:18:24.219195 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a123ea72-e0fc-43b7-b1d9-af79a382d010" containerName="extract-utilities" Feb 19 16:18:24 crc kubenswrapper[4810]: E0219 16:18:24.219224 4810 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a123ea72-e0fc-43b7-b1d9-af79a382d010" containerName="extract-content" Feb 19 16:18:24 crc kubenswrapper[4810]: I0219 16:18:24.219237 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a123ea72-e0fc-43b7-b1d9-af79a382d010" containerName="extract-content" Feb 19 16:18:24 crc kubenswrapper[4810]: I0219 16:18:24.219895 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="a123ea72-e0fc-43b7-b1d9-af79a382d010" containerName="registry-server" Feb 19 16:18:24 crc kubenswrapper[4810]: I0219 16:18:24.223579 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fsjqs" Feb 19 16:18:24 crc kubenswrapper[4810]: I0219 16:18:24.239263 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fsjqs"] Feb 19 16:18:24 crc kubenswrapper[4810]: I0219 16:18:24.365588 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rskwc\" (UniqueName: \"kubernetes.io/projected/18dc9098-5a5e-438e-88d7-b611f88e7e56-kube-api-access-rskwc\") pod \"certified-operators-fsjqs\" (UID: \"18dc9098-5a5e-438e-88d7-b611f88e7e56\") " pod="openshift-marketplace/certified-operators-fsjqs" Feb 19 16:18:24 crc kubenswrapper[4810]: I0219 16:18:24.365867 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18dc9098-5a5e-438e-88d7-b611f88e7e56-catalog-content\") pod \"certified-operators-fsjqs\" (UID: \"18dc9098-5a5e-438e-88d7-b611f88e7e56\") " pod="openshift-marketplace/certified-operators-fsjqs" Feb 19 16:18:24 crc kubenswrapper[4810]: I0219 16:18:24.366054 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18dc9098-5a5e-438e-88d7-b611f88e7e56-utilities\") pod \"certified-operators-fsjqs\" (UID: \"18dc9098-5a5e-438e-88d7-b611f88e7e56\") " pod="openshift-marketplace/certified-operators-fsjqs" Feb 19 16:18:24 crc kubenswrapper[4810]: I0219 16:18:24.468607 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rskwc\" (UniqueName: \"kubernetes.io/projected/18dc9098-5a5e-438e-88d7-b611f88e7e56-kube-api-access-rskwc\") pod \"certified-operators-fsjqs\" (UID: \"18dc9098-5a5e-438e-88d7-b611f88e7e56\") " pod="openshift-marketplace/certified-operators-fsjqs" Feb 19 16:18:24 crc kubenswrapper[4810]: I0219 16:18:24.468818 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18dc9098-5a5e-438e-88d7-b611f88e7e56-catalog-content\") pod \"certified-operators-fsjqs\" (UID: \"18dc9098-5a5e-438e-88d7-b611f88e7e56\") " pod="openshift-marketplace/certified-operators-fsjqs" Feb 19 16:18:24 crc kubenswrapper[4810]: I0219 16:18:24.468926 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18dc9098-5a5e-438e-88d7-b611f88e7e56-utilities\") pod \"certified-operators-fsjqs\" (UID: \"18dc9098-5a5e-438e-88d7-b611f88e7e56\") " pod="openshift-marketplace/certified-operators-fsjqs" Feb 19 16:18:24 crc kubenswrapper[4810]: I0219 16:18:24.469406 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18dc9098-5a5e-438e-88d7-b611f88e7e56-utilities\") pod \"certified-operators-fsjqs\" 
(UID: \"18dc9098-5a5e-438e-88d7-b611f88e7e56\") " pod="openshift-marketplace/certified-operators-fsjqs" Feb 19 16:18:24 crc kubenswrapper[4810]: I0219 16:18:24.469406 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18dc9098-5a5e-438e-88d7-b611f88e7e56-catalog-content\") pod \"certified-operators-fsjqs\" (UID: \"18dc9098-5a5e-438e-88d7-b611f88e7e56\") " pod="openshift-marketplace/certified-operators-fsjqs" Feb 19 16:18:24 crc kubenswrapper[4810]: I0219 16:18:24.489480 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rskwc\" (UniqueName: \"kubernetes.io/projected/18dc9098-5a5e-438e-88d7-b611f88e7e56-kube-api-access-rskwc\") pod \"certified-operators-fsjqs\" (UID: \"18dc9098-5a5e-438e-88d7-b611f88e7e56\") " pod="openshift-marketplace/certified-operators-fsjqs" Feb 19 16:18:24 crc kubenswrapper[4810]: I0219 16:18:24.580370 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fsjqs" Feb 19 16:18:25 crc kubenswrapper[4810]: I0219 16:18:25.111470 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fsjqs"] Feb 19 16:18:25 crc kubenswrapper[4810]: I0219 16:18:25.430609 4810 generic.go:334] "Generic (PLEG): container finished" podID="18dc9098-5a5e-438e-88d7-b611f88e7e56" containerID="f8f03eba801535218b99db7ea105ed2a091125b7d3975eb406370048e19ab062" exitCode=0 Feb 19 16:18:25 crc kubenswrapper[4810]: I0219 16:18:25.430926 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fsjqs" event={"ID":"18dc9098-5a5e-438e-88d7-b611f88e7e56","Type":"ContainerDied","Data":"f8f03eba801535218b99db7ea105ed2a091125b7d3975eb406370048e19ab062"} Feb 19 16:18:25 crc kubenswrapper[4810]: I0219 16:18:25.430967 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fsjqs" event={"ID":"18dc9098-5a5e-438e-88d7-b611f88e7e56","Type":"ContainerStarted","Data":"543aee6faf2d0487412dcc9038ac404f9f42cb7c9ccaddf0001984891f549f8e"} Feb 19 16:18:25 crc kubenswrapper[4810]: I0219 16:18:25.454394 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bwbpv" event={"ID":"05b0324a-36c1-419b-8bdd-e41ad42a6a3f","Type":"ContainerStarted","Data":"89278896eb5981099add6badbd7a3e1af7af3548857a31a60c78e5beeb65a030"} Feb 19 16:18:25 crc kubenswrapper[4810]: I0219 16:18:25.477510 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bwbpv" podStartSLOduration=4.199355101 podStartE2EDuration="18.477493323s" podCreationTimestamp="2026-02-19 16:18:07 +0000 UTC" firstStartedPulling="2026-02-19 16:18:10.265773298 +0000 UTC m=+4119.747803462" lastFinishedPulling="2026-02-19 16:18:24.54391155 +0000 UTC m=+4134.025941684" observedRunningTime="2026-02-19 16:18:25.472582141 +0000 UTC m=+4134.954612275" watchObservedRunningTime="2026-02-19 16:18:25.477493323 +0000 UTC m=+4134.959523447" Feb 19 16:18:26 crc kubenswrapper[4810]: I0219 16:18:26.455457 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fsjqs" event={"ID":"18dc9098-5a5e-438e-88d7-b611f88e7e56","Type":"ContainerStarted","Data":"1cb29dd13e72da68f942f209eb6e0550ffc40a6639757fdf8af6c17ba48e7f94"} Feb 19 16:18:28 crc kubenswrapper[4810]: I0219 16:18:28.103249 4810 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bwbpv" Feb 19 16:18:28 crc kubenswrapper[4810]: I0219 16:18:28.103532 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bwbpv" Feb 19 16:18:28 crc kubenswrapper[4810]: I0219 16:18:28.476649 4810 generic.go:334] "Generic (PLEG): container finished" podID="18dc9098-5a5e-438e-88d7-b611f88e7e56" containerID="1cb29dd13e72da68f942f209eb6e0550ffc40a6639757fdf8af6c17ba48e7f94" exitCode=0 Feb 19 16:18:28 crc kubenswrapper[4810]: I0219 16:18:28.476699 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fsjqs" event={"ID":"18dc9098-5a5e-438e-88d7-b611f88e7e56","Type":"ContainerDied","Data":"1cb29dd13e72da68f942f209eb6e0550ffc40a6639757fdf8af6c17ba48e7f94"} Feb 19 16:18:29 crc kubenswrapper[4810]: I0219 16:18:29.159136 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bwbpv" podUID="05b0324a-36c1-419b-8bdd-e41ad42a6a3f" containerName="registry-server" probeResult="failure" output=< Feb 19 16:18:29 crc kubenswrapper[4810]: timeout: failed to connect service ":50051" within 1s Feb 19 16:18:29 crc kubenswrapper[4810]: > Feb 19 16:18:29 crc kubenswrapper[4810]: I0219 16:18:29.492037 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fsjqs" event={"ID":"18dc9098-5a5e-438e-88d7-b611f88e7e56","Type":"ContainerStarted","Data":"f5a6dd294de86b91df249d4c74319c874e8cb7930dab7621079b8f6cbbf62a7e"} Feb 19 16:18:29 crc kubenswrapper[4810]: I0219 16:18:29.517442 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fsjqs" podStartSLOduration=2.061526893 podStartE2EDuration="5.517419015s" podCreationTimestamp="2026-02-19 16:18:24 +0000 UTC" firstStartedPulling="2026-02-19 16:18:25.434771294 +0000 UTC m=+4134.916801428" lastFinishedPulling="2026-02-19 16:18:28.890663426 +0000 UTC m=+4138.372693550" observedRunningTime="2026-02-19 16:18:29.516637236 +0000 UTC m=+4138.998667400" watchObservedRunningTime="2026-02-19 16:18:29.517419015 +0000 UTC m=+4138.999449149" Feb 19 16:18:34 crc kubenswrapper[4810]: I0219 16:18:34.581351 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fsjqs" Feb 19 16:18:34 crc kubenswrapper[4810]: I0219 16:18:34.582057 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fsjqs" Feb 19 16:18:34 crc kubenswrapper[4810]: I0219 16:18:34.656568 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fsjqs" Feb 19 16:18:35 crc kubenswrapper[4810]: I0219 16:18:35.648083 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fsjqs" Feb 19 16:18:35 crc kubenswrapper[4810]: I0219 16:18:35.720548 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fsjqs"] Feb 19 16:18:37 crc kubenswrapper[4810]: I0219 16:18:37.587321 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fsjqs" podUID="18dc9098-5a5e-438e-88d7-b611f88e7e56" containerName="registry-server" containerID="cri-o://f5a6dd294de86b91df249d4c74319c874e8cb7930dab7621079b8f6cbbf62a7e" gracePeriod=2 Feb 19 16:18:38 crc 
kubenswrapper[4810]: I0219 16:18:38.143096 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fsjqs" Feb 19 16:18:38 crc kubenswrapper[4810]: I0219 16:18:38.190186 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bwbpv" Feb 19 16:18:38 crc kubenswrapper[4810]: I0219 16:18:38.211123 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18dc9098-5a5e-438e-88d7-b611f88e7e56-catalog-content\") pod \"18dc9098-5a5e-438e-88d7-b611f88e7e56\" (UID: \"18dc9098-5a5e-438e-88d7-b611f88e7e56\") " Feb 19 16:18:38 crc kubenswrapper[4810]: I0219 16:18:38.211240 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rskwc\" (UniqueName: \"kubernetes.io/projected/18dc9098-5a5e-438e-88d7-b611f88e7e56-kube-api-access-rskwc\") pod \"18dc9098-5a5e-438e-88d7-b611f88e7e56\" (UID: \"18dc9098-5a5e-438e-88d7-b611f88e7e56\") " Feb 19 16:18:38 crc kubenswrapper[4810]: I0219 16:18:38.211378 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18dc9098-5a5e-438e-88d7-b611f88e7e56-utilities\") pod \"18dc9098-5a5e-438e-88d7-b611f88e7e56\" (UID: \"18dc9098-5a5e-438e-88d7-b611f88e7e56\") " Feb 19 16:18:38 crc kubenswrapper[4810]: I0219 16:18:38.212177 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18dc9098-5a5e-438e-88d7-b611f88e7e56-utilities" (OuterVolumeSpecName: "utilities") pod "18dc9098-5a5e-438e-88d7-b611f88e7e56" (UID: "18dc9098-5a5e-438e-88d7-b611f88e7e56"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:18:38 crc kubenswrapper[4810]: I0219 16:18:38.220716 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18dc9098-5a5e-438e-88d7-b611f88e7e56-kube-api-access-rskwc" (OuterVolumeSpecName: "kube-api-access-rskwc") pod "18dc9098-5a5e-438e-88d7-b611f88e7e56" (UID: "18dc9098-5a5e-438e-88d7-b611f88e7e56"). InnerVolumeSpecName "kube-api-access-rskwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:18:38 crc kubenswrapper[4810]: I0219 16:18:38.243806 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bwbpv" Feb 19 16:18:38 crc kubenswrapper[4810]: I0219 16:18:38.277047 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18dc9098-5a5e-438e-88d7-b611f88e7e56-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18dc9098-5a5e-438e-88d7-b611f88e7e56" (UID: "18dc9098-5a5e-438e-88d7-b611f88e7e56"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:18:38 crc kubenswrapper[4810]: I0219 16:18:38.313436 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18dc9098-5a5e-438e-88d7-b611f88e7e56-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 16:18:38 crc kubenswrapper[4810]: I0219 16:18:38.313464 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18dc9098-5a5e-438e-88d7-b611f88e7e56-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 16:18:38 crc kubenswrapper[4810]: I0219 16:18:38.313475 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rskwc\" (UniqueName: \"kubernetes.io/projected/18dc9098-5a5e-438e-88d7-b611f88e7e56-kube-api-access-rskwc\") on node \"crc\" DevicePath \"\"" Feb 19 16:18:38 crc kubenswrapper[4810]: I0219 16:18:38.600073 4810 generic.go:334] "Generic (PLEG): container finished" podID="18dc9098-5a5e-438e-88d7-b611f88e7e56" containerID="f5a6dd294de86b91df249d4c74319c874e8cb7930dab7621079b8f6cbbf62a7e" exitCode=0 Feb 19 16:18:38 crc kubenswrapper[4810]: I0219 16:18:38.600169 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fsjqs" event={"ID":"18dc9098-5a5e-438e-88d7-b611f88e7e56","Type":"ContainerDied","Data":"f5a6dd294de86b91df249d4c74319c874e8cb7930dab7621079b8f6cbbf62a7e"} Feb 19 16:18:38 crc kubenswrapper[4810]: I0219 16:18:38.600403 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fsjqs" event={"ID":"18dc9098-5a5e-438e-88d7-b611f88e7e56","Type":"ContainerDied","Data":"543aee6faf2d0487412dcc9038ac404f9f42cb7c9ccaddf0001984891f549f8e"} Feb 19 16:18:38 crc kubenswrapper[4810]: I0219 16:18:38.600440 4810 scope.go:117] "RemoveContainer" containerID="f5a6dd294de86b91df249d4c74319c874e8cb7930dab7621079b8f6cbbf62a7e" Feb 19 16:18:38 crc kubenswrapper[4810]: I0219 16:18:38.600209 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fsjqs" Feb 19 16:18:38 crc kubenswrapper[4810]: I0219 16:18:38.631389 4810 scope.go:117] "RemoveContainer" containerID="1cb29dd13e72da68f942f209eb6e0550ffc40a6639757fdf8af6c17ba48e7f94" Feb 19 16:18:38 crc kubenswrapper[4810]: I0219 16:18:38.676227 4810 scope.go:117] "RemoveContainer" containerID="f8f03eba801535218b99db7ea105ed2a091125b7d3975eb406370048e19ab062" Feb 19 16:18:38 crc kubenswrapper[4810]: I0219 16:18:38.714207 4810 scope.go:117] "RemoveContainer" containerID="f5a6dd294de86b91df249d4c74319c874e8cb7930dab7621079b8f6cbbf62a7e" Feb 19 16:18:38 crc kubenswrapper[4810]: E0219 16:18:38.714854 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5a6dd294de86b91df249d4c74319c874e8cb7930dab7621079b8f6cbbf62a7e\": container with ID starting with f5a6dd294de86b91df249d4c74319c874e8cb7930dab7621079b8f6cbbf62a7e not found: ID does not exist" containerID="f5a6dd294de86b91df249d4c74319c874e8cb7930dab7621079b8f6cbbf62a7e" Feb 19 16:18:38 crc kubenswrapper[4810]: I0219 16:18:38.714907 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5a6dd294de86b91df249d4c74319c874e8cb7930dab7621079b8f6cbbf62a7e"} err="failed to get container status \"f5a6dd294de86b91df249d4c74319c874e8cb7930dab7621079b8f6cbbf62a7e\": rpc error: code = NotFound desc = could not find container \"f5a6dd294de86b91df249d4c74319c874e8cb7930dab7621079b8f6cbbf62a7e\": container with ID starting with f5a6dd294de86b91df249d4c74319c874e8cb7930dab7621079b8f6cbbf62a7e not found: ID does not exist" Feb 19 16:18:38 crc kubenswrapper[4810]: I0219 16:18:38.714936 4810 scope.go:117] "RemoveContainer" containerID="1cb29dd13e72da68f942f209eb6e0550ffc40a6639757fdf8af6c17ba48e7f94" Feb 19 16:18:38 crc kubenswrapper[4810]: E0219 16:18:38.715342 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cb29dd13e72da68f942f209eb6e0550ffc40a6639757fdf8af6c17ba48e7f94\": container with ID starting with 1cb29dd13e72da68f942f209eb6e0550ffc40a6639757fdf8af6c17ba48e7f94 not found: ID does not exist" containerID="1cb29dd13e72da68f942f209eb6e0550ffc40a6639757fdf8af6c17ba48e7f94" Feb 19 16:18:38 crc kubenswrapper[4810]: I0219 16:18:38.715388 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cb29dd13e72da68f942f209eb6e0550ffc40a6639757fdf8af6c17ba48e7f94"} err="failed to get container status \"1cb29dd13e72da68f942f209eb6e0550ffc40a6639757fdf8af6c17ba48e7f94\": rpc error: code = NotFound desc = could not find container \"1cb29dd13e72da68f942f209eb6e0550ffc40a6639757fdf8af6c17ba48e7f94\": container with ID starting with 1cb29dd13e72da68f942f209eb6e0550ffc40a6639757fdf8af6c17ba48e7f94 not found: ID does not exist" Feb 19 16:18:38 crc kubenswrapper[4810]: I0219 16:18:38.715425 4810 scope.go:117] "RemoveContainer" containerID="f8f03eba801535218b99db7ea105ed2a091125b7d3975eb406370048e19ab062" Feb 19 16:18:38 crc kubenswrapper[4810]: E0219 16:18:38.715830 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8f03eba801535218b99db7ea105ed2a091125b7d3975eb406370048e19ab062\": container with ID starting with f8f03eba801535218b99db7ea105ed2a091125b7d3975eb406370048e19ab062 not found: ID does not exist" containerID="f8f03eba801535218b99db7ea105ed2a091125b7d3975eb406370048e19ab062" 
Feb 19 16:18:38 crc kubenswrapper[4810]: I0219 16:18:38.715855 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8f03eba801535218b99db7ea105ed2a091125b7d3975eb406370048e19ab062"} err="failed to get container status \"f8f03eba801535218b99db7ea105ed2a091125b7d3975eb406370048e19ab062\": rpc error: code = NotFound desc = could not find container \"f8f03eba801535218b99db7ea105ed2a091125b7d3975eb406370048e19ab062\": container with ID starting with f8f03eba801535218b99db7ea105ed2a091125b7d3975eb406370048e19ab062 not found: ID does not exist" Feb 19 16:18:38 crc kubenswrapper[4810]: I0219 16:18:38.725839 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fsjqs"] Feb 19 16:18:38 crc kubenswrapper[4810]: I0219 16:18:38.736286 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fsjqs"] Feb 19 16:18:39 crc kubenswrapper[4810]: I0219 16:18:39.457319 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18dc9098-5a5e-438e-88d7-b611f88e7e56" path="/var/lib/kubelet/pods/18dc9098-5a5e-438e-88d7-b611f88e7e56/volumes" Feb 19 16:18:39 crc kubenswrapper[4810]: I0219 16:18:39.538006 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bwbpv"] Feb 19 16:18:39 crc kubenswrapper[4810]: I0219 16:18:39.918012 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gzkwp"] Feb 19 16:18:39 crc kubenswrapper[4810]: I0219 16:18:39.919040 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gzkwp" podUID="cb41d90e-0896-4229-a19b-a8577292bbf6" containerName="registry-server" containerID="cri-o://4dcb06a25fb7e17e0617e078d6ec18aa0f057a2f73d88aa7a8ec24cf486c4034" gracePeriod=2 Feb 19 16:18:40 crc kubenswrapper[4810]: I0219 16:18:40.438794 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gzkwp" Feb 19 16:18:40 crc kubenswrapper[4810]: I0219 16:18:40.462488 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb41d90e-0896-4229-a19b-a8577292bbf6-catalog-content\") pod \"cb41d90e-0896-4229-a19b-a8577292bbf6\" (UID: \"cb41d90e-0896-4229-a19b-a8577292bbf6\") " Feb 19 16:18:40 crc kubenswrapper[4810]: I0219 16:18:40.462555 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb41d90e-0896-4229-a19b-a8577292bbf6-utilities\") pod \"cb41d90e-0896-4229-a19b-a8577292bbf6\" (UID: \"cb41d90e-0896-4229-a19b-a8577292bbf6\") " Feb 19 16:18:40 crc kubenswrapper[4810]: I0219 16:18:40.462617 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s79c7\" (UniqueName: \"kubernetes.io/projected/cb41d90e-0896-4229-a19b-a8577292bbf6-kube-api-access-s79c7\") pod \"cb41d90e-0896-4229-a19b-a8577292bbf6\" (UID: \"cb41d90e-0896-4229-a19b-a8577292bbf6\") " Feb 19 16:18:40 crc kubenswrapper[4810]: I0219 16:18:40.473518 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb41d90e-0896-4229-a19b-a8577292bbf6-utilities" (OuterVolumeSpecName: "utilities") pod "cb41d90e-0896-4229-a19b-a8577292bbf6" (UID: "cb41d90e-0896-4229-a19b-a8577292bbf6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:18:40 crc kubenswrapper[4810]: I0219 16:18:40.475568 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb41d90e-0896-4229-a19b-a8577292bbf6-kube-api-access-s79c7" (OuterVolumeSpecName: "kube-api-access-s79c7") pod "cb41d90e-0896-4229-a19b-a8577292bbf6" (UID: "cb41d90e-0896-4229-a19b-a8577292bbf6"). InnerVolumeSpecName "kube-api-access-s79c7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:18:40 crc kubenswrapper[4810]: I0219 16:18:40.565009 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb41d90e-0896-4229-a19b-a8577292bbf6-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 16:18:40 crc kubenswrapper[4810]: I0219 16:18:40.565040 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s79c7\" (UniqueName: \"kubernetes.io/projected/cb41d90e-0896-4229-a19b-a8577292bbf6-kube-api-access-s79c7\") on node \"crc\" DevicePath \"\"" Feb 19 16:18:40 crc kubenswrapper[4810]: I0219 16:18:40.601680 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb41d90e-0896-4229-a19b-a8577292bbf6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb41d90e-0896-4229-a19b-a8577292bbf6" (UID: "cb41d90e-0896-4229-a19b-a8577292bbf6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:18:40 crc kubenswrapper[4810]: I0219 16:18:40.631877 4810 generic.go:334] "Generic (PLEG): container finished" podID="cb41d90e-0896-4229-a19b-a8577292bbf6" containerID="4dcb06a25fb7e17e0617e078d6ec18aa0f057a2f73d88aa7a8ec24cf486c4034" exitCode=0 Feb 19 16:18:40 crc kubenswrapper[4810]: I0219 16:18:40.631931 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gzkwp" event={"ID":"cb41d90e-0896-4229-a19b-a8577292bbf6","Type":"ContainerDied","Data":"4dcb06a25fb7e17e0617e078d6ec18aa0f057a2f73d88aa7a8ec24cf486c4034"} Feb 19 16:18:40 crc kubenswrapper[4810]: I0219 16:18:40.631996 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gzkwp" event={"ID":"cb41d90e-0896-4229-a19b-a8577292bbf6","Type":"ContainerDied","Data":"359d44fe31f278162e80c02cff4ce2bbbb87a6a8b85fc0045e405dd62ec98267"} Feb 19 16:18:40 crc kubenswrapper[4810]: I0219 16:18:40.632015 4810 scope.go:117] "RemoveContainer" containerID="4dcb06a25fb7e17e0617e078d6ec18aa0f057a2f73d88aa7a8ec24cf486c4034" Feb 19 16:18:40 crc kubenswrapper[4810]: I0219 16:18:40.632381 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gzkwp" Feb 19 16:18:40 crc kubenswrapper[4810]: I0219 16:18:40.654525 4810 scope.go:117] "RemoveContainer" containerID="563ec001d1590fa9c16e3de9515c66b2a8327d12667295252c48a5230c733ba1" Feb 19 16:18:40 crc kubenswrapper[4810]: I0219 16:18:40.665986 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb41d90e-0896-4229-a19b-a8577292bbf6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 16:18:40 crc kubenswrapper[4810]: I0219 16:18:40.672973 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gzkwp"] Feb 19 16:18:40 crc kubenswrapper[4810]: I0219 16:18:40.682748 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gzkwp"] Feb 19 16:18:40 crc kubenswrapper[4810]: I0219 16:18:40.704730 4810 scope.go:117] "RemoveContainer" containerID="446909cef9363d01f975cdf4a69b340a8cfb1e1ae9bddb9ba40e5d706fc6a797" Feb 19 16:18:40 crc kubenswrapper[4810]: I0219 16:18:40.725816 4810 scope.go:117] "RemoveContainer" containerID="4dcb06a25fb7e17e0617e078d6ec18aa0f057a2f73d88aa7a8ec24cf486c4034" Feb 19 16:18:40 crc kubenswrapper[4810]: E0219 16:18:40.726281 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dcb06a25fb7e17e0617e078d6ec18aa0f057a2f73d88aa7a8ec24cf486c4034\": container with ID starting with 4dcb06a25fb7e17e0617e078d6ec18aa0f057a2f73d88aa7a8ec24cf486c4034 not found: ID does not exist" containerID="4dcb06a25fb7e17e0617e078d6ec18aa0f057a2f73d88aa7a8ec24cf486c4034" Feb 19 16:18:40 crc kubenswrapper[4810]: I0219 16:18:40.726369 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dcb06a25fb7e17e0617e078d6ec18aa0f057a2f73d88aa7a8ec24cf486c4034"} err="failed to get container status \"4dcb06a25fb7e17e0617e078d6ec18aa0f057a2f73d88aa7a8ec24cf486c4034\": rpc error: code = NotFound desc = could not find container \"4dcb06a25fb7e17e0617e078d6ec18aa0f057a2f73d88aa7a8ec24cf486c4034\": container with ID starting with 4dcb06a25fb7e17e0617e078d6ec18aa0f057a2f73d88aa7a8ec24cf486c4034 not found: ID does not exist" Feb 19 16:18:40 crc kubenswrapper[4810]: I0219 16:18:40.726399 4810 scope.go:117] "RemoveContainer" containerID="563ec001d1590fa9c16e3de9515c66b2a8327d12667295252c48a5230c733ba1" Feb 19 16:18:40 crc kubenswrapper[4810]: E0219 16:18:40.726686 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"563ec001d1590fa9c16e3de9515c66b2a8327d12667295252c48a5230c733ba1\": container with ID starting with 563ec001d1590fa9c16e3de9515c66b2a8327d12667295252c48a5230c733ba1 not found: ID does not exist" containerID="563ec001d1590fa9c16e3de9515c66b2a8327d12667295252c48a5230c733ba1" Feb 19 16:18:40 crc kubenswrapper[4810]: I0219 16:18:40.726715 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"563ec001d1590fa9c16e3de9515c66b2a8327d12667295252c48a5230c733ba1"} err="failed to get container status \"563ec001d1590fa9c16e3de9515c66b2a8327d12667295252c48a5230c733ba1\": rpc error: code = NotFound desc = could not find container \"563ec001d1590fa9c16e3de9515c66b2a8327d12667295252c48a5230c733ba1\": container with ID starting with 563ec001d1590fa9c16e3de9515c66b2a8327d12667295252c48a5230c733ba1 not found: ID does not exist" Feb 19 16:18:40 crc 
kubenswrapper[4810]: I0219 16:18:40.726735 4810 scope.go:117] "RemoveContainer" containerID="446909cef9363d01f975cdf4a69b340a8cfb1e1ae9bddb9ba40e5d706fc6a797" Feb 19 16:18:40 crc kubenswrapper[4810]: E0219 16:18:40.726930 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"446909cef9363d01f975cdf4a69b340a8cfb1e1ae9bddb9ba40e5d706fc6a797\": container with ID starting with 446909cef9363d01f975cdf4a69b340a8cfb1e1ae9bddb9ba40e5d706fc6a797 not found: ID does not exist" containerID="446909cef9363d01f975cdf4a69b340a8cfb1e1ae9bddb9ba40e5d706fc6a797" Feb 19 16:18:40 crc kubenswrapper[4810]: I0219 16:18:40.726949 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"446909cef9363d01f975cdf4a69b340a8cfb1e1ae9bddb9ba40e5d706fc6a797"} err="failed to get container status \"446909cef9363d01f975cdf4a69b340a8cfb1e1ae9bddb9ba40e5d706fc6a797\": rpc error: code = NotFound desc = could not find container \"446909cef9363d01f975cdf4a69b340a8cfb1e1ae9bddb9ba40e5d706fc6a797\": container with ID starting with 446909cef9363d01f975cdf4a69b340a8cfb1e1ae9bddb9ba40e5d706fc6a797 not found: ID does not exist" Feb 19 16:18:41 crc kubenswrapper[4810]: I0219 16:18:41.456051 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb41d90e-0896-4229-a19b-a8577292bbf6" path="/var/lib/kubelet/pods/cb41d90e-0896-4229-a19b-a8577292bbf6/volumes" Feb 19 16:18:49 crc kubenswrapper[4810]: I0219 16:18:49.538289 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 16:18:49 crc kubenswrapper[4810]: I0219 16:18:49.539087 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 16:18:49 crc kubenswrapper[4810]: I0219 16:18:49.539161 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 16:18:49 crc kubenswrapper[4810]: I0219 16:18:49.540367 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dd8031bab4ace844a0a95e2a4ed00eb2a49420bcc364c28753fbe8146e7a2fd5"} pod="openshift-machine-config-operator/machine-config-daemon-t499d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 16:18:49 crc kubenswrapper[4810]: I0219 16:18:49.540479 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" containerID="cri-o://dd8031bab4ace844a0a95e2a4ed00eb2a49420bcc364c28753fbe8146e7a2fd5" gracePeriod=600 Feb 19 16:18:49 crc kubenswrapper[4810]: I0219 16:18:49.733167 4810 generic.go:334] "Generic (PLEG): container finished" podID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerID="dd8031bab4ace844a0a95e2a4ed00eb2a49420bcc364c28753fbe8146e7a2fd5" exitCode=0 Feb 19 16:18:49 crc kubenswrapper[4810]: I0219 
16:18:49.733215 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerDied","Data":"dd8031bab4ace844a0a95e2a4ed00eb2a49420bcc364c28753fbe8146e7a2fd5"} Feb 19 16:18:49 crc kubenswrapper[4810]: I0219 16:18:49.733258 4810 scope.go:117] "RemoveContainer" containerID="8520d3639f4b062a60a4e8d48690a9ddf9c8f0411d765a4be05ac76b8b0faa8e" Feb 19 16:18:50 crc kubenswrapper[4810]: I0219 16:18:50.761686 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerStarted","Data":"fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6"} Feb 19 16:20:49 crc kubenswrapper[4810]: I0219 16:20:49.537793 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 16:20:49 crc kubenswrapper[4810]: I0219 16:20:49.538640 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 16:21:19 crc kubenswrapper[4810]: I0219 16:21:19.537516 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 16:21:19 crc kubenswrapper[4810]: I0219 16:21:19.538137 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 16:21:49 crc kubenswrapper[4810]: I0219 16:21:49.537189 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 16:21:49 crc kubenswrapper[4810]: I0219 16:21:49.537785 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 16:21:49 crc kubenswrapper[4810]: I0219 16:21:49.537831 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 16:21:49 crc kubenswrapper[4810]: I0219 16:21:49.538625 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6"} 
pod="openshift-machine-config-operator/machine-config-daemon-t499d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 16:21:49 crc kubenswrapper[4810]: I0219 16:21:49.538684 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" containerID="cri-o://fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" gracePeriod=600 Feb 19 16:21:49 crc kubenswrapper[4810]: E0219 16:21:49.739365 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:21:49 crc kubenswrapper[4810]: I0219 16:21:49.871999 4810 generic.go:334] "Generic (PLEG): container finished" podID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerID="fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" exitCode=0 Feb 19 16:21:49 crc kubenswrapper[4810]: I0219 16:21:49.872241 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerDied","Data":"fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6"} Feb 19 16:21:49 crc kubenswrapper[4810]: I0219 16:21:49.872305 4810 scope.go:117] "RemoveContainer" containerID="dd8031bab4ace844a0a95e2a4ed00eb2a49420bcc364c28753fbe8146e7a2fd5" Feb 19 16:21:49 crc kubenswrapper[4810]: I0219 16:21:49.872961 4810 scope.go:117] "RemoveContainer" containerID="fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" Feb 19 16:21:49 crc kubenswrapper[4810]: E0219 16:21:49.873532 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:22:02 crc kubenswrapper[4810]: I0219 16:22:02.439522 4810 scope.go:117] "RemoveContainer" containerID="fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" Feb 19 16:22:02 crc kubenswrapper[4810]: E0219 16:22:02.440432 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:22:14 crc kubenswrapper[4810]: I0219 16:22:14.439916 4810 scope.go:117] "RemoveContainer" containerID="fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" Feb 19 16:22:14 crc kubenswrapper[4810]: E0219 16:22:14.440873 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:22:27 crc kubenswrapper[4810]: I0219 16:22:27.439977 4810 scope.go:117] "RemoveContainer" containerID="fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" Feb 19 16:22:27 crc kubenswrapper[4810]: E0219 16:22:27.440718 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:22:39 crc kubenswrapper[4810]: I0219 16:22:39.439661 4810 scope.go:117] "RemoveContainer" containerID="fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" Feb 19 16:22:39 crc kubenswrapper[4810]: E0219 16:22:39.440886 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:22:51 crc kubenswrapper[4810]: I0219 16:22:51.451922 4810 scope.go:117] "RemoveContainer" containerID="fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" Feb 19 16:22:51 crc kubenswrapper[4810]: E0219 16:22:51.453062 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:23:06 crc kubenswrapper[4810]: I0219 16:23:06.441100 4810 scope.go:117] "RemoveContainer" containerID="fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" Feb 19 16:23:06 crc kubenswrapper[4810]: E0219 16:23:06.441868 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:23:19 crc kubenswrapper[4810]: I0219 16:23:19.439803 4810 scope.go:117] "RemoveContainer" containerID="fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" Feb 19 16:23:19 crc kubenswrapper[4810]: E0219 16:23:19.440826 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:23:33 crc kubenswrapper[4810]: I0219 16:23:33.440206 4810 scope.go:117] "RemoveContainer" containerID="fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" Feb 19 16:23:33 crc kubenswrapper[4810]: E0219 16:23:33.441312 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:23:45 crc kubenswrapper[4810]: I0219 16:23:45.438960 4810 scope.go:117] "RemoveContainer" containerID="fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" Feb 19 16:23:45 crc kubenswrapper[4810]: E0219 16:23:45.439856 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:23:56 crc kubenswrapper[4810]: I0219 16:23:56.439966 4810 scope.go:117] "RemoveContainer" containerID="fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" Feb 19 16:23:56 crc kubenswrapper[4810]: E0219 16:23:56.443538 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:24:10 crc kubenswrapper[4810]: I0219 16:24:10.442676 4810 scope.go:117] "RemoveContainer" containerID="fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" Feb 19 16:24:10 crc kubenswrapper[4810]: E0219 16:24:10.443555 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:24:22 crc kubenswrapper[4810]: I0219 16:24:22.440253 4810 scope.go:117] "RemoveContainer" containerID="fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" Feb 19 16:24:22 crc kubenswrapper[4810]: E0219 16:24:22.441309 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:24:37 crc kubenswrapper[4810]: I0219 16:24:37.440426 4810 
scope.go:117] "RemoveContainer" containerID="fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" Feb 19 16:24:37 crc kubenswrapper[4810]: E0219 16:24:37.441256 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:24:52 crc kubenswrapper[4810]: E0219 16:24:52.096855 4810 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.162:51034->38.102.83.162:41765: write tcp 38.102.83.162:51034->38.102.83.162:41765: write: connection reset by peer Feb 19 16:24:52 crc kubenswrapper[4810]: I0219 16:24:52.440547 4810 scope.go:117] "RemoveContainer" containerID="fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" Feb 19 16:24:52 crc kubenswrapper[4810]: E0219 16:24:52.441074 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:25:05 crc kubenswrapper[4810]: I0219 16:25:05.456374 4810 scope.go:117] "RemoveContainer" containerID="fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" Feb 19 16:25:05 crc kubenswrapper[4810]: E0219 16:25:05.465529 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:25:19 crc kubenswrapper[4810]: I0219 16:25:19.440232 4810 scope.go:117] "RemoveContainer" containerID="fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" Feb 19 16:25:19 crc kubenswrapper[4810]: E0219 16:25:19.441053 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:25:31 crc kubenswrapper[4810]: I0219 16:25:31.446621 4810 scope.go:117] "RemoveContainer" containerID="fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" Feb 19 16:25:31 crc kubenswrapper[4810]: E0219 16:25:31.447426 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" 
podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:25:46 crc kubenswrapper[4810]: I0219 16:25:46.440113 4810 scope.go:117] "RemoveContainer" containerID="fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" Feb 19 16:25:46 crc kubenswrapper[4810]: E0219 16:25:46.441509 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:25:57 crc kubenswrapper[4810]: I0219 16:25:57.439402 4810 scope.go:117] "RemoveContainer" containerID="fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" Feb 19 16:25:57 crc kubenswrapper[4810]: E0219 16:25:57.440047 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:26:08 crc kubenswrapper[4810]: I0219 16:26:08.439670 4810 scope.go:117] "RemoveContainer" containerID="fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" Feb 19 16:26:08 crc kubenswrapper[4810]: E0219 16:26:08.440400 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:26:22 crc kubenswrapper[4810]: I0219 16:26:22.440410 4810 scope.go:117] "RemoveContainer" containerID="fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" Feb 19 16:26:22 crc kubenswrapper[4810]: E0219 16:26:22.441667 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:26:35 crc kubenswrapper[4810]: I0219 16:26:35.440490 4810 scope.go:117] "RemoveContainer" containerID="fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" Feb 19 16:26:35 crc kubenswrapper[4810]: E0219 16:26:35.442596 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:26:47 crc kubenswrapper[4810]: I0219 16:26:47.441501 4810 scope.go:117] "RemoveContainer" 
containerID="fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" Feb 19 16:26:47 crc kubenswrapper[4810]: E0219 16:26:47.442866 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:27:00 crc kubenswrapper[4810]: I0219 16:27:00.440507 4810 scope.go:117] "RemoveContainer" containerID="fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" Feb 19 16:27:00 crc kubenswrapper[4810]: I0219 16:27:00.795810 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerStarted","Data":"61e1161022488800f1a95fde17c57eb209b943fcdd172d04938731e0ad552ce2"} Feb 19 16:28:11 crc kubenswrapper[4810]: I0219 16:28:11.446025 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-7rbxk" podUID="66c7e596-ffa3-4687-8c80-21acecbd8075" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 16:28:11 crc kubenswrapper[4810]: I0219 16:28:11.635560 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-64pwn"] Feb 19 16:28:11 crc kubenswrapper[4810]: E0219 16:28:11.636057 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18dc9098-5a5e-438e-88d7-b611f88e7e56" containerName="extract-utilities" Feb 19 16:28:11 crc kubenswrapper[4810]: I0219 16:28:11.636079 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="18dc9098-5a5e-438e-88d7-b611f88e7e56" containerName="extract-utilities" Feb 19 16:28:11 crc kubenswrapper[4810]: E0219 16:28:11.636102 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18dc9098-5a5e-438e-88d7-b611f88e7e56" containerName="extract-content" Feb 19 16:28:11 crc kubenswrapper[4810]: I0219 16:28:11.636113 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="18dc9098-5a5e-438e-88d7-b611f88e7e56" containerName="extract-content" Feb 19 16:28:11 crc kubenswrapper[4810]: E0219 16:28:11.636143 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18dc9098-5a5e-438e-88d7-b611f88e7e56" containerName="registry-server" Feb 19 16:28:11 crc kubenswrapper[4810]: I0219 16:28:11.636150 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="18dc9098-5a5e-438e-88d7-b611f88e7e56" containerName="registry-server" Feb 19 16:28:11 crc kubenswrapper[4810]: E0219 16:28:11.636161 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb41d90e-0896-4229-a19b-a8577292bbf6" containerName="registry-server" Feb 19 16:28:11 crc kubenswrapper[4810]: I0219 16:28:11.636192 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb41d90e-0896-4229-a19b-a8577292bbf6" containerName="registry-server" Feb 19 16:28:11 crc kubenswrapper[4810]: E0219 16:28:11.636206 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb41d90e-0896-4229-a19b-a8577292bbf6" containerName="extract-content" Feb 19 16:28:11 crc kubenswrapper[4810]: I0219 16:28:11.636214 4810 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cb41d90e-0896-4229-a19b-a8577292bbf6" containerName="extract-content" Feb 19 16:28:11 crc kubenswrapper[4810]: E0219 16:28:11.636242 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb41d90e-0896-4229-a19b-a8577292bbf6" containerName="extract-utilities" Feb 19 16:28:11 crc kubenswrapper[4810]: I0219 16:28:11.636249 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb41d90e-0896-4229-a19b-a8577292bbf6" containerName="extract-utilities" Feb 19 16:28:11 crc kubenswrapper[4810]: I0219 16:28:11.636504 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="18dc9098-5a5e-438e-88d7-b611f88e7e56" containerName="registry-server" Feb 19 16:28:11 crc kubenswrapper[4810]: I0219 16:28:11.636522 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb41d90e-0896-4229-a19b-a8577292bbf6" containerName="registry-server" Feb 19 16:28:11 crc kubenswrapper[4810]: I0219 16:28:11.638266 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-64pwn" Feb 19 16:28:11 crc kubenswrapper[4810]: I0219 16:28:11.658890 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-64pwn"] Feb 19 16:28:11 crc kubenswrapper[4810]: I0219 16:28:11.768801 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gnbp\" (UniqueName: \"kubernetes.io/projected/9e626cbc-4140-4c84-8ecc-8b3315f6023d-kube-api-access-6gnbp\") pod \"redhat-operators-64pwn\" (UID: \"9e626cbc-4140-4c84-8ecc-8b3315f6023d\") " pod="openshift-marketplace/redhat-operators-64pwn" Feb 19 16:28:11 crc kubenswrapper[4810]: I0219 16:28:11.768946 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e626cbc-4140-4c84-8ecc-8b3315f6023d-catalog-content\") pod \"redhat-operators-64pwn\" (UID: \"9e626cbc-4140-4c84-8ecc-8b3315f6023d\") " pod="openshift-marketplace/redhat-operators-64pwn" Feb 19 16:28:11 crc kubenswrapper[4810]: I0219 16:28:11.769000 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e626cbc-4140-4c84-8ecc-8b3315f6023d-utilities\") pod \"redhat-operators-64pwn\" (UID: \"9e626cbc-4140-4c84-8ecc-8b3315f6023d\") " pod="openshift-marketplace/redhat-operators-64pwn" Feb 19 16:28:11 crc kubenswrapper[4810]: I0219 16:28:11.871003 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e626cbc-4140-4c84-8ecc-8b3315f6023d-catalog-content\") pod \"redhat-operators-64pwn\" (UID: \"9e626cbc-4140-4c84-8ecc-8b3315f6023d\") " pod="openshift-marketplace/redhat-operators-64pwn" Feb 19 16:28:11 crc kubenswrapper[4810]: I0219 16:28:11.871266 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e626cbc-4140-4c84-8ecc-8b3315f6023d-utilities\") pod \"redhat-operators-64pwn\" (UID: \"9e626cbc-4140-4c84-8ecc-8b3315f6023d\") " pod="openshift-marketplace/redhat-operators-64pwn" Feb 19 16:28:11 crc kubenswrapper[4810]: I0219 16:28:11.871543 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gnbp\" (UniqueName: \"kubernetes.io/projected/9e626cbc-4140-4c84-8ecc-8b3315f6023d-kube-api-access-6gnbp\") pod \"redhat-operators-64pwn\" (UID: 
\"9e626cbc-4140-4c84-8ecc-8b3315f6023d\") " pod="openshift-marketplace/redhat-operators-64pwn" Feb 19 16:28:11 crc kubenswrapper[4810]: I0219 16:28:11.871651 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e626cbc-4140-4c84-8ecc-8b3315f6023d-catalog-content\") pod \"redhat-operators-64pwn\" (UID: \"9e626cbc-4140-4c84-8ecc-8b3315f6023d\") " pod="openshift-marketplace/redhat-operators-64pwn" Feb 19 16:28:11 crc kubenswrapper[4810]: I0219 16:28:11.871723 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e626cbc-4140-4c84-8ecc-8b3315f6023d-utilities\") pod \"redhat-operators-64pwn\" (UID: \"9e626cbc-4140-4c84-8ecc-8b3315f6023d\") " pod="openshift-marketplace/redhat-operators-64pwn" Feb 19 16:28:11 crc kubenswrapper[4810]: I0219 16:28:11.897157 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gnbp\" (UniqueName: \"kubernetes.io/projected/9e626cbc-4140-4c84-8ecc-8b3315f6023d-kube-api-access-6gnbp\") pod \"redhat-operators-64pwn\" (UID: \"9e626cbc-4140-4c84-8ecc-8b3315f6023d\") " pod="openshift-marketplace/redhat-operators-64pwn" Feb 19 16:28:11 crc kubenswrapper[4810]: I0219 16:28:11.970318 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-64pwn" Feb 19 16:28:12 crc kubenswrapper[4810]: I0219 16:28:12.616656 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-64pwn"] Feb 19 16:28:13 crc kubenswrapper[4810]: I0219 16:28:13.439596 4810 generic.go:334] "Generic (PLEG): container finished" podID="9e626cbc-4140-4c84-8ecc-8b3315f6023d" containerID="ca5d92eef27bd3b021224c9164b7e74cb5a78a5abd43eb0d862a50d3ae8cd3f9" exitCode=0 Feb 19 16:28:13 crc kubenswrapper[4810]: I0219 16:28:13.446290 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 16:28:13 crc kubenswrapper[4810]: I0219 16:28:13.461076 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64pwn" event={"ID":"9e626cbc-4140-4c84-8ecc-8b3315f6023d","Type":"ContainerDied","Data":"ca5d92eef27bd3b021224c9164b7e74cb5a78a5abd43eb0d862a50d3ae8cd3f9"} Feb 19 16:28:13 crc kubenswrapper[4810]: I0219 16:28:13.461577 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64pwn" event={"ID":"9e626cbc-4140-4c84-8ecc-8b3315f6023d","Type":"ContainerStarted","Data":"565952d67109a9fcad8e71d5e0aabde77b113979aa3c289f2b682f3ba5f316a6"} Feb 19 16:28:15 crc kubenswrapper[4810]: I0219 16:28:15.463391 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64pwn" event={"ID":"9e626cbc-4140-4c84-8ecc-8b3315f6023d","Type":"ContainerStarted","Data":"11aae31930dc1715ed8dc4d319b46478d41e580c6f80c8f65ea1a3331f2c975b"} Feb 19 16:28:19 crc kubenswrapper[4810]: I0219 16:28:19.519750 4810 generic.go:334] "Generic (PLEG): container finished" podID="9e626cbc-4140-4c84-8ecc-8b3315f6023d" containerID="11aae31930dc1715ed8dc4d319b46478d41e580c6f80c8f65ea1a3331f2c975b" exitCode=0 Feb 19 16:28:19 crc kubenswrapper[4810]: I0219 16:28:19.519832 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64pwn" 
event={"ID":"9e626cbc-4140-4c84-8ecc-8b3315f6023d","Type":"ContainerDied","Data":"11aae31930dc1715ed8dc4d319b46478d41e580c6f80c8f65ea1a3331f2c975b"} Feb 19 16:28:20 crc kubenswrapper[4810]: I0219 16:28:20.531919 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64pwn" event={"ID":"9e626cbc-4140-4c84-8ecc-8b3315f6023d","Type":"ContainerStarted","Data":"81668e94dc20932a1366bc891e92da4a77667055ce44a21f7d966c87ad5627e7"} Feb 19 16:28:20 crc kubenswrapper[4810]: I0219 16:28:20.550878 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-64pwn" podStartSLOduration=3.039722839 podStartE2EDuration="9.55086217s" podCreationTimestamp="2026-02-19 16:28:11 +0000 UTC" firstStartedPulling="2026-02-19 16:28:13.445911345 +0000 UTC m=+4722.927941499" lastFinishedPulling="2026-02-19 16:28:19.957050706 +0000 UTC m=+4729.439080830" observedRunningTime="2026-02-19 16:28:20.549202959 +0000 UTC m=+4730.031233083" watchObservedRunningTime="2026-02-19 16:28:20.55086217 +0000 UTC m=+4730.032892294" Feb 19 16:28:20 crc kubenswrapper[4810]: I0219 16:28:20.786065 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mjjlb"] Feb 19 16:28:20 crc kubenswrapper[4810]: I0219 16:28:20.790012 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mjjlb" Feb 19 16:28:20 crc kubenswrapper[4810]: I0219 16:28:20.804764 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mjjlb"] Feb 19 16:28:20 crc kubenswrapper[4810]: I0219 16:28:20.982432 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeb4ac5e-9779-4e63-8008-83368b12aea5-catalog-content\") pod \"community-operators-mjjlb\" (UID: \"eeb4ac5e-9779-4e63-8008-83368b12aea5\") " pod="openshift-marketplace/community-operators-mjjlb" Feb 19 16:28:20 crc kubenswrapper[4810]: I0219 16:28:20.982861 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncnnf\" (UniqueName: \"kubernetes.io/projected/eeb4ac5e-9779-4e63-8008-83368b12aea5-kube-api-access-ncnnf\") pod \"community-operators-mjjlb\" (UID: \"eeb4ac5e-9779-4e63-8008-83368b12aea5\") " pod="openshift-marketplace/community-operators-mjjlb" Feb 19 16:28:20 crc kubenswrapper[4810]: I0219 16:28:20.983025 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeb4ac5e-9779-4e63-8008-83368b12aea5-utilities\") pod \"community-operators-mjjlb\" (UID: \"eeb4ac5e-9779-4e63-8008-83368b12aea5\") " pod="openshift-marketplace/community-operators-mjjlb" Feb 19 16:28:21 crc kubenswrapper[4810]: I0219 16:28:21.084831 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeb4ac5e-9779-4e63-8008-83368b12aea5-utilities\") pod \"community-operators-mjjlb\" (UID: \"eeb4ac5e-9779-4e63-8008-83368b12aea5\") " pod="openshift-marketplace/community-operators-mjjlb" Feb 19 16:28:21 crc kubenswrapper[4810]: I0219 16:28:21.084945 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeb4ac5e-9779-4e63-8008-83368b12aea5-catalog-content\") pod 
\"community-operators-mjjlb\" (UID: \"eeb4ac5e-9779-4e63-8008-83368b12aea5\") " pod="openshift-marketplace/community-operators-mjjlb" Feb 19 16:28:21 crc kubenswrapper[4810]: I0219 16:28:21.084989 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncnnf\" (UniqueName: \"kubernetes.io/projected/eeb4ac5e-9779-4e63-8008-83368b12aea5-kube-api-access-ncnnf\") pod \"community-operators-mjjlb\" (UID: \"eeb4ac5e-9779-4e63-8008-83368b12aea5\") " pod="openshift-marketplace/community-operators-mjjlb" Feb 19 16:28:21 crc kubenswrapper[4810]: I0219 16:28:21.085747 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeb4ac5e-9779-4e63-8008-83368b12aea5-utilities\") pod \"community-operators-mjjlb\" (UID: \"eeb4ac5e-9779-4e63-8008-83368b12aea5\") " pod="openshift-marketplace/community-operators-mjjlb" Feb 19 16:28:21 crc kubenswrapper[4810]: I0219 16:28:21.085783 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeb4ac5e-9779-4e63-8008-83368b12aea5-catalog-content\") pod \"community-operators-mjjlb\" (UID: \"eeb4ac5e-9779-4e63-8008-83368b12aea5\") " pod="openshift-marketplace/community-operators-mjjlb" Feb 19 16:28:21 crc kubenswrapper[4810]: I0219 16:28:21.108268 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncnnf\" (UniqueName: \"kubernetes.io/projected/eeb4ac5e-9779-4e63-8008-83368b12aea5-kube-api-access-ncnnf\") pod \"community-operators-mjjlb\" (UID: \"eeb4ac5e-9779-4e63-8008-83368b12aea5\") " pod="openshift-marketplace/community-operators-mjjlb" Feb 19 16:28:21 crc kubenswrapper[4810]: I0219 16:28:21.126026 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mjjlb" Feb 19 16:28:21 crc kubenswrapper[4810]: W0219 16:28:21.651104 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeeb4ac5e_9779_4e63_8008_83368b12aea5.slice/crio-b08f75ea1a51a0539020767f2a50472c00e5d58cf749563300c8504c2411cae9 WatchSource:0}: Error finding container b08f75ea1a51a0539020767f2a50472c00e5d58cf749563300c8504c2411cae9: Status 404 returned error can't find the container with id b08f75ea1a51a0539020767f2a50472c00e5d58cf749563300c8504c2411cae9 Feb 19 16:28:21 crc kubenswrapper[4810]: I0219 16:28:21.659516 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mjjlb"] Feb 19 16:28:21 crc kubenswrapper[4810]: I0219 16:28:21.970872 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-64pwn" Feb 19 16:28:21 crc kubenswrapper[4810]: I0219 16:28:21.971194 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-64pwn" Feb 19 16:28:22 crc kubenswrapper[4810]: I0219 16:28:22.552824 4810 generic.go:334] "Generic (PLEG): container finished" podID="eeb4ac5e-9779-4e63-8008-83368b12aea5" containerID="1d64cbe5478a9ad4d9cd763d9f5b1175310a03b2d98326cddede9d7757669756" exitCode=0 Feb 19 16:28:22 crc kubenswrapper[4810]: I0219 16:28:22.552890 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mjjlb" event={"ID":"eeb4ac5e-9779-4e63-8008-83368b12aea5","Type":"ContainerDied","Data":"1d64cbe5478a9ad4d9cd763d9f5b1175310a03b2d98326cddede9d7757669756"} Feb 19 16:28:22 crc kubenswrapper[4810]: I0219 16:28:22.552930 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mjjlb" event={"ID":"eeb4ac5e-9779-4e63-8008-83368b12aea5","Type":"ContainerStarted","Data":"b08f75ea1a51a0539020767f2a50472c00e5d58cf749563300c8504c2411cae9"} Feb 19 16:28:23 crc kubenswrapper[4810]: I0219 16:28:23.038244 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-64pwn" podUID="9e626cbc-4140-4c84-8ecc-8b3315f6023d" containerName="registry-server" probeResult="failure" output=< Feb 19 16:28:23 crc kubenswrapper[4810]: timeout: failed to connect service ":50051" within 1s Feb 19 16:28:23 crc kubenswrapper[4810]: > Feb 19 16:28:24 crc kubenswrapper[4810]: I0219 16:28:24.632182 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mjjlb" event={"ID":"eeb4ac5e-9779-4e63-8008-83368b12aea5","Type":"ContainerStarted","Data":"15ccceaa5002f7c88549c7fabba52cc9a89e3ef2a9a0f0d670c55de44651b1cf"} Feb 19 16:28:25 crc kubenswrapper[4810]: I0219 16:28:25.646428 4810 generic.go:334] "Generic (PLEG): container finished" podID="eeb4ac5e-9779-4e63-8008-83368b12aea5" containerID="15ccceaa5002f7c88549c7fabba52cc9a89e3ef2a9a0f0d670c55de44651b1cf" exitCode=0 Feb 19 16:28:25 crc kubenswrapper[4810]: I0219 16:28:25.646527 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mjjlb" event={"ID":"eeb4ac5e-9779-4e63-8008-83368b12aea5","Type":"ContainerDied","Data":"15ccceaa5002f7c88549c7fabba52cc9a89e3ef2a9a0f0d670c55de44651b1cf"} Feb 19 16:28:26 crc kubenswrapper[4810]: I0219 16:28:26.662100 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mjjlb" 
event={"ID":"eeb4ac5e-9779-4e63-8008-83368b12aea5","Type":"ContainerStarted","Data":"346725c8d42f0e9ac0d8ff2060dd270f9f2549bd03d2d4245c0e6fb675bfcb27"} Feb 19 16:28:26 crc kubenswrapper[4810]: I0219 16:28:26.683484 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mjjlb" podStartSLOduration=3.157034551 podStartE2EDuration="6.683465084s" podCreationTimestamp="2026-02-19 16:28:20 +0000 UTC" firstStartedPulling="2026-02-19 16:28:22.555186688 +0000 UTC m=+4732.037216852" lastFinishedPulling="2026-02-19 16:28:26.081617221 +0000 UTC m=+4735.563647385" observedRunningTime="2026-02-19 16:28:26.680974313 +0000 UTC m=+4736.163004447" watchObservedRunningTime="2026-02-19 16:28:26.683465084 +0000 UTC m=+4736.165495218" Feb 19 16:28:31 crc kubenswrapper[4810]: I0219 16:28:31.127062 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mjjlb" Feb 19 16:28:31 crc kubenswrapper[4810]: I0219 16:28:31.127761 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mjjlb" Feb 19 16:28:31 crc kubenswrapper[4810]: I0219 16:28:31.201448 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mjjlb" Feb 19 16:28:31 crc kubenswrapper[4810]: I0219 16:28:31.803950 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mjjlb" Feb 19 16:28:31 crc kubenswrapper[4810]: I0219 16:28:31.861037 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mjjlb"] Feb 19 16:28:33 crc kubenswrapper[4810]: I0219 16:28:33.038227 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-64pwn" podUID="9e626cbc-4140-4c84-8ecc-8b3315f6023d" containerName="registry-server" probeResult="failure" output=< Feb 19 16:28:33 crc kubenswrapper[4810]: timeout: failed to connect service ":50051" within 1s Feb 19 16:28:33 crc kubenswrapper[4810]: > Feb 19 16:28:33 crc kubenswrapper[4810]: I0219 16:28:33.737318 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mjjlb" podUID="eeb4ac5e-9779-4e63-8008-83368b12aea5" containerName="registry-server" containerID="cri-o://346725c8d42f0e9ac0d8ff2060dd270f9f2549bd03d2d4245c0e6fb675bfcb27" gracePeriod=2 Feb 19 16:28:33 crc kubenswrapper[4810]: I0219 16:28:33.895794 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5n9gc"] Feb 19 16:28:33 crc kubenswrapper[4810]: I0219 16:28:33.899296 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5n9gc" Feb 19 16:28:33 crc kubenswrapper[4810]: I0219 16:28:33.911502 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dde1ea5-68be-4851-8816-3c7302dc2579-catalog-content\") pod \"certified-operators-5n9gc\" (UID: \"6dde1ea5-68be-4851-8816-3c7302dc2579\") " pod="openshift-marketplace/certified-operators-5n9gc" Feb 19 16:28:33 crc kubenswrapper[4810]: I0219 16:28:33.911645 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j8ln\" (UniqueName: \"kubernetes.io/projected/6dde1ea5-68be-4851-8816-3c7302dc2579-kube-api-access-5j8ln\") pod \"certified-operators-5n9gc\" (UID: \"6dde1ea5-68be-4851-8816-3c7302dc2579\") " pod="openshift-marketplace/certified-operators-5n9gc" Feb 19 16:28:33 crc kubenswrapper[4810]: I0219 16:28:33.911677 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dde1ea5-68be-4851-8816-3c7302dc2579-utilities\") pod \"certified-operators-5n9gc\" (UID: \"6dde1ea5-68be-4851-8816-3c7302dc2579\") " pod="openshift-marketplace/certified-operators-5n9gc" Feb 19 16:28:33 crc kubenswrapper[4810]: I0219 16:28:33.911769 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5n9gc"] Feb 19 16:28:34 crc kubenswrapper[4810]: I0219 16:28:34.013251 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j8ln\" (UniqueName: \"kubernetes.io/projected/6dde1ea5-68be-4851-8816-3c7302dc2579-kube-api-access-5j8ln\") pod \"certified-operators-5n9gc\" (UID: \"6dde1ea5-68be-4851-8816-3c7302dc2579\") " pod="openshift-marketplace/certified-operators-5n9gc" Feb 19 16:28:34 crc kubenswrapper[4810]: I0219 16:28:34.013296 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dde1ea5-68be-4851-8816-3c7302dc2579-utilities\") pod \"certified-operators-5n9gc\" (UID: \"6dde1ea5-68be-4851-8816-3c7302dc2579\") " pod="openshift-marketplace/certified-operators-5n9gc" Feb 19 16:28:34 crc kubenswrapper[4810]: I0219 16:28:34.013435 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dde1ea5-68be-4851-8816-3c7302dc2579-catalog-content\") pod \"certified-operators-5n9gc\" (UID: \"6dde1ea5-68be-4851-8816-3c7302dc2579\") " pod="openshift-marketplace/certified-operators-5n9gc" Feb 19 16:28:34 crc kubenswrapper[4810]: I0219 16:28:34.013867 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dde1ea5-68be-4851-8816-3c7302dc2579-utilities\") pod \"certified-operators-5n9gc\" (UID: \"6dde1ea5-68be-4851-8816-3c7302dc2579\") " pod="openshift-marketplace/certified-operators-5n9gc" Feb 19 16:28:34 crc kubenswrapper[4810]: I0219 16:28:34.014364 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dde1ea5-68be-4851-8816-3c7302dc2579-catalog-content\") pod \"certified-operators-5n9gc\" (UID: \"6dde1ea5-68be-4851-8816-3c7302dc2579\") " pod="openshift-marketplace/certified-operators-5n9gc" Feb 19 16:28:34 crc kubenswrapper[4810]: I0219 16:28:34.035180 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5j8ln\" (UniqueName: \"kubernetes.io/projected/6dde1ea5-68be-4851-8816-3c7302dc2579-kube-api-access-5j8ln\") pod \"certified-operators-5n9gc\" (UID: \"6dde1ea5-68be-4851-8816-3c7302dc2579\") " pod="openshift-marketplace/certified-operators-5n9gc" Feb 19 16:28:34 crc kubenswrapper[4810]: I0219 16:28:34.252928 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5n9gc" Feb 19 16:28:34 crc kubenswrapper[4810]: I0219 16:28:34.316994 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mjjlb" Feb 19 16:28:34 crc kubenswrapper[4810]: I0219 16:28:34.532797 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeb4ac5e-9779-4e63-8008-83368b12aea5-utilities\") pod \"eeb4ac5e-9779-4e63-8008-83368b12aea5\" (UID: \"eeb4ac5e-9779-4e63-8008-83368b12aea5\") " Feb 19 16:28:34 crc kubenswrapper[4810]: I0219 16:28:34.533162 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeb4ac5e-9779-4e63-8008-83368b12aea5-catalog-content\") pod \"eeb4ac5e-9779-4e63-8008-83368b12aea5\" (UID: \"eeb4ac5e-9779-4e63-8008-83368b12aea5\") " Feb 19 16:28:34 crc kubenswrapper[4810]: I0219 16:28:34.533304 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncnnf\" (UniqueName: \"kubernetes.io/projected/eeb4ac5e-9779-4e63-8008-83368b12aea5-kube-api-access-ncnnf\") pod \"eeb4ac5e-9779-4e63-8008-83368b12aea5\" (UID: \"eeb4ac5e-9779-4e63-8008-83368b12aea5\") " Feb 19 16:28:34 crc kubenswrapper[4810]: I0219 16:28:34.534210 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eeb4ac5e-9779-4e63-8008-83368b12aea5-utilities" (OuterVolumeSpecName: "utilities") pod "eeb4ac5e-9779-4e63-8008-83368b12aea5" (UID: "eeb4ac5e-9779-4e63-8008-83368b12aea5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:28:34 crc kubenswrapper[4810]: I0219 16:28:34.534669 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeb4ac5e-9779-4e63-8008-83368b12aea5-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 16:28:34 crc kubenswrapper[4810]: I0219 16:28:34.544891 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeb4ac5e-9779-4e63-8008-83368b12aea5-kube-api-access-ncnnf" (OuterVolumeSpecName: "kube-api-access-ncnnf") pod "eeb4ac5e-9779-4e63-8008-83368b12aea5" (UID: "eeb4ac5e-9779-4e63-8008-83368b12aea5"). InnerVolumeSpecName "kube-api-access-ncnnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:28:34 crc kubenswrapper[4810]: I0219 16:28:34.628948 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eeb4ac5e-9779-4e63-8008-83368b12aea5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eeb4ac5e-9779-4e63-8008-83368b12aea5" (UID: "eeb4ac5e-9779-4e63-8008-83368b12aea5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:28:34 crc kubenswrapper[4810]: I0219 16:28:34.641101 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeb4ac5e-9779-4e63-8008-83368b12aea5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 16:28:34 crc kubenswrapper[4810]: I0219 16:28:34.641449 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncnnf\" (UniqueName: \"kubernetes.io/projected/eeb4ac5e-9779-4e63-8008-83368b12aea5-kube-api-access-ncnnf\") on node \"crc\" DevicePath \"\"" Feb 19 16:28:34 crc kubenswrapper[4810]: I0219 16:28:34.751949 4810 generic.go:334] "Generic (PLEG): container finished" podID="eeb4ac5e-9779-4e63-8008-83368b12aea5" containerID="346725c8d42f0e9ac0d8ff2060dd270f9f2549bd03d2d4245c0e6fb675bfcb27" exitCode=0 Feb 19 16:28:34 crc kubenswrapper[4810]: I0219 16:28:34.752010 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mjjlb" event={"ID":"eeb4ac5e-9779-4e63-8008-83368b12aea5","Type":"ContainerDied","Data":"346725c8d42f0e9ac0d8ff2060dd270f9f2549bd03d2d4245c0e6fb675bfcb27"} Feb 19 16:28:34 crc kubenswrapper[4810]: I0219 16:28:34.752063 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mjjlb" Feb 19 16:28:34 crc kubenswrapper[4810]: I0219 16:28:34.752265 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mjjlb" event={"ID":"eeb4ac5e-9779-4e63-8008-83368b12aea5","Type":"ContainerDied","Data":"b08f75ea1a51a0539020767f2a50472c00e5d58cf749563300c8504c2411cae9"} Feb 19 16:28:34 crc kubenswrapper[4810]: I0219 16:28:34.752409 4810 scope.go:117] "RemoveContainer" containerID="346725c8d42f0e9ac0d8ff2060dd270f9f2549bd03d2d4245c0e6fb675bfcb27" Feb 19 16:28:34 crc kubenswrapper[4810]: I0219 16:28:34.781480 4810 scope.go:117] "RemoveContainer" containerID="15ccceaa5002f7c88549c7fabba52cc9a89e3ef2a9a0f0d670c55de44651b1cf" Feb 19 16:28:34 crc kubenswrapper[4810]: I0219 16:28:34.802398 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mjjlb"] Feb 19 16:28:34 crc kubenswrapper[4810]: I0219 16:28:34.807952 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mjjlb"] Feb 19 16:28:34 crc kubenswrapper[4810]: I0219 16:28:34.833125 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5n9gc"] Feb 19 16:28:35 crc kubenswrapper[4810]: W0219 16:28:35.114757 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dde1ea5_68be_4851_8816_3c7302dc2579.slice/crio-280973c57a3f46eeb314d42cd9ad11d9e2b63939b685ed7e15c043b29db4262c WatchSource:0}: Error finding container 280973c57a3f46eeb314d42cd9ad11d9e2b63939b685ed7e15c043b29db4262c: Status 404 returned error can't find the container with id 280973c57a3f46eeb314d42cd9ad11d9e2b63939b685ed7e15c043b29db4262c Feb 19 16:28:35 crc kubenswrapper[4810]: I0219 16:28:35.126717 4810 scope.go:117] "RemoveContainer" containerID="1d64cbe5478a9ad4d9cd763d9f5b1175310a03b2d98326cddede9d7757669756" Feb 19 16:28:35 crc kubenswrapper[4810]: I0219 16:28:35.274648 4810 scope.go:117] "RemoveContainer" containerID="346725c8d42f0e9ac0d8ff2060dd270f9f2549bd03d2d4245c0e6fb675bfcb27" Feb 19 16:28:35 crc kubenswrapper[4810]: E0219 16:28:35.275117 4810 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"346725c8d42f0e9ac0d8ff2060dd270f9f2549bd03d2d4245c0e6fb675bfcb27\": container with ID starting with 346725c8d42f0e9ac0d8ff2060dd270f9f2549bd03d2d4245c0e6fb675bfcb27 not found: ID does not exist" containerID="346725c8d42f0e9ac0d8ff2060dd270f9f2549bd03d2d4245c0e6fb675bfcb27" Feb 19 16:28:35 crc kubenswrapper[4810]: I0219 16:28:35.275153 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"346725c8d42f0e9ac0d8ff2060dd270f9f2549bd03d2d4245c0e6fb675bfcb27"} err="failed to get container status \"346725c8d42f0e9ac0d8ff2060dd270f9f2549bd03d2d4245c0e6fb675bfcb27\": rpc error: code = NotFound desc = could not find container \"346725c8d42f0e9ac0d8ff2060dd270f9f2549bd03d2d4245c0e6fb675bfcb27\": container with ID starting with 346725c8d42f0e9ac0d8ff2060dd270f9f2549bd03d2d4245c0e6fb675bfcb27 not found: ID does not exist" Feb 19 16:28:35 crc kubenswrapper[4810]: I0219 16:28:35.275180 4810 scope.go:117] "RemoveContainer" containerID="15ccceaa5002f7c88549c7fabba52cc9a89e3ef2a9a0f0d670c55de44651b1cf" Feb 19 16:28:35 crc kubenswrapper[4810]: E0219 16:28:35.275552 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15ccceaa5002f7c88549c7fabba52cc9a89e3ef2a9a0f0d670c55de44651b1cf\": container with ID starting with 15ccceaa5002f7c88549c7fabba52cc9a89e3ef2a9a0f0d670c55de44651b1cf not found: ID does not exist" containerID="15ccceaa5002f7c88549c7fabba52cc9a89e3ef2a9a0f0d670c55de44651b1cf" Feb 19 16:28:35 crc kubenswrapper[4810]: I0219 16:28:35.275567 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15ccceaa5002f7c88549c7fabba52cc9a89e3ef2a9a0f0d670c55de44651b1cf"} err="failed to get container status \"15ccceaa5002f7c88549c7fabba52cc9a89e3ef2a9a0f0d670c55de44651b1cf\": rpc error: code = NotFound desc = could not find container \"15ccceaa5002f7c88549c7fabba52cc9a89e3ef2a9a0f0d670c55de44651b1cf\": container with ID starting with 15ccceaa5002f7c88549c7fabba52cc9a89e3ef2a9a0f0d670c55de44651b1cf not found: ID does not exist" Feb 19 16:28:35 crc kubenswrapper[4810]: I0219 16:28:35.275582 4810 scope.go:117] "RemoveContainer" containerID="1d64cbe5478a9ad4d9cd763d9f5b1175310a03b2d98326cddede9d7757669756" Feb 19 16:28:35 crc kubenswrapper[4810]: E0219 16:28:35.275806 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d64cbe5478a9ad4d9cd763d9f5b1175310a03b2d98326cddede9d7757669756\": container with ID starting with 1d64cbe5478a9ad4d9cd763d9f5b1175310a03b2d98326cddede9d7757669756 not found: ID does not exist" containerID="1d64cbe5478a9ad4d9cd763d9f5b1175310a03b2d98326cddede9d7757669756" Feb 19 16:28:35 crc kubenswrapper[4810]: I0219 16:28:35.275825 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d64cbe5478a9ad4d9cd763d9f5b1175310a03b2d98326cddede9d7757669756"} err="failed to get container status \"1d64cbe5478a9ad4d9cd763d9f5b1175310a03b2d98326cddede9d7757669756\": rpc error: code = NotFound desc = could not find container \"1d64cbe5478a9ad4d9cd763d9f5b1175310a03b2d98326cddede9d7757669756\": container with ID starting with 1d64cbe5478a9ad4d9cd763d9f5b1175310a03b2d98326cddede9d7757669756 not found: ID does not exist" Feb 19 16:28:35 crc kubenswrapper[4810]: I0219 16:28:35.451614 4810 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eeb4ac5e-9779-4e63-8008-83368b12aea5" path="/var/lib/kubelet/pods/eeb4ac5e-9779-4e63-8008-83368b12aea5/volumes" Feb 19 16:28:35 crc kubenswrapper[4810]: I0219 16:28:35.763387 4810 generic.go:334] "Generic (PLEG): container finished" podID="6dde1ea5-68be-4851-8816-3c7302dc2579" containerID="c60a4f5da78e4d16d81eacb1c4cfea0a5df0810a9ff302334c88f13711140033" exitCode=0 Feb 19 16:28:35 crc kubenswrapper[4810]: I0219 16:28:35.763508 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5n9gc" event={"ID":"6dde1ea5-68be-4851-8816-3c7302dc2579","Type":"ContainerDied","Data":"c60a4f5da78e4d16d81eacb1c4cfea0a5df0810a9ff302334c88f13711140033"} Feb 19 16:28:35 crc kubenswrapper[4810]: I0219 16:28:35.763570 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5n9gc" event={"ID":"6dde1ea5-68be-4851-8816-3c7302dc2579","Type":"ContainerStarted","Data":"280973c57a3f46eeb314d42cd9ad11d9e2b63939b685ed7e15c043b29db4262c"} Feb 19 16:28:42 crc kubenswrapper[4810]: I0219 16:28:42.834683 4810 generic.go:334] "Generic (PLEG): container finished" podID="6dde1ea5-68be-4851-8816-3c7302dc2579" containerID="8046aa230ca2b4fe6d7939a4777943e6f17a788dfab6e0b652084234b84eaf5e" exitCode=0 Feb 19 16:28:42 crc kubenswrapper[4810]: I0219 16:28:42.834955 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5n9gc" event={"ID":"6dde1ea5-68be-4851-8816-3c7302dc2579","Type":"ContainerDied","Data":"8046aa230ca2b4fe6d7939a4777943e6f17a788dfab6e0b652084234b84eaf5e"} Feb 19 16:28:43 crc kubenswrapper[4810]: I0219 16:28:43.049651 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-64pwn" podUID="9e626cbc-4140-4c84-8ecc-8b3315f6023d" containerName="registry-server" probeResult="failure" output=< Feb 19 16:28:43 crc kubenswrapper[4810]: timeout: failed to connect service ":50051" within 1s Feb 19 16:28:43 crc kubenswrapper[4810]: > Feb 19 16:28:43 crc kubenswrapper[4810]: I0219 16:28:43.854058 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5n9gc" event={"ID":"6dde1ea5-68be-4851-8816-3c7302dc2579","Type":"ContainerStarted","Data":"c46982b47a689551c84c8a94559da6c4aba508584e390cc906dbadf87e5b2444"} Feb 19 16:28:43 crc kubenswrapper[4810]: I0219 16:28:43.894828 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5n9gc" podStartSLOduration=3.4006189989999998 podStartE2EDuration="10.894810426s" podCreationTimestamp="2026-02-19 16:28:33 +0000 UTC" firstStartedPulling="2026-02-19 16:28:35.765308918 +0000 UTC m=+4745.247339082" lastFinishedPulling="2026-02-19 16:28:43.259500385 +0000 UTC m=+4752.741530509" observedRunningTime="2026-02-19 16:28:43.886549272 +0000 UTC m=+4753.368579396" watchObservedRunningTime="2026-02-19 16:28:43.894810426 +0000 UTC m=+4753.376840550" Feb 19 16:28:44 crc kubenswrapper[4810]: I0219 16:28:44.253913 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5n9gc" Feb 19 16:28:44 crc kubenswrapper[4810]: I0219 16:28:44.253952 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5n9gc" Feb 19 16:28:45 crc kubenswrapper[4810]: I0219 16:28:45.313984 4810 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/certified-operators-5n9gc" podUID="6dde1ea5-68be-4851-8816-3c7302dc2579" containerName="registry-server" probeResult="failure" output=< Feb 19 16:28:45 crc kubenswrapper[4810]: timeout: failed to connect service ":50051" within 1s Feb 19 16:28:45 crc kubenswrapper[4810]: > Feb 19 16:28:52 crc kubenswrapper[4810]: I0219 16:28:52.059107 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-64pwn" Feb 19 16:28:52 crc kubenswrapper[4810]: I0219 16:28:52.134000 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-64pwn" Feb 19 16:28:52 crc kubenswrapper[4810]: I0219 16:28:52.308110 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-64pwn"] Feb 19 16:28:53 crc kubenswrapper[4810]: I0219 16:28:53.979600 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-64pwn" podUID="9e626cbc-4140-4c84-8ecc-8b3315f6023d" containerName="registry-server" containerID="cri-o://81668e94dc20932a1366bc891e92da4a77667055ce44a21f7d966c87ad5627e7" gracePeriod=2 Feb 19 16:28:54 crc kubenswrapper[4810]: I0219 16:28:54.330146 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5n9gc" Feb 19 16:28:54 crc kubenswrapper[4810]: I0219 16:28:54.403521 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5n9gc" Feb 19 16:28:54 crc kubenswrapper[4810]: I0219 16:28:54.517454 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5n9gc"] Feb 19 16:28:54 crc kubenswrapper[4810]: I0219 16:28:54.557652 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-64pwn" Feb 19 16:28:54 crc kubenswrapper[4810]: I0219 16:28:54.631986 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e626cbc-4140-4c84-8ecc-8b3315f6023d-catalog-content\") pod \"9e626cbc-4140-4c84-8ecc-8b3315f6023d\" (UID: \"9e626cbc-4140-4c84-8ecc-8b3315f6023d\") " Feb 19 16:28:54 crc kubenswrapper[4810]: I0219 16:28:54.632288 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gnbp\" (UniqueName: \"kubernetes.io/projected/9e626cbc-4140-4c84-8ecc-8b3315f6023d-kube-api-access-6gnbp\") pod \"9e626cbc-4140-4c84-8ecc-8b3315f6023d\" (UID: \"9e626cbc-4140-4c84-8ecc-8b3315f6023d\") " Feb 19 16:28:54 crc kubenswrapper[4810]: I0219 16:28:54.632314 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e626cbc-4140-4c84-8ecc-8b3315f6023d-utilities\") pod \"9e626cbc-4140-4c84-8ecc-8b3315f6023d\" (UID: \"9e626cbc-4140-4c84-8ecc-8b3315f6023d\") " Feb 19 16:28:54 crc kubenswrapper[4810]: I0219 16:28:54.633045 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e626cbc-4140-4c84-8ecc-8b3315f6023d-utilities" (OuterVolumeSpecName: "utilities") pod "9e626cbc-4140-4c84-8ecc-8b3315f6023d" (UID: "9e626cbc-4140-4c84-8ecc-8b3315f6023d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:28:54 crc kubenswrapper[4810]: I0219 16:28:54.638207 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e626cbc-4140-4c84-8ecc-8b3315f6023d-kube-api-access-6gnbp" (OuterVolumeSpecName: "kube-api-access-6gnbp") pod "9e626cbc-4140-4c84-8ecc-8b3315f6023d" (UID: "9e626cbc-4140-4c84-8ecc-8b3315f6023d"). InnerVolumeSpecName "kube-api-access-6gnbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:28:54 crc kubenswrapper[4810]: I0219 16:28:54.703121 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-96zmk"] Feb 19 16:28:54 crc kubenswrapper[4810]: I0219 16:28:54.703678 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-96zmk" podUID="78aaed3c-dfb4-4332-bc63-4fc5342870ae" containerName="registry-server" containerID="cri-o://d3b7a9028d72c7eb4783b51f96495dc71d42e8907df523028cceebd689902934" gracePeriod=2 Feb 19 16:28:54 crc kubenswrapper[4810]: I0219 16:28:54.734412 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gnbp\" (UniqueName: \"kubernetes.io/projected/9e626cbc-4140-4c84-8ecc-8b3315f6023d-kube-api-access-6gnbp\") on node \"crc\" DevicePath \"\"" Feb 19 16:28:54 crc kubenswrapper[4810]: I0219 16:28:54.734440 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e626cbc-4140-4c84-8ecc-8b3315f6023d-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 16:28:54 crc kubenswrapper[4810]: I0219 16:28:54.813265 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e626cbc-4140-4c84-8ecc-8b3315f6023d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e626cbc-4140-4c84-8ecc-8b3315f6023d" (UID: "9e626cbc-4140-4c84-8ecc-8b3315f6023d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:28:54 crc kubenswrapper[4810]: I0219 16:28:54.836527 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e626cbc-4140-4c84-8ecc-8b3315f6023d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 16:28:54 crc kubenswrapper[4810]: I0219 16:28:54.991682 4810 generic.go:334] "Generic (PLEG): container finished" podID="78aaed3c-dfb4-4332-bc63-4fc5342870ae" containerID="d3b7a9028d72c7eb4783b51f96495dc71d42e8907df523028cceebd689902934" exitCode=0 Feb 19 16:28:54 crc kubenswrapper[4810]: I0219 16:28:54.991769 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96zmk" event={"ID":"78aaed3c-dfb4-4332-bc63-4fc5342870ae","Type":"ContainerDied","Data":"d3b7a9028d72c7eb4783b51f96495dc71d42e8907df523028cceebd689902934"} Feb 19 16:28:54 crc kubenswrapper[4810]: I0219 16:28:54.994305 4810 generic.go:334] "Generic (PLEG): container finished" podID="9e626cbc-4140-4c84-8ecc-8b3315f6023d" containerID="81668e94dc20932a1366bc891e92da4a77667055ce44a21f7d966c87ad5627e7" exitCode=0 Feb 19 16:28:54 crc kubenswrapper[4810]: I0219 16:28:54.994395 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-64pwn" Feb 19 16:28:54 crc kubenswrapper[4810]: I0219 16:28:54.994435 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64pwn" event={"ID":"9e626cbc-4140-4c84-8ecc-8b3315f6023d","Type":"ContainerDied","Data":"81668e94dc20932a1366bc891e92da4a77667055ce44a21f7d966c87ad5627e7"} Feb 19 16:28:54 crc kubenswrapper[4810]: I0219 16:28:54.994463 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64pwn" event={"ID":"9e626cbc-4140-4c84-8ecc-8b3315f6023d","Type":"ContainerDied","Data":"565952d67109a9fcad8e71d5e0aabde77b113979aa3c289f2b682f3ba5f316a6"} Feb 19 16:28:54 crc kubenswrapper[4810]: I0219 16:28:54.994485 4810 scope.go:117] "RemoveContainer" containerID="81668e94dc20932a1366bc891e92da4a77667055ce44a21f7d966c87ad5627e7" Feb 19 16:28:55 crc kubenswrapper[4810]: I0219 16:28:55.025521 4810 scope.go:117] "RemoveContainer" containerID="11aae31930dc1715ed8dc4d319b46478d41e580c6f80c8f65ea1a3331f2c975b" Feb 19 16:28:55 crc kubenswrapper[4810]: I0219 16:28:55.095156 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-64pwn"] Feb 19 16:28:55 crc kubenswrapper[4810]: I0219 16:28:55.105315 4810 scope.go:117] "RemoveContainer" containerID="ca5d92eef27bd3b021224c9164b7e74cb5a78a5abd43eb0d862a50d3ae8cd3f9" Feb 19 16:28:55 crc kubenswrapper[4810]: I0219 16:28:55.120190 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-64pwn"] Feb 19 16:28:55 crc kubenswrapper[4810]: I0219 16:28:55.228498 4810 scope.go:117] "RemoveContainer" containerID="81668e94dc20932a1366bc891e92da4a77667055ce44a21f7d966c87ad5627e7" Feb 19 16:28:55 crc kubenswrapper[4810]: E0219 16:28:55.230072 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81668e94dc20932a1366bc891e92da4a77667055ce44a21f7d966c87ad5627e7\": container with ID starting with 81668e94dc20932a1366bc891e92da4a77667055ce44a21f7d966c87ad5627e7 not found: ID does not exist" containerID="81668e94dc20932a1366bc891e92da4a77667055ce44a21f7d966c87ad5627e7" Feb 19 16:28:55 crc kubenswrapper[4810]: I0219 16:28:55.230118 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81668e94dc20932a1366bc891e92da4a77667055ce44a21f7d966c87ad5627e7"} err="failed to get container status \"81668e94dc20932a1366bc891e92da4a77667055ce44a21f7d966c87ad5627e7\": rpc error: code = NotFound desc = could not find container \"81668e94dc20932a1366bc891e92da4a77667055ce44a21f7d966c87ad5627e7\": container with ID starting with 81668e94dc20932a1366bc891e92da4a77667055ce44a21f7d966c87ad5627e7 not found: ID does not exist" Feb 19 16:28:55 crc kubenswrapper[4810]: I0219 16:28:55.230148 4810 scope.go:117] "RemoveContainer" containerID="11aae31930dc1715ed8dc4d319b46478d41e580c6f80c8f65ea1a3331f2c975b" Feb 19 16:28:55 crc kubenswrapper[4810]: E0219 16:28:55.230461 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11aae31930dc1715ed8dc4d319b46478d41e580c6f80c8f65ea1a3331f2c975b\": container with ID starting with 11aae31930dc1715ed8dc4d319b46478d41e580c6f80c8f65ea1a3331f2c975b not found: ID does not exist" containerID="11aae31930dc1715ed8dc4d319b46478d41e580c6f80c8f65ea1a3331f2c975b" Feb 19 16:28:55 crc kubenswrapper[4810]: I0219 16:28:55.230499 4810 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11aae31930dc1715ed8dc4d319b46478d41e580c6f80c8f65ea1a3331f2c975b"} err="failed to get container status \"11aae31930dc1715ed8dc4d319b46478d41e580c6f80c8f65ea1a3331f2c975b\": rpc error: code = NotFound desc = could not find container \"11aae31930dc1715ed8dc4d319b46478d41e580c6f80c8f65ea1a3331f2c975b\": container with ID starting with 11aae31930dc1715ed8dc4d319b46478d41e580c6f80c8f65ea1a3331f2c975b not found: ID does not exist" Feb 19 16:28:55 crc kubenswrapper[4810]: I0219 16:28:55.230530 4810 scope.go:117] "RemoveContainer" containerID="ca5d92eef27bd3b021224c9164b7e74cb5a78a5abd43eb0d862a50d3ae8cd3f9" Feb 19 16:28:55 crc kubenswrapper[4810]: E0219 16:28:55.233875 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca5d92eef27bd3b021224c9164b7e74cb5a78a5abd43eb0d862a50d3ae8cd3f9\": container with ID starting with ca5d92eef27bd3b021224c9164b7e74cb5a78a5abd43eb0d862a50d3ae8cd3f9 not found: ID does not exist" containerID="ca5d92eef27bd3b021224c9164b7e74cb5a78a5abd43eb0d862a50d3ae8cd3f9" Feb 19 16:28:55 crc kubenswrapper[4810]: I0219 16:28:55.233922 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca5d92eef27bd3b021224c9164b7e74cb5a78a5abd43eb0d862a50d3ae8cd3f9"} err="failed to get container status \"ca5d92eef27bd3b021224c9164b7e74cb5a78a5abd43eb0d862a50d3ae8cd3f9\": rpc error: code = NotFound desc = could not find container \"ca5d92eef27bd3b021224c9164b7e74cb5a78a5abd43eb0d862a50d3ae8cd3f9\": container with ID starting with ca5d92eef27bd3b021224c9164b7e74cb5a78a5abd43eb0d862a50d3ae8cd3f9 not found: ID does not exist" Feb 19 16:28:55 crc kubenswrapper[4810]: I0219 16:28:55.263812 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-96zmk" Feb 19 16:28:55 crc kubenswrapper[4810]: I0219 16:28:55.358435 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78aaed3c-dfb4-4332-bc63-4fc5342870ae-catalog-content\") pod \"78aaed3c-dfb4-4332-bc63-4fc5342870ae\" (UID: \"78aaed3c-dfb4-4332-bc63-4fc5342870ae\") " Feb 19 16:28:55 crc kubenswrapper[4810]: I0219 16:28:55.358476 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78aaed3c-dfb4-4332-bc63-4fc5342870ae-utilities\") pod \"78aaed3c-dfb4-4332-bc63-4fc5342870ae\" (UID: \"78aaed3c-dfb4-4332-bc63-4fc5342870ae\") " Feb 19 16:28:55 crc kubenswrapper[4810]: I0219 16:28:55.358595 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7wtj\" (UniqueName: \"kubernetes.io/projected/78aaed3c-dfb4-4332-bc63-4fc5342870ae-kube-api-access-z7wtj\") pod \"78aaed3c-dfb4-4332-bc63-4fc5342870ae\" (UID: \"78aaed3c-dfb4-4332-bc63-4fc5342870ae\") " Feb 19 16:28:55 crc kubenswrapper[4810]: I0219 16:28:55.360298 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78aaed3c-dfb4-4332-bc63-4fc5342870ae-utilities" (OuterVolumeSpecName: "utilities") pod "78aaed3c-dfb4-4332-bc63-4fc5342870ae" (UID: "78aaed3c-dfb4-4332-bc63-4fc5342870ae"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:28:55 crc kubenswrapper[4810]: I0219 16:28:55.382445 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78aaed3c-dfb4-4332-bc63-4fc5342870ae-kube-api-access-z7wtj" (OuterVolumeSpecName: "kube-api-access-z7wtj") pod "78aaed3c-dfb4-4332-bc63-4fc5342870ae" (UID: "78aaed3c-dfb4-4332-bc63-4fc5342870ae"). InnerVolumeSpecName "kube-api-access-z7wtj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:28:55 crc kubenswrapper[4810]: I0219 16:28:55.408998 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78aaed3c-dfb4-4332-bc63-4fc5342870ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78aaed3c-dfb4-4332-bc63-4fc5342870ae" (UID: "78aaed3c-dfb4-4332-bc63-4fc5342870ae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:28:55 crc kubenswrapper[4810]: I0219 16:28:55.452090 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e626cbc-4140-4c84-8ecc-8b3315f6023d" path="/var/lib/kubelet/pods/9e626cbc-4140-4c84-8ecc-8b3315f6023d/volumes" Feb 19 16:28:55 crc kubenswrapper[4810]: I0219 16:28:55.460997 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7wtj\" (UniqueName: \"kubernetes.io/projected/78aaed3c-dfb4-4332-bc63-4fc5342870ae-kube-api-access-z7wtj\") on node \"crc\" DevicePath \"\"" Feb 19 16:28:55 crc kubenswrapper[4810]: I0219 16:28:55.461031 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78aaed3c-dfb4-4332-bc63-4fc5342870ae-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 16:28:55 crc kubenswrapper[4810]: I0219 16:28:55.461039 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78aaed3c-dfb4-4332-bc63-4fc5342870ae-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 16:28:56 crc kubenswrapper[4810]: I0219 16:28:56.005881 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96zmk" event={"ID":"78aaed3c-dfb4-4332-bc63-4fc5342870ae","Type":"ContainerDied","Data":"dc3bb4ddeed72ae65273594c85ea1bf8c3ee8a4155cd199ebba0a1a239217684"} Feb 19 16:28:56 crc kubenswrapper[4810]: I0219 16:28:56.006229 4810 scope.go:117] "RemoveContainer" containerID="d3b7a9028d72c7eb4783b51f96495dc71d42e8907df523028cceebd689902934" Feb 19 16:28:56 crc kubenswrapper[4810]: I0219 16:28:56.005910 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-96zmk" Feb 19 16:28:56 crc kubenswrapper[4810]: I0219 16:28:56.037462 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-96zmk"] Feb 19 16:28:56 crc kubenswrapper[4810]: I0219 16:28:56.048141 4810 scope.go:117] "RemoveContainer" containerID="6b9165a9af4fc64c1f899ec8f99221cc29231316d69d504fd16ef6a4ce57525c" Feb 19 16:28:56 crc kubenswrapper[4810]: I0219 16:28:56.051723 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-96zmk"] Feb 19 16:28:56 crc kubenswrapper[4810]: I0219 16:28:56.075529 4810 scope.go:117] "RemoveContainer" containerID="389e0276f99c2489270efb421c00ea721017dbf02f8163a0b8723a2949f2384c" Feb 19 16:28:57 crc kubenswrapper[4810]: I0219 16:28:57.450063 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78aaed3c-dfb4-4332-bc63-4fc5342870ae" path="/var/lib/kubelet/pods/78aaed3c-dfb4-4332-bc63-4fc5342870ae/volumes" Feb 19 16:29:01 crc kubenswrapper[4810]: I0219 16:29:01.119123 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rgqm9"] Feb 19 16:29:01 crc kubenswrapper[4810]: E0219 16:29:01.119972 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78aaed3c-dfb4-4332-bc63-4fc5342870ae" containerName="extract-utilities" Feb 19 16:29:01 crc kubenswrapper[4810]: I0219 16:29:01.119985 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="78aaed3c-dfb4-4332-bc63-4fc5342870ae" containerName="extract-utilities" Feb 19 16:29:01 crc kubenswrapper[4810]: E0219 16:29:01.119999 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e626cbc-4140-4c84-8ecc-8b3315f6023d" containerName="registry-server" Feb 19 16:29:01 crc kubenswrapper[4810]: I0219 16:29:01.120005 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e626cbc-4140-4c84-8ecc-8b3315f6023d" containerName="registry-server" Feb 19 16:29:01 crc kubenswrapper[4810]: E0219 16:29:01.120014 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeb4ac5e-9779-4e63-8008-83368b12aea5" containerName="registry-server" Feb 19 16:29:01 crc kubenswrapper[4810]: I0219 16:29:01.120020 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeb4ac5e-9779-4e63-8008-83368b12aea5" containerName="registry-server" Feb 19 16:29:01 crc kubenswrapper[4810]: E0219 16:29:01.120032 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e626cbc-4140-4c84-8ecc-8b3315f6023d" containerName="extract-utilities" Feb 19 16:29:01 crc kubenswrapper[4810]: I0219 16:29:01.120038 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e626cbc-4140-4c84-8ecc-8b3315f6023d" containerName="extract-utilities" Feb 19 16:29:01 crc kubenswrapper[4810]: E0219 16:29:01.120047 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeb4ac5e-9779-4e63-8008-83368b12aea5" containerName="extract-content" Feb 19 16:29:01 crc kubenswrapper[4810]: I0219 16:29:01.120053 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeb4ac5e-9779-4e63-8008-83368b12aea5" containerName="extract-content" Feb 19 16:29:01 crc kubenswrapper[4810]: E0219 16:29:01.120072 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeb4ac5e-9779-4e63-8008-83368b12aea5" containerName="extract-utilities" Feb 19 16:29:01 crc kubenswrapper[4810]: I0219 16:29:01.120077 4810 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="eeb4ac5e-9779-4e63-8008-83368b12aea5" containerName="extract-utilities" Feb 19 16:29:01 crc kubenswrapper[4810]: E0219 16:29:01.120093 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78aaed3c-dfb4-4332-bc63-4fc5342870ae" containerName="registry-server" Feb 19 16:29:01 crc kubenswrapper[4810]: I0219 16:29:01.120098 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="78aaed3c-dfb4-4332-bc63-4fc5342870ae" containerName="registry-server" Feb 19 16:29:01 crc kubenswrapper[4810]: E0219 16:29:01.120110 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e626cbc-4140-4c84-8ecc-8b3315f6023d" containerName="extract-content" Feb 19 16:29:01 crc kubenswrapper[4810]: I0219 16:29:01.120115 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e626cbc-4140-4c84-8ecc-8b3315f6023d" containerName="extract-content" Feb 19 16:29:01 crc kubenswrapper[4810]: E0219 16:29:01.120129 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78aaed3c-dfb4-4332-bc63-4fc5342870ae" containerName="extract-content" Feb 19 16:29:01 crc kubenswrapper[4810]: I0219 16:29:01.120134 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="78aaed3c-dfb4-4332-bc63-4fc5342870ae" containerName="extract-content" Feb 19 16:29:01 crc kubenswrapper[4810]: I0219 16:29:01.120317 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeb4ac5e-9779-4e63-8008-83368b12aea5" containerName="registry-server" Feb 19 16:29:01 crc kubenswrapper[4810]: I0219 16:29:01.120349 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="78aaed3c-dfb4-4332-bc63-4fc5342870ae" containerName="registry-server" Feb 19 16:29:01 crc kubenswrapper[4810]: I0219 16:29:01.120360 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e626cbc-4140-4c84-8ecc-8b3315f6023d" containerName="registry-server" Feb 19 16:29:01 crc kubenswrapper[4810]: I0219 16:29:01.121720 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rgqm9" Feb 19 16:29:01 crc kubenswrapper[4810]: I0219 16:29:01.140044 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rgqm9"] Feb 19 16:29:01 crc kubenswrapper[4810]: I0219 16:29:01.286306 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a673077a-0e58-4ad8-a5db-f6eeb283be61-utilities\") pod \"redhat-marketplace-rgqm9\" (UID: \"a673077a-0e58-4ad8-a5db-f6eeb283be61\") " pod="openshift-marketplace/redhat-marketplace-rgqm9" Feb 19 16:29:01 crc kubenswrapper[4810]: I0219 16:29:01.286438 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xpm7\" (UniqueName: \"kubernetes.io/projected/a673077a-0e58-4ad8-a5db-f6eeb283be61-kube-api-access-4xpm7\") pod \"redhat-marketplace-rgqm9\" (UID: \"a673077a-0e58-4ad8-a5db-f6eeb283be61\") " pod="openshift-marketplace/redhat-marketplace-rgqm9" Feb 19 16:29:01 crc kubenswrapper[4810]: I0219 16:29:01.286488 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a673077a-0e58-4ad8-a5db-f6eeb283be61-catalog-content\") pod \"redhat-marketplace-rgqm9\" (UID: \"a673077a-0e58-4ad8-a5db-f6eeb283be61\") " pod="openshift-marketplace/redhat-marketplace-rgqm9" Feb 19 16:29:01 crc kubenswrapper[4810]: I0219 16:29:01.388698 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a673077a-0e58-4ad8-a5db-f6eeb283be61-utilities\") pod \"redhat-marketplace-rgqm9\" (UID: \"a673077a-0e58-4ad8-a5db-f6eeb283be61\") " pod="openshift-marketplace/redhat-marketplace-rgqm9" Feb 19 16:29:01 crc kubenswrapper[4810]: I0219 16:29:01.388817 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xpm7\" (UniqueName: \"kubernetes.io/projected/a673077a-0e58-4ad8-a5db-f6eeb283be61-kube-api-access-4xpm7\") pod \"redhat-marketplace-rgqm9\" (UID: \"a673077a-0e58-4ad8-a5db-f6eeb283be61\") " pod="openshift-marketplace/redhat-marketplace-rgqm9" Feb 19 16:29:01 crc kubenswrapper[4810]: I0219 16:29:01.388861 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a673077a-0e58-4ad8-a5db-f6eeb283be61-catalog-content\") pod \"redhat-marketplace-rgqm9\" (UID: \"a673077a-0e58-4ad8-a5db-f6eeb283be61\") " pod="openshift-marketplace/redhat-marketplace-rgqm9" Feb 19 16:29:01 crc kubenswrapper[4810]: I0219 16:29:01.389521 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a673077a-0e58-4ad8-a5db-f6eeb283be61-catalog-content\") pod \"redhat-marketplace-rgqm9\" (UID: \"a673077a-0e58-4ad8-a5db-f6eeb283be61\") " pod="openshift-marketplace/redhat-marketplace-rgqm9" Feb 19 16:29:01 crc kubenswrapper[4810]: I0219 16:29:01.389530 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a673077a-0e58-4ad8-a5db-f6eeb283be61-utilities\") pod \"redhat-marketplace-rgqm9\" (UID: \"a673077a-0e58-4ad8-a5db-f6eeb283be61\") " pod="openshift-marketplace/redhat-marketplace-rgqm9" Feb 19 16:29:01 crc kubenswrapper[4810]: I0219 16:29:01.427364 4810 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-4xpm7\" (UniqueName: \"kubernetes.io/projected/a673077a-0e58-4ad8-a5db-f6eeb283be61-kube-api-access-4xpm7\") pod \"redhat-marketplace-rgqm9\" (UID: \"a673077a-0e58-4ad8-a5db-f6eeb283be61\") " pod="openshift-marketplace/redhat-marketplace-rgqm9" Feb 19 16:29:01 crc kubenswrapper[4810]: I0219 16:29:01.446562 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rgqm9" Feb 19 16:29:01 crc kubenswrapper[4810]: I0219 16:29:01.916512 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rgqm9"] Feb 19 16:29:02 crc kubenswrapper[4810]: I0219 16:29:02.070493 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgqm9" event={"ID":"a673077a-0e58-4ad8-a5db-f6eeb283be61","Type":"ContainerStarted","Data":"9bf23c8154e85ae548ff97afcec39329d16c32d382c90a3d6ac8b672224668f7"} Feb 19 16:29:03 crc kubenswrapper[4810]: I0219 16:29:03.084286 4810 generic.go:334] "Generic (PLEG): container finished" podID="a673077a-0e58-4ad8-a5db-f6eeb283be61" containerID="ae4c13191ff8d66385f46593ebb524ac0d7b604bcdad265511c266dac42d3ed2" exitCode=0 Feb 19 16:29:03 crc kubenswrapper[4810]: I0219 16:29:03.084372 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgqm9" event={"ID":"a673077a-0e58-4ad8-a5db-f6eeb283be61","Type":"ContainerDied","Data":"ae4c13191ff8d66385f46593ebb524ac0d7b604bcdad265511c266dac42d3ed2"} Feb 19 16:29:04 crc kubenswrapper[4810]: I0219 16:29:04.096395 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgqm9" event={"ID":"a673077a-0e58-4ad8-a5db-f6eeb283be61","Type":"ContainerStarted","Data":"710dcf4fbde260c53ab3771ea6e3f9c3d8f23b24754ad03ee8737a762d01c1ca"} Feb 19 16:29:05 crc kubenswrapper[4810]: I0219 16:29:05.112646 4810 generic.go:334] "Generic (PLEG): container finished" podID="a673077a-0e58-4ad8-a5db-f6eeb283be61" containerID="710dcf4fbde260c53ab3771ea6e3f9c3d8f23b24754ad03ee8737a762d01c1ca" exitCode=0 Feb 19 16:29:05 crc kubenswrapper[4810]: I0219 16:29:05.112711 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgqm9" event={"ID":"a673077a-0e58-4ad8-a5db-f6eeb283be61","Type":"ContainerDied","Data":"710dcf4fbde260c53ab3771ea6e3f9c3d8f23b24754ad03ee8737a762d01c1ca"} Feb 19 16:29:06 crc kubenswrapper[4810]: I0219 16:29:06.124117 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgqm9" event={"ID":"a673077a-0e58-4ad8-a5db-f6eeb283be61","Type":"ContainerStarted","Data":"66857f6c81a47c65bb164b0a03fa49e824fc23f4623166e8ae894a6257ce21a9"} Feb 19 16:29:06 crc kubenswrapper[4810]: I0219 16:29:06.150946 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rgqm9" podStartSLOduration=2.486515931 podStartE2EDuration="5.150927344s" podCreationTimestamp="2026-02-19 16:29:01 +0000 UTC" firstStartedPulling="2026-02-19 16:29:03.08721296 +0000 UTC m=+4772.569243124" lastFinishedPulling="2026-02-19 16:29:05.751624373 +0000 UTC m=+4775.233654537" observedRunningTime="2026-02-19 16:29:06.141384927 +0000 UTC m=+4775.623415071" watchObservedRunningTime="2026-02-19 16:29:06.150927344 +0000 UTC m=+4775.632957468" Feb 19 16:29:11 crc kubenswrapper[4810]: I0219 16:29:11.465662 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-rgqm9" Feb 19 16:29:11 crc kubenswrapper[4810]: I0219 16:29:11.466304 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rgqm9" Feb 19 16:29:11 crc kubenswrapper[4810]: I0219 16:29:11.532079 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rgqm9" Feb 19 16:29:12 crc kubenswrapper[4810]: I0219 16:29:12.274231 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rgqm9" Feb 19 16:29:12 crc kubenswrapper[4810]: I0219 16:29:12.356604 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rgqm9"] Feb 19 16:29:14 crc kubenswrapper[4810]: I0219 16:29:14.215054 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rgqm9" podUID="a673077a-0e58-4ad8-a5db-f6eeb283be61" containerName="registry-server" containerID="cri-o://66857f6c81a47c65bb164b0a03fa49e824fc23f4623166e8ae894a6257ce21a9" gracePeriod=2 Feb 19 16:29:14 crc kubenswrapper[4810]: I0219 16:29:14.771979 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rgqm9" Feb 19 16:29:14 crc kubenswrapper[4810]: I0219 16:29:14.886355 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a673077a-0e58-4ad8-a5db-f6eeb283be61-catalog-content\") pod \"a673077a-0e58-4ad8-a5db-f6eeb283be61\" (UID: \"a673077a-0e58-4ad8-a5db-f6eeb283be61\") " Feb 19 16:29:14 crc kubenswrapper[4810]: I0219 16:29:14.886799 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xpm7\" (UniqueName: \"kubernetes.io/projected/a673077a-0e58-4ad8-a5db-f6eeb283be61-kube-api-access-4xpm7\") pod \"a673077a-0e58-4ad8-a5db-f6eeb283be61\" (UID: \"a673077a-0e58-4ad8-a5db-f6eeb283be61\") " Feb 19 16:29:14 crc kubenswrapper[4810]: I0219 16:29:14.886989 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a673077a-0e58-4ad8-a5db-f6eeb283be61-utilities\") pod \"a673077a-0e58-4ad8-a5db-f6eeb283be61\" (UID: \"a673077a-0e58-4ad8-a5db-f6eeb283be61\") " Feb 19 16:29:14 crc kubenswrapper[4810]: I0219 16:29:14.887738 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a673077a-0e58-4ad8-a5db-f6eeb283be61-utilities" (OuterVolumeSpecName: "utilities") pod "a673077a-0e58-4ad8-a5db-f6eeb283be61" (UID: "a673077a-0e58-4ad8-a5db-f6eeb283be61"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:29:14 crc kubenswrapper[4810]: I0219 16:29:14.888061 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a673077a-0e58-4ad8-a5db-f6eeb283be61-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 16:29:14 crc kubenswrapper[4810]: I0219 16:29:14.893276 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a673077a-0e58-4ad8-a5db-f6eeb283be61-kube-api-access-4xpm7" (OuterVolumeSpecName: "kube-api-access-4xpm7") pod "a673077a-0e58-4ad8-a5db-f6eeb283be61" (UID: "a673077a-0e58-4ad8-a5db-f6eeb283be61"). InnerVolumeSpecName "kube-api-access-4xpm7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:29:14 crc kubenswrapper[4810]: I0219 16:29:14.908064 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a673077a-0e58-4ad8-a5db-f6eeb283be61-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a673077a-0e58-4ad8-a5db-f6eeb283be61" (UID: "a673077a-0e58-4ad8-a5db-f6eeb283be61"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:29:14 crc kubenswrapper[4810]: I0219 16:29:14.990406 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a673077a-0e58-4ad8-a5db-f6eeb283be61-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 16:29:14 crc kubenswrapper[4810]: I0219 16:29:14.990437 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xpm7\" (UniqueName: \"kubernetes.io/projected/a673077a-0e58-4ad8-a5db-f6eeb283be61-kube-api-access-4xpm7\") on node \"crc\" DevicePath \"\"" Feb 19 16:29:15 crc kubenswrapper[4810]: I0219 16:29:15.231599 4810 generic.go:334] "Generic (PLEG): container finished" podID="a673077a-0e58-4ad8-a5db-f6eeb283be61" containerID="66857f6c81a47c65bb164b0a03fa49e824fc23f4623166e8ae894a6257ce21a9" exitCode=0 Feb 19 16:29:15 crc kubenswrapper[4810]: I0219 16:29:15.231684 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rgqm9" Feb 19 16:29:15 crc kubenswrapper[4810]: I0219 16:29:15.231728 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgqm9" event={"ID":"a673077a-0e58-4ad8-a5db-f6eeb283be61","Type":"ContainerDied","Data":"66857f6c81a47c65bb164b0a03fa49e824fc23f4623166e8ae894a6257ce21a9"} Feb 19 16:29:15 crc kubenswrapper[4810]: I0219 16:29:15.232107 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgqm9" event={"ID":"a673077a-0e58-4ad8-a5db-f6eeb283be61","Type":"ContainerDied","Data":"9bf23c8154e85ae548ff97afcec39329d16c32d382c90a3d6ac8b672224668f7"} Feb 19 16:29:15 crc kubenswrapper[4810]: I0219 16:29:15.232136 4810 scope.go:117] "RemoveContainer" containerID="66857f6c81a47c65bb164b0a03fa49e824fc23f4623166e8ae894a6257ce21a9" Feb 19 16:29:15 crc kubenswrapper[4810]: I0219 16:29:15.270116 4810 scope.go:117] "RemoveContainer" containerID="710dcf4fbde260c53ab3771ea6e3f9c3d8f23b24754ad03ee8737a762d01c1ca" Feb 19 16:29:15 crc kubenswrapper[4810]: I0219 16:29:15.279420 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rgqm9"] Feb 19 16:29:15 crc kubenswrapper[4810]: I0219 16:29:15.290622 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rgqm9"] Feb 19 16:29:15 crc kubenswrapper[4810]: I0219 16:29:15.301622 4810 scope.go:117] "RemoveContainer" containerID="ae4c13191ff8d66385f46593ebb524ac0d7b604bcdad265511c266dac42d3ed2" Feb 19 16:29:15 crc kubenswrapper[4810]: I0219 16:29:15.374460 4810 scope.go:117] "RemoveContainer" containerID="66857f6c81a47c65bb164b0a03fa49e824fc23f4623166e8ae894a6257ce21a9" Feb 19 16:29:15 crc kubenswrapper[4810]: E0219 16:29:15.374910 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66857f6c81a47c65bb164b0a03fa49e824fc23f4623166e8ae894a6257ce21a9\": container with ID starting with 
66857f6c81a47c65bb164b0a03fa49e824fc23f4623166e8ae894a6257ce21a9 not found: ID does not exist" containerID="66857f6c81a47c65bb164b0a03fa49e824fc23f4623166e8ae894a6257ce21a9" Feb 19 16:29:15 crc kubenswrapper[4810]: I0219 16:29:15.374951 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66857f6c81a47c65bb164b0a03fa49e824fc23f4623166e8ae894a6257ce21a9"} err="failed to get container status \"66857f6c81a47c65bb164b0a03fa49e824fc23f4623166e8ae894a6257ce21a9\": rpc error: code = NotFound desc = could not find container \"66857f6c81a47c65bb164b0a03fa49e824fc23f4623166e8ae894a6257ce21a9\": container with ID starting with 66857f6c81a47c65bb164b0a03fa49e824fc23f4623166e8ae894a6257ce21a9 not found: ID does not exist" Feb 19 16:29:15 crc kubenswrapper[4810]: I0219 16:29:15.374982 4810 scope.go:117] "RemoveContainer" containerID="710dcf4fbde260c53ab3771ea6e3f9c3d8f23b24754ad03ee8737a762d01c1ca" Feb 19 16:29:15 crc kubenswrapper[4810]: E0219 16:29:15.375382 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"710dcf4fbde260c53ab3771ea6e3f9c3d8f23b24754ad03ee8737a762d01c1ca\": container with ID starting with 710dcf4fbde260c53ab3771ea6e3f9c3d8f23b24754ad03ee8737a762d01c1ca not found: ID does not exist" containerID="710dcf4fbde260c53ab3771ea6e3f9c3d8f23b24754ad03ee8737a762d01c1ca" Feb 19 16:29:15 crc kubenswrapper[4810]: I0219 16:29:15.375423 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"710dcf4fbde260c53ab3771ea6e3f9c3d8f23b24754ad03ee8737a762d01c1ca"} err="failed to get container status \"710dcf4fbde260c53ab3771ea6e3f9c3d8f23b24754ad03ee8737a762d01c1ca\": rpc error: code = NotFound desc = could not find container \"710dcf4fbde260c53ab3771ea6e3f9c3d8f23b24754ad03ee8737a762d01c1ca\": container with ID starting with 710dcf4fbde260c53ab3771ea6e3f9c3d8f23b24754ad03ee8737a762d01c1ca not found: ID does not exist" Feb 19 16:29:15 crc kubenswrapper[4810]: I0219 16:29:15.375449 4810 scope.go:117] "RemoveContainer" containerID="ae4c13191ff8d66385f46593ebb524ac0d7b604bcdad265511c266dac42d3ed2" Feb 19 16:29:15 crc kubenswrapper[4810]: E0219 16:29:15.375749 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae4c13191ff8d66385f46593ebb524ac0d7b604bcdad265511c266dac42d3ed2\": container with ID starting with ae4c13191ff8d66385f46593ebb524ac0d7b604bcdad265511c266dac42d3ed2 not found: ID does not exist" containerID="ae4c13191ff8d66385f46593ebb524ac0d7b604bcdad265511c266dac42d3ed2" Feb 19 16:29:15 crc kubenswrapper[4810]: I0219 16:29:15.375774 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae4c13191ff8d66385f46593ebb524ac0d7b604bcdad265511c266dac42d3ed2"} err="failed to get container status \"ae4c13191ff8d66385f46593ebb524ac0d7b604bcdad265511c266dac42d3ed2\": rpc error: code = NotFound desc = could not find container \"ae4c13191ff8d66385f46593ebb524ac0d7b604bcdad265511c266dac42d3ed2\": container with ID starting with ae4c13191ff8d66385f46593ebb524ac0d7b604bcdad265511c266dac42d3ed2 not found: ID does not exist" Feb 19 16:29:15 crc kubenswrapper[4810]: I0219 16:29:15.457165 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a673077a-0e58-4ad8-a5db-f6eeb283be61" path="/var/lib/kubelet/pods/a673077a-0e58-4ad8-a5db-f6eeb283be61/volumes" Feb 19 16:29:19 crc kubenswrapper[4810]: I0219 16:29:19.537913 
4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 16:29:19 crc kubenswrapper[4810]: I0219 16:29:19.538501 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 16:29:49 crc kubenswrapper[4810]: I0219 16:29:49.537153 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 16:29:49 crc kubenswrapper[4810]: I0219 16:29:49.537728 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 16:30:00 crc kubenswrapper[4810]: I0219 16:30:00.163179 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525310-6p748"] Feb 19 16:30:00 crc kubenswrapper[4810]: E0219 16:30:00.164367 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a673077a-0e58-4ad8-a5db-f6eeb283be61" containerName="registry-server" Feb 19 16:30:00 crc kubenswrapper[4810]: I0219 16:30:00.164389 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a673077a-0e58-4ad8-a5db-f6eeb283be61" containerName="registry-server" Feb 19 16:30:00 crc kubenswrapper[4810]: E0219 16:30:00.164438 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a673077a-0e58-4ad8-a5db-f6eeb283be61" containerName="extract-utilities" Feb 19 16:30:00 crc kubenswrapper[4810]: I0219 16:30:00.164446 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a673077a-0e58-4ad8-a5db-f6eeb283be61" containerName="extract-utilities" Feb 19 16:30:00 crc kubenswrapper[4810]: E0219 16:30:00.164463 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a673077a-0e58-4ad8-a5db-f6eeb283be61" containerName="extract-content" Feb 19 16:30:00 crc kubenswrapper[4810]: I0219 16:30:00.164471 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a673077a-0e58-4ad8-a5db-f6eeb283be61" containerName="extract-content" Feb 19 16:30:00 crc kubenswrapper[4810]: I0219 16:30:00.164703 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="a673077a-0e58-4ad8-a5db-f6eeb283be61" containerName="registry-server" Feb 19 16:30:00 crc kubenswrapper[4810]: I0219 16:30:00.165688 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525310-6p748" Feb 19 16:30:00 crc kubenswrapper[4810]: I0219 16:30:00.168668 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 16:30:00 crc kubenswrapper[4810]: I0219 16:30:00.169472 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 16:30:00 crc kubenswrapper[4810]: I0219 16:30:00.183362 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525310-6p748"] Feb 19 16:30:00 crc kubenswrapper[4810]: I0219 16:30:00.196909 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a-secret-volume\") pod \"collect-profiles-29525310-6p748\" (UID: \"c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525310-6p748" Feb 19 16:30:00 crc kubenswrapper[4810]: I0219 16:30:00.197305 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2srws\" (UniqueName: \"kubernetes.io/projected/c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a-kube-api-access-2srws\") pod \"collect-profiles-29525310-6p748\" (UID: \"c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525310-6p748" Feb 19 16:30:00 crc kubenswrapper[4810]: I0219 16:30:00.197368 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a-config-volume\") pod \"collect-profiles-29525310-6p748\" (UID: \"c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525310-6p748" Feb 19 16:30:00 crc kubenswrapper[4810]: I0219 16:30:00.299141 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2srws\" (UniqueName: \"kubernetes.io/projected/c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a-kube-api-access-2srws\") pod \"collect-profiles-29525310-6p748\" (UID: \"c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525310-6p748" Feb 19 16:30:00 crc kubenswrapper[4810]: I0219 16:30:00.299202 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a-config-volume\") pod \"collect-profiles-29525310-6p748\" (UID: \"c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525310-6p748" Feb 19 16:30:00 crc kubenswrapper[4810]: I0219 16:30:00.299485 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a-secret-volume\") pod \"collect-profiles-29525310-6p748\" (UID: \"c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525310-6p748" Feb 19 16:30:00 crc kubenswrapper[4810]: I0219 16:30:00.300316 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a-config-volume\") pod 
\"collect-profiles-29525310-6p748\" (UID: \"c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525310-6p748" Feb 19 16:30:00 crc kubenswrapper[4810]: I0219 16:30:00.307018 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a-secret-volume\") pod \"collect-profiles-29525310-6p748\" (UID: \"c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525310-6p748" Feb 19 16:30:00 crc kubenswrapper[4810]: I0219 16:30:00.334770 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2srws\" (UniqueName: \"kubernetes.io/projected/c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a-kube-api-access-2srws\") pod \"collect-profiles-29525310-6p748\" (UID: \"c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525310-6p748" Feb 19 16:30:00 crc kubenswrapper[4810]: I0219 16:30:00.498137 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525310-6p748" Feb 19 16:30:01 crc kubenswrapper[4810]: I0219 16:30:01.022486 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525310-6p748"] Feb 19 16:30:01 crc kubenswrapper[4810]: I0219 16:30:01.820297 4810 generic.go:334] "Generic (PLEG): container finished" podID="c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a" containerID="bd22ffcc93ad514f9c9f0fd30ea32bc13e72bf86f993fc9a5d41d80446f59f0f" exitCode=0 Feb 19 16:30:01 crc kubenswrapper[4810]: I0219 16:30:01.820840 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525310-6p748" event={"ID":"c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a","Type":"ContainerDied","Data":"bd22ffcc93ad514f9c9f0fd30ea32bc13e72bf86f993fc9a5d41d80446f59f0f"} Feb 19 16:30:01 crc kubenswrapper[4810]: I0219 16:30:01.821071 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525310-6p748" event={"ID":"c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a","Type":"ContainerStarted","Data":"d65d3016f5d8bd6cf29a5305a05195f6a5475db2ee983b718f17ecf85f25023c"} Feb 19 16:30:03 crc kubenswrapper[4810]: I0219 16:30:03.308098 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525310-6p748" Feb 19 16:30:03 crc kubenswrapper[4810]: I0219 16:30:03.380003 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a-secret-volume\") pod \"c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a\" (UID: \"c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a\") " Feb 19 16:30:03 crc kubenswrapper[4810]: I0219 16:30:03.380218 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a-config-volume\") pod \"c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a\" (UID: \"c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a\") " Feb 19 16:30:03 crc kubenswrapper[4810]: I0219 16:30:03.380381 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2srws\" (UniqueName: \"kubernetes.io/projected/c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a-kube-api-access-2srws\") pod \"c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a\" (UID: \"c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a\") " Feb 19 16:30:03 crc kubenswrapper[4810]: I0219 16:30:03.381438 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a-config-volume" (OuterVolumeSpecName: "config-volume") pod "c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a" (UID: "c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 16:30:03 crc kubenswrapper[4810]: I0219 16:30:03.390731 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a" (UID: "c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 16:30:03 crc kubenswrapper[4810]: I0219 16:30:03.391723 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a-kube-api-access-2srws" (OuterVolumeSpecName: "kube-api-access-2srws") pod "c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a" (UID: "c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a"). InnerVolumeSpecName "kube-api-access-2srws". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:30:03 crc kubenswrapper[4810]: I0219 16:30:03.484160 4810 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 16:30:03 crc kubenswrapper[4810]: I0219 16:30:03.484565 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2srws\" (UniqueName: \"kubernetes.io/projected/c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a-kube-api-access-2srws\") on node \"crc\" DevicePath \"\"" Feb 19 16:30:03 crc kubenswrapper[4810]: I0219 16:30:03.484708 4810 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 16:30:03 crc kubenswrapper[4810]: I0219 16:30:03.846722 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525310-6p748" event={"ID":"c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a","Type":"ContainerDied","Data":"d65d3016f5d8bd6cf29a5305a05195f6a5475db2ee983b718f17ecf85f25023c"} Feb 19 16:30:03 crc kubenswrapper[4810]: I0219 16:30:03.847062 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d65d3016f5d8bd6cf29a5305a05195f6a5475db2ee983b718f17ecf85f25023c" Feb 19 16:30:03 crc kubenswrapper[4810]: I0219 16:30:03.847132 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525310-6p748" Feb 19 16:30:04 crc kubenswrapper[4810]: I0219 16:30:04.413385 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525265-dnbnb"] Feb 19 16:30:04 crc kubenswrapper[4810]: I0219 16:30:04.423576 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525265-dnbnb"] Feb 19 16:30:05 crc kubenswrapper[4810]: I0219 16:30:05.458718 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ebe856b-d546-48e1-862d-d9f039620b73" path="/var/lib/kubelet/pods/9ebe856b-d546-48e1-862d-d9f039620b73/volumes" Feb 19 16:30:19 crc kubenswrapper[4810]: I0219 16:30:19.537841 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 16:30:19 crc kubenswrapper[4810]: I0219 16:30:19.538792 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 16:30:19 crc kubenswrapper[4810]: I0219 16:30:19.538882 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 16:30:19 crc kubenswrapper[4810]: I0219 16:30:19.540375 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"61e1161022488800f1a95fde17c57eb209b943fcdd172d04938731e0ad552ce2"} 
pod="openshift-machine-config-operator/machine-config-daemon-t499d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 16:30:19 crc kubenswrapper[4810]: I0219 16:30:19.540523 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" containerID="cri-o://61e1161022488800f1a95fde17c57eb209b943fcdd172d04938731e0ad552ce2" gracePeriod=600 Feb 19 16:30:20 crc kubenswrapper[4810]: I0219 16:30:20.045819 4810 generic.go:334] "Generic (PLEG): container finished" podID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerID="61e1161022488800f1a95fde17c57eb209b943fcdd172d04938731e0ad552ce2" exitCode=0 Feb 19 16:30:20 crc kubenswrapper[4810]: I0219 16:30:20.045887 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerDied","Data":"61e1161022488800f1a95fde17c57eb209b943fcdd172d04938731e0ad552ce2"} Feb 19 16:30:20 crc kubenswrapper[4810]: I0219 16:30:20.046228 4810 scope.go:117] "RemoveContainer" containerID="fe77cba37c7a5825c8d865a2480ab7487bd5dce94a751361742d3c3e639d1af6" Feb 19 16:30:21 crc kubenswrapper[4810]: I0219 16:30:21.069997 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerStarted","Data":"c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3"} Feb 19 16:30:44 crc kubenswrapper[4810]: I0219 16:30:44.483296 4810 scope.go:117] "RemoveContainer" containerID="ea1fa5fe82d5994ff114d9b04616fab9d73e059f9ec50ceb138445dbcf8a33cf" Feb 19 16:32:49 crc kubenswrapper[4810]: I0219 16:32:49.537559 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 16:32:49 crc kubenswrapper[4810]: I0219 16:32:49.538138 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 16:33:05 crc kubenswrapper[4810]: E0219 16:33:05.724932 4810 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.162:36526->38.102.83.162:41765: write tcp 38.102.83.162:36526->38.102.83.162:41765: write: broken pipe Feb 19 16:33:14 crc kubenswrapper[4810]: E0219 16:33:14.201382 4810 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.162:42186->38.102.83.162:41765: write tcp 38.102.83.162:42186->38.102.83.162:41765: write: broken pipe Feb 19 16:33:19 crc kubenswrapper[4810]: I0219 16:33:19.537198 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 16:33:19 crc kubenswrapper[4810]: I0219 16:33:19.537720 4810 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 16:33:49 crc kubenswrapper[4810]: I0219 16:33:49.538206 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 16:33:49 crc kubenswrapper[4810]: I0219 16:33:49.538987 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 16:33:49 crc kubenswrapper[4810]: I0219 16:33:49.539069 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 16:33:49 crc kubenswrapper[4810]: I0219 16:33:49.540258 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3"} pod="openshift-machine-config-operator/machine-config-daemon-t499d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 16:33:49 crc kubenswrapper[4810]: I0219 16:33:49.540378 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" containerID="cri-o://c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" gracePeriod=600 Feb 19 16:33:50 crc kubenswrapper[4810]: E0219 16:33:50.008418 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:33:50 crc kubenswrapper[4810]: I0219 16:33:50.414042 4810 generic.go:334] "Generic (PLEG): container finished" podID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerID="c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" exitCode=0 Feb 19 16:33:50 crc kubenswrapper[4810]: I0219 16:33:50.414090 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerDied","Data":"c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3"} Feb 19 16:33:50 crc kubenswrapper[4810]: I0219 16:33:50.414134 4810 scope.go:117] "RemoveContainer" containerID="61e1161022488800f1a95fde17c57eb209b943fcdd172d04938731e0ad552ce2" Feb 19 16:33:50 crc kubenswrapper[4810]: I0219 16:33:50.414706 4810 scope.go:117] "RemoveContainer" containerID="c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" Feb 19 16:33:50 crc 
kubenswrapper[4810]: E0219 16:33:50.414939 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:34:04 crc kubenswrapper[4810]: I0219 16:34:04.439987 4810 scope.go:117] "RemoveContainer" containerID="c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" Feb 19 16:34:04 crc kubenswrapper[4810]: E0219 16:34:04.440819 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:34:15 crc kubenswrapper[4810]: I0219 16:34:15.439271 4810 scope.go:117] "RemoveContainer" containerID="c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" Feb 19 16:34:15 crc kubenswrapper[4810]: E0219 16:34:15.440190 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:34:27 crc kubenswrapper[4810]: I0219 16:34:27.442076 4810 scope.go:117] "RemoveContainer" containerID="c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" Feb 19 16:34:27 crc kubenswrapper[4810]: E0219 16:34:27.443122 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:34:40 crc kubenswrapper[4810]: I0219 16:34:40.439679 4810 scope.go:117] "RemoveContainer" containerID="c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" Feb 19 16:34:40 crc kubenswrapper[4810]: E0219 16:34:40.440584 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:34:51 crc kubenswrapper[4810]: I0219 16:34:51.446594 4810 scope.go:117] "RemoveContainer" containerID="c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" Feb 19 16:34:51 crc kubenswrapper[4810]: E0219 16:34:51.447273 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:35:02 crc kubenswrapper[4810]: I0219 16:35:02.440128 4810 scope.go:117] "RemoveContainer" containerID="c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" Feb 19 16:35:02 crc kubenswrapper[4810]: E0219 16:35:02.441217 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:35:16 crc kubenswrapper[4810]: I0219 16:35:16.440191 4810 scope.go:117] "RemoveContainer" containerID="c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" Feb 19 16:35:16 crc kubenswrapper[4810]: E0219 16:35:16.441450 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:35:31 crc kubenswrapper[4810]: I0219 16:35:31.447850 4810 scope.go:117] "RemoveContainer" containerID="c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" Feb 19 16:35:31 crc kubenswrapper[4810]: E0219 16:35:31.448778 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:35:44 crc kubenswrapper[4810]: I0219 16:35:44.440142 4810 scope.go:117] "RemoveContainer" containerID="c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" Feb 19 16:35:44 crc kubenswrapper[4810]: E0219 16:35:44.441085 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:35:59 crc kubenswrapper[4810]: I0219 16:35:59.439732 4810 scope.go:117] "RemoveContainer" containerID="c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" Feb 19 16:35:59 crc kubenswrapper[4810]: E0219 16:35:59.440826 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:36:13 crc kubenswrapper[4810]: I0219 16:36:13.439677 4810 scope.go:117] "RemoveContainer" containerID="c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" Feb 19 16:36:13 crc kubenswrapper[4810]: E0219 16:36:13.443920 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:36:24 crc kubenswrapper[4810]: I0219 16:36:24.438996 4810 scope.go:117] "RemoveContainer" containerID="c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" Feb 19 16:36:24 crc kubenswrapper[4810]: E0219 16:36:24.439931 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:36:35 crc kubenswrapper[4810]: I0219 16:36:35.439469 4810 scope.go:117] "RemoveContainer" containerID="c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" Feb 19 16:36:35 crc kubenswrapper[4810]: E0219 16:36:35.440314 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:36:49 crc kubenswrapper[4810]: I0219 16:36:49.440013 4810 scope.go:117] "RemoveContainer" containerID="c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" Feb 19 16:36:49 crc kubenswrapper[4810]: E0219 16:36:49.440645 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:37:01 crc kubenswrapper[4810]: I0219 16:37:01.448478 4810 scope.go:117] "RemoveContainer" containerID="c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" Feb 19 16:37:01 crc kubenswrapper[4810]: E0219 16:37:01.449485 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:37:15 crc kubenswrapper[4810]: I0219 16:37:15.455972 4810 
scope.go:117] "RemoveContainer" containerID="c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" Feb 19 16:37:15 crc kubenswrapper[4810]: E0219 16:37:15.457212 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:37:27 crc kubenswrapper[4810]: I0219 16:37:27.439261 4810 scope.go:117] "RemoveContainer" containerID="c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" Feb 19 16:37:27 crc kubenswrapper[4810]: E0219 16:37:27.439878 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:37:40 crc kubenswrapper[4810]: I0219 16:37:40.440443 4810 scope.go:117] "RemoveContainer" containerID="c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" Feb 19 16:37:40 crc kubenswrapper[4810]: E0219 16:37:40.441459 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:37:52 crc kubenswrapper[4810]: I0219 16:37:52.439605 4810 scope.go:117] "RemoveContainer" containerID="c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" Feb 19 16:37:52 crc kubenswrapper[4810]: E0219 16:37:52.440657 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:38:07 crc kubenswrapper[4810]: I0219 16:38:07.439948 4810 scope.go:117] "RemoveContainer" containerID="c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" Feb 19 16:38:07 crc kubenswrapper[4810]: E0219 16:38:07.440684 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:38:22 crc kubenswrapper[4810]: I0219 16:38:22.439685 4810 scope.go:117] "RemoveContainer" containerID="c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" Feb 19 16:38:22 crc kubenswrapper[4810]: E0219 16:38:22.440397 4810 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:38:33 crc kubenswrapper[4810]: I0219 16:38:33.440773 4810 scope.go:117] "RemoveContainer" containerID="c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" Feb 19 16:38:33 crc kubenswrapper[4810]: E0219 16:38:33.441896 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:38:48 crc kubenswrapper[4810]: I0219 16:38:48.439807 4810 scope.go:117] "RemoveContainer" containerID="c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" Feb 19 16:38:48 crc kubenswrapper[4810]: E0219 16:38:48.440560 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:39:00 crc kubenswrapper[4810]: I0219 16:39:00.441074 4810 scope.go:117] "RemoveContainer" containerID="c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" Feb 19 16:39:01 crc kubenswrapper[4810]: I0219 16:39:01.081811 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerStarted","Data":"b8edda1f9342dacaf59cc4da7ca5a5e6fa7b1be00b10f0e3774c25fffdb24625"} Feb 19 16:39:02 crc kubenswrapper[4810]: I0219 16:39:02.612662 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fgvsh"] Feb 19 16:39:02 crc kubenswrapper[4810]: E0219 16:39:02.614196 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a" containerName="collect-profiles" Feb 19 16:39:02 crc kubenswrapper[4810]: I0219 16:39:02.614212 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a" containerName="collect-profiles" Feb 19 16:39:02 crc kubenswrapper[4810]: I0219 16:39:02.614543 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6ec97af-4e4f-48fd-a173-d19b4c7c5d3a" containerName="collect-profiles" Feb 19 16:39:02 crc kubenswrapper[4810]: I0219 16:39:02.616357 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fgvsh" Feb 19 16:39:02 crc kubenswrapper[4810]: I0219 16:39:02.625453 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fgvsh"] Feb 19 16:39:02 crc kubenswrapper[4810]: I0219 16:39:02.770707 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f034bea-4449-48b7-b4dc-359682644709-utilities\") pod \"certified-operators-fgvsh\" (UID: \"3f034bea-4449-48b7-b4dc-359682644709\") " pod="openshift-marketplace/certified-operators-fgvsh" Feb 19 16:39:02 crc kubenswrapper[4810]: I0219 16:39:02.770806 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2bg6\" (UniqueName: \"kubernetes.io/projected/3f034bea-4449-48b7-b4dc-359682644709-kube-api-access-w2bg6\") pod \"certified-operators-fgvsh\" (UID: \"3f034bea-4449-48b7-b4dc-359682644709\") " pod="openshift-marketplace/certified-operators-fgvsh" Feb 19 16:39:02 crc kubenswrapper[4810]: I0219 16:39:02.770933 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f034bea-4449-48b7-b4dc-359682644709-catalog-content\") pod \"certified-operators-fgvsh\" (UID: \"3f034bea-4449-48b7-b4dc-359682644709\") " pod="openshift-marketplace/certified-operators-fgvsh" Feb 19 16:39:02 crc kubenswrapper[4810]: I0219 16:39:02.873349 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f034bea-4449-48b7-b4dc-359682644709-utilities\") pod \"certified-operators-fgvsh\" (UID: \"3f034bea-4449-48b7-b4dc-359682644709\") " pod="openshift-marketplace/certified-operators-fgvsh" Feb 19 16:39:02 crc kubenswrapper[4810]: I0219 16:39:02.873428 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2bg6\" (UniqueName: \"kubernetes.io/projected/3f034bea-4449-48b7-b4dc-359682644709-kube-api-access-w2bg6\") pod \"certified-operators-fgvsh\" (UID: \"3f034bea-4449-48b7-b4dc-359682644709\") " pod="openshift-marketplace/certified-operators-fgvsh" Feb 19 16:39:02 crc kubenswrapper[4810]: I0219 16:39:02.873505 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f034bea-4449-48b7-b4dc-359682644709-catalog-content\") pod \"certified-operators-fgvsh\" (UID: \"3f034bea-4449-48b7-b4dc-359682644709\") " pod="openshift-marketplace/certified-operators-fgvsh" Feb 19 16:39:02 crc kubenswrapper[4810]: I0219 16:39:02.874363 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f034bea-4449-48b7-b4dc-359682644709-catalog-content\") pod \"certified-operators-fgvsh\" (UID: \"3f034bea-4449-48b7-b4dc-359682644709\") " pod="openshift-marketplace/certified-operators-fgvsh" Feb 19 16:39:02 crc kubenswrapper[4810]: I0219 16:39:02.874521 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f034bea-4449-48b7-b4dc-359682644709-utilities\") pod \"certified-operators-fgvsh\" (UID: \"3f034bea-4449-48b7-b4dc-359682644709\") " pod="openshift-marketplace/certified-operators-fgvsh" Feb 19 16:39:02 crc kubenswrapper[4810]: I0219 16:39:02.910913 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-w2bg6\" (UniqueName: \"kubernetes.io/projected/3f034bea-4449-48b7-b4dc-359682644709-kube-api-access-w2bg6\") pod \"certified-operators-fgvsh\" (UID: \"3f034bea-4449-48b7-b4dc-359682644709\") " pod="openshift-marketplace/certified-operators-fgvsh" Feb 19 16:39:02 crc kubenswrapper[4810]: I0219 16:39:02.959171 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fgvsh" Feb 19 16:39:03 crc kubenswrapper[4810]: I0219 16:39:03.468663 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fgvsh"] Feb 19 16:39:03 crc kubenswrapper[4810]: W0219 16:39:03.469568 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f034bea_4449_48b7_b4dc_359682644709.slice/crio-30f07e3fe9974f137e654fec2499095509becdecd424a45b6ebe9ca7430c3355 WatchSource:0}: Error finding container 30f07e3fe9974f137e654fec2499095509becdecd424a45b6ebe9ca7430c3355: Status 404 returned error can't find the container with id 30f07e3fe9974f137e654fec2499095509becdecd424a45b6ebe9ca7430c3355 Feb 19 16:39:04 crc kubenswrapper[4810]: I0219 16:39:04.130852 4810 generic.go:334] "Generic (PLEG): container finished" podID="3f034bea-4449-48b7-b4dc-359682644709" containerID="5b8771d19c21acb137923f1de719182c1c51622cce0781c46ea7cb43d5900fe4" exitCode=0 Feb 19 16:39:04 crc kubenswrapper[4810]: I0219 16:39:04.130941 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fgvsh" event={"ID":"3f034bea-4449-48b7-b4dc-359682644709","Type":"ContainerDied","Data":"5b8771d19c21acb137923f1de719182c1c51622cce0781c46ea7cb43d5900fe4"} Feb 19 16:39:04 crc kubenswrapper[4810]: I0219 16:39:04.132568 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fgvsh" event={"ID":"3f034bea-4449-48b7-b4dc-359682644709","Type":"ContainerStarted","Data":"30f07e3fe9974f137e654fec2499095509becdecd424a45b6ebe9ca7430c3355"} Feb 19 16:39:04 crc kubenswrapper[4810]: I0219 16:39:04.135282 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 16:39:05 crc kubenswrapper[4810]: I0219 16:39:05.149802 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fgvsh" event={"ID":"3f034bea-4449-48b7-b4dc-359682644709","Type":"ContainerStarted","Data":"ab1039b6a551294a7e68715becc61601fbef14c361a510fef74eb845ad560452"} Feb 19 16:39:07 crc kubenswrapper[4810]: I0219 16:39:07.176572 4810 generic.go:334] "Generic (PLEG): container finished" podID="3f034bea-4449-48b7-b4dc-359682644709" containerID="ab1039b6a551294a7e68715becc61601fbef14c361a510fef74eb845ad560452" exitCode=0 Feb 19 16:39:07 crc kubenswrapper[4810]: I0219 16:39:07.176650 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fgvsh" event={"ID":"3f034bea-4449-48b7-b4dc-359682644709","Type":"ContainerDied","Data":"ab1039b6a551294a7e68715becc61601fbef14c361a510fef74eb845ad560452"} Feb 19 16:39:08 crc kubenswrapper[4810]: I0219 16:39:08.197955 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fgvsh" event={"ID":"3f034bea-4449-48b7-b4dc-359682644709","Type":"ContainerStarted","Data":"c1237ceb27bc050239ad7d857ad7d267187a75eddcf61de8348b5ad0d374a081"} Feb 19 16:39:08 crc kubenswrapper[4810]: I0219 
16:39:08.239920 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fgvsh" podStartSLOduration=2.797637962 podStartE2EDuration="6.239890736s" podCreationTimestamp="2026-02-19 16:39:02 +0000 UTC" firstStartedPulling="2026-02-19 16:39:04.134241862 +0000 UTC m=+5373.616272026" lastFinishedPulling="2026-02-19 16:39:07.576494636 +0000 UTC m=+5377.058524800" observedRunningTime="2026-02-19 16:39:08.221538821 +0000 UTC m=+5377.703568995" watchObservedRunningTime="2026-02-19 16:39:08.239890736 +0000 UTC m=+5377.721920890" Feb 19 16:39:08 crc kubenswrapper[4810]: I0219 16:39:08.797626 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="c0ffb8ce-a356-4416-b96c-49db30ff1947" containerName="galera" probeResult="failure" output="command timed out" Feb 19 16:39:08 crc kubenswrapper[4810]: I0219 16:39:08.797667 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="c0ffb8ce-a356-4416-b96c-49db30ff1947" containerName="galera" probeResult="failure" output="command timed out" Feb 19 16:39:09 crc kubenswrapper[4810]: I0219 16:39:09.572637 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ch7bb"] Feb 19 16:39:09 crc kubenswrapper[4810]: I0219 16:39:09.593007 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ch7bb" Feb 19 16:39:09 crc kubenswrapper[4810]: I0219 16:39:09.599794 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ch7bb"] Feb 19 16:39:09 crc kubenswrapper[4810]: I0219 16:39:09.744532 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81d6df67-4f89-42b8-9e19-2812ece77996-catalog-content\") pod \"community-operators-ch7bb\" (UID: \"81d6df67-4f89-42b8-9e19-2812ece77996\") " pod="openshift-marketplace/community-operators-ch7bb" Feb 19 16:39:09 crc kubenswrapper[4810]: I0219 16:39:09.744639 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbsd6\" (UniqueName: \"kubernetes.io/projected/81d6df67-4f89-42b8-9e19-2812ece77996-kube-api-access-fbsd6\") pod \"community-operators-ch7bb\" (UID: \"81d6df67-4f89-42b8-9e19-2812ece77996\") " pod="openshift-marketplace/community-operators-ch7bb" Feb 19 16:39:09 crc kubenswrapper[4810]: I0219 16:39:09.744707 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81d6df67-4f89-42b8-9e19-2812ece77996-utilities\") pod \"community-operators-ch7bb\" (UID: \"81d6df67-4f89-42b8-9e19-2812ece77996\") " pod="openshift-marketplace/community-operators-ch7bb" Feb 19 16:39:09 crc kubenswrapper[4810]: I0219 16:39:09.847577 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81d6df67-4f89-42b8-9e19-2812ece77996-utilities\") pod \"community-operators-ch7bb\" (UID: \"81d6df67-4f89-42b8-9e19-2812ece77996\") " pod="openshift-marketplace/community-operators-ch7bb" Feb 19 16:39:09 crc kubenswrapper[4810]: I0219 16:39:09.847941 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/81d6df67-4f89-42b8-9e19-2812ece77996-catalog-content\") pod \"community-operators-ch7bb\" (UID: \"81d6df67-4f89-42b8-9e19-2812ece77996\") " pod="openshift-marketplace/community-operators-ch7bb" Feb 19 16:39:09 crc kubenswrapper[4810]: I0219 16:39:09.848041 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbsd6\" (UniqueName: \"kubernetes.io/projected/81d6df67-4f89-42b8-9e19-2812ece77996-kube-api-access-fbsd6\") pod \"community-operators-ch7bb\" (UID: \"81d6df67-4f89-42b8-9e19-2812ece77996\") " pod="openshift-marketplace/community-operators-ch7bb" Feb 19 16:39:09 crc kubenswrapper[4810]: I0219 16:39:09.848198 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81d6df67-4f89-42b8-9e19-2812ece77996-utilities\") pod \"community-operators-ch7bb\" (UID: \"81d6df67-4f89-42b8-9e19-2812ece77996\") " pod="openshift-marketplace/community-operators-ch7bb" Feb 19 16:39:09 crc kubenswrapper[4810]: I0219 16:39:09.848447 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81d6df67-4f89-42b8-9e19-2812ece77996-catalog-content\") pod \"community-operators-ch7bb\" (UID: \"81d6df67-4f89-42b8-9e19-2812ece77996\") " pod="openshift-marketplace/community-operators-ch7bb" Feb 19 16:39:09 crc kubenswrapper[4810]: I0219 16:39:09.878443 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbsd6\" (UniqueName: \"kubernetes.io/projected/81d6df67-4f89-42b8-9e19-2812ece77996-kube-api-access-fbsd6\") pod \"community-operators-ch7bb\" (UID: \"81d6df67-4f89-42b8-9e19-2812ece77996\") " pod="openshift-marketplace/community-operators-ch7bb" Feb 19 16:39:09 crc kubenswrapper[4810]: I0219 16:39:09.937761 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ch7bb" Feb 19 16:39:10 crc kubenswrapper[4810]: I0219 16:39:10.455670 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ch7bb"] Feb 19 16:39:11 crc kubenswrapper[4810]: I0219 16:39:11.240456 4810 generic.go:334] "Generic (PLEG): container finished" podID="81d6df67-4f89-42b8-9e19-2812ece77996" containerID="e2710dde94bb0d603558ff62e2c9e0cf51744ccb3f0aaf41d2db02d41c8d6da0" exitCode=0 Feb 19 16:39:11 crc kubenswrapper[4810]: I0219 16:39:11.240558 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ch7bb" event={"ID":"81d6df67-4f89-42b8-9e19-2812ece77996","Type":"ContainerDied","Data":"e2710dde94bb0d603558ff62e2c9e0cf51744ccb3f0aaf41d2db02d41c8d6da0"} Feb 19 16:39:11 crc kubenswrapper[4810]: I0219 16:39:11.240893 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ch7bb" event={"ID":"81d6df67-4f89-42b8-9e19-2812ece77996","Type":"ContainerStarted","Data":"378c52d973ebcb5515fc8fc9204fa2b612ee44117d5bb74a714d7060313b49d8"} Feb 19 16:39:12 crc kubenswrapper[4810]: I0219 16:39:12.260640 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ch7bb" event={"ID":"81d6df67-4f89-42b8-9e19-2812ece77996","Type":"ContainerStarted","Data":"d903890d8b4001a65581b9af763d7137528b1ef0a5e31781ad32e07dbd687c83"} Feb 19 16:39:12 crc kubenswrapper[4810]: I0219 16:39:12.960420 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fgvsh" Feb 19 16:39:12 crc kubenswrapper[4810]: I0219 16:39:12.960505 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fgvsh" Feb 19 16:39:13 crc kubenswrapper[4810]: I0219 16:39:13.037108 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fgvsh" Feb 19 16:39:13 crc kubenswrapper[4810]: I0219 16:39:13.357109 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fgvsh" Feb 19 16:39:14 crc kubenswrapper[4810]: I0219 16:39:14.311581 4810 generic.go:334] "Generic (PLEG): container finished" podID="81d6df67-4f89-42b8-9e19-2812ece77996" containerID="d903890d8b4001a65581b9af763d7137528b1ef0a5e31781ad32e07dbd687c83" exitCode=0 Feb 19 16:39:14 crc kubenswrapper[4810]: I0219 16:39:14.313124 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ch7bb" event={"ID":"81d6df67-4f89-42b8-9e19-2812ece77996","Type":"ContainerDied","Data":"d903890d8b4001a65581b9af763d7137528b1ef0a5e31781ad32e07dbd687c83"} Feb 19 16:39:15 crc kubenswrapper[4810]: I0219 16:39:15.327430 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ch7bb" event={"ID":"81d6df67-4f89-42b8-9e19-2812ece77996","Type":"ContainerStarted","Data":"7a641592bba10da8ad71aed1fef8b54b79dde8c98967b7fbae699b4f3ead5673"} Feb 19 16:39:15 crc kubenswrapper[4810]: I0219 16:39:15.357597 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ch7bb" podStartSLOduration=2.914650821 podStartE2EDuration="6.357577423s" podCreationTimestamp="2026-02-19 16:39:09 +0000 UTC" firstStartedPulling="2026-02-19 16:39:11.245960181 +0000 UTC m=+5380.727990335" 
lastFinishedPulling="2026-02-19 16:39:14.688886783 +0000 UTC m=+5384.170916937" observedRunningTime="2026-02-19 16:39:15.349829341 +0000 UTC m=+5384.831859495" watchObservedRunningTime="2026-02-19 16:39:15.357577423 +0000 UTC m=+5384.839607547" Feb 19 16:39:15 crc kubenswrapper[4810]: I0219 16:39:15.375427 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fgvsh"] Feb 19 16:39:15 crc kubenswrapper[4810]: I0219 16:39:15.375802 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fgvsh" podUID="3f034bea-4449-48b7-b4dc-359682644709" containerName="registry-server" containerID="cri-o://c1237ceb27bc050239ad7d857ad7d267187a75eddcf61de8348b5ad0d374a081" gracePeriod=2 Feb 19 16:39:15 crc kubenswrapper[4810]: I0219 16:39:15.915987 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fgvsh" Feb 19 16:39:16 crc kubenswrapper[4810]: I0219 16:39:16.102100 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2bg6\" (UniqueName: \"kubernetes.io/projected/3f034bea-4449-48b7-b4dc-359682644709-kube-api-access-w2bg6\") pod \"3f034bea-4449-48b7-b4dc-359682644709\" (UID: \"3f034bea-4449-48b7-b4dc-359682644709\") " Feb 19 16:39:16 crc kubenswrapper[4810]: I0219 16:39:16.102310 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f034bea-4449-48b7-b4dc-359682644709-catalog-content\") pod \"3f034bea-4449-48b7-b4dc-359682644709\" (UID: \"3f034bea-4449-48b7-b4dc-359682644709\") " Feb 19 16:39:16 crc kubenswrapper[4810]: I0219 16:39:16.102358 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f034bea-4449-48b7-b4dc-359682644709-utilities\") pod \"3f034bea-4449-48b7-b4dc-359682644709\" (UID: \"3f034bea-4449-48b7-b4dc-359682644709\") " Feb 19 16:39:16 crc kubenswrapper[4810]: I0219 16:39:16.110535 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f034bea-4449-48b7-b4dc-359682644709-kube-api-access-w2bg6" (OuterVolumeSpecName: "kube-api-access-w2bg6") pod "3f034bea-4449-48b7-b4dc-359682644709" (UID: "3f034bea-4449-48b7-b4dc-359682644709"). InnerVolumeSpecName "kube-api-access-w2bg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:39:16 crc kubenswrapper[4810]: I0219 16:39:16.123698 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f034bea-4449-48b7-b4dc-359682644709-utilities" (OuterVolumeSpecName: "utilities") pod "3f034bea-4449-48b7-b4dc-359682644709" (UID: "3f034bea-4449-48b7-b4dc-359682644709"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:39:16 crc kubenswrapper[4810]: I0219 16:39:16.170061 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f034bea-4449-48b7-b4dc-359682644709-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f034bea-4449-48b7-b4dc-359682644709" (UID: "3f034bea-4449-48b7-b4dc-359682644709"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:39:16 crc kubenswrapper[4810]: I0219 16:39:16.204973 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2bg6\" (UniqueName: \"kubernetes.io/projected/3f034bea-4449-48b7-b4dc-359682644709-kube-api-access-w2bg6\") on node \"crc\" DevicePath \"\"" Feb 19 16:39:16 crc kubenswrapper[4810]: I0219 16:39:16.205394 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f034bea-4449-48b7-b4dc-359682644709-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 16:39:16 crc kubenswrapper[4810]: I0219 16:39:16.205414 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f034bea-4449-48b7-b4dc-359682644709-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 16:39:16 crc kubenswrapper[4810]: I0219 16:39:16.340898 4810 generic.go:334] "Generic (PLEG): container finished" podID="3f034bea-4449-48b7-b4dc-359682644709" containerID="c1237ceb27bc050239ad7d857ad7d267187a75eddcf61de8348b5ad0d374a081" exitCode=0 Feb 19 16:39:16 crc kubenswrapper[4810]: I0219 16:39:16.340963 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fgvsh" Feb 19 16:39:16 crc kubenswrapper[4810]: I0219 16:39:16.340994 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fgvsh" event={"ID":"3f034bea-4449-48b7-b4dc-359682644709","Type":"ContainerDied","Data":"c1237ceb27bc050239ad7d857ad7d267187a75eddcf61de8348b5ad0d374a081"} Feb 19 16:39:16 crc kubenswrapper[4810]: I0219 16:39:16.341077 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fgvsh" event={"ID":"3f034bea-4449-48b7-b4dc-359682644709","Type":"ContainerDied","Data":"30f07e3fe9974f137e654fec2499095509becdecd424a45b6ebe9ca7430c3355"} Feb 19 16:39:16 crc kubenswrapper[4810]: I0219 16:39:16.341108 4810 scope.go:117] "RemoveContainer" containerID="c1237ceb27bc050239ad7d857ad7d267187a75eddcf61de8348b5ad0d374a081" Feb 19 16:39:16 crc kubenswrapper[4810]: I0219 16:39:16.396432 4810 scope.go:117] "RemoveContainer" containerID="ab1039b6a551294a7e68715becc61601fbef14c361a510fef74eb845ad560452" Feb 19 16:39:16 crc kubenswrapper[4810]: I0219 16:39:16.403183 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fgvsh"] Feb 19 16:39:16 crc kubenswrapper[4810]: I0219 16:39:16.412476 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fgvsh"] Feb 19 16:39:16 crc kubenswrapper[4810]: I0219 16:39:16.435465 4810 scope.go:117] "RemoveContainer" containerID="5b8771d19c21acb137923f1de719182c1c51622cce0781c46ea7cb43d5900fe4" Feb 19 16:39:16 crc kubenswrapper[4810]: I0219 16:39:16.485792 4810 scope.go:117] "RemoveContainer" containerID="c1237ceb27bc050239ad7d857ad7d267187a75eddcf61de8348b5ad0d374a081" Feb 19 16:39:16 crc kubenswrapper[4810]: E0219 16:39:16.486213 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1237ceb27bc050239ad7d857ad7d267187a75eddcf61de8348b5ad0d374a081\": container with ID starting with c1237ceb27bc050239ad7d857ad7d267187a75eddcf61de8348b5ad0d374a081 not found: ID does not exist" containerID="c1237ceb27bc050239ad7d857ad7d267187a75eddcf61de8348b5ad0d374a081" Feb 19 16:39:16 crc kubenswrapper[4810]: I0219 16:39:16.486348 
4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1237ceb27bc050239ad7d857ad7d267187a75eddcf61de8348b5ad0d374a081"} err="failed to get container status \"c1237ceb27bc050239ad7d857ad7d267187a75eddcf61de8348b5ad0d374a081\": rpc error: code = NotFound desc = could not find container \"c1237ceb27bc050239ad7d857ad7d267187a75eddcf61de8348b5ad0d374a081\": container with ID starting with c1237ceb27bc050239ad7d857ad7d267187a75eddcf61de8348b5ad0d374a081 not found: ID does not exist" Feb 19 16:39:16 crc kubenswrapper[4810]: I0219 16:39:16.486452 4810 scope.go:117] "RemoveContainer" containerID="ab1039b6a551294a7e68715becc61601fbef14c361a510fef74eb845ad560452" Feb 19 16:39:16 crc kubenswrapper[4810]: E0219 16:39:16.486846 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab1039b6a551294a7e68715becc61601fbef14c361a510fef74eb845ad560452\": container with ID starting with ab1039b6a551294a7e68715becc61601fbef14c361a510fef74eb845ad560452 not found: ID does not exist" containerID="ab1039b6a551294a7e68715becc61601fbef14c361a510fef74eb845ad560452" Feb 19 16:39:16 crc kubenswrapper[4810]: I0219 16:39:16.486977 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab1039b6a551294a7e68715becc61601fbef14c361a510fef74eb845ad560452"} err="failed to get container status \"ab1039b6a551294a7e68715becc61601fbef14c361a510fef74eb845ad560452\": rpc error: code = NotFound desc = could not find container \"ab1039b6a551294a7e68715becc61601fbef14c361a510fef74eb845ad560452\": container with ID starting with ab1039b6a551294a7e68715becc61601fbef14c361a510fef74eb845ad560452 not found: ID does not exist" Feb 19 16:39:16 crc kubenswrapper[4810]: I0219 16:39:16.487063 4810 scope.go:117] "RemoveContainer" containerID="5b8771d19c21acb137923f1de719182c1c51622cce0781c46ea7cb43d5900fe4" Feb 19 16:39:16 crc kubenswrapper[4810]: E0219 16:39:16.487466 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b8771d19c21acb137923f1de719182c1c51622cce0781c46ea7cb43d5900fe4\": container with ID starting with 5b8771d19c21acb137923f1de719182c1c51622cce0781c46ea7cb43d5900fe4 not found: ID does not exist" containerID="5b8771d19c21acb137923f1de719182c1c51622cce0781c46ea7cb43d5900fe4" Feb 19 16:39:16 crc kubenswrapper[4810]: I0219 16:39:16.487568 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b8771d19c21acb137923f1de719182c1c51622cce0781c46ea7cb43d5900fe4"} err="failed to get container status \"5b8771d19c21acb137923f1de719182c1c51622cce0781c46ea7cb43d5900fe4\": rpc error: code = NotFound desc = could not find container \"5b8771d19c21acb137923f1de719182c1c51622cce0781c46ea7cb43d5900fe4\": container with ID starting with 5b8771d19c21acb137923f1de719182c1c51622cce0781c46ea7cb43d5900fe4 not found: ID does not exist" Feb 19 16:39:17 crc kubenswrapper[4810]: I0219 16:39:17.486926 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f034bea-4449-48b7-b4dc-359682644709" path="/var/lib/kubelet/pods/3f034bea-4449-48b7-b4dc-359682644709/volumes" Feb 19 16:39:19 crc kubenswrapper[4810]: I0219 16:39:19.938517 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ch7bb" Feb 19 16:39:19 crc kubenswrapper[4810]: I0219 16:39:19.939158 4810 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ch7bb" Feb 19 16:39:21 crc kubenswrapper[4810]: I0219 16:39:21.154342 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-ch7bb" podUID="81d6df67-4f89-42b8-9e19-2812ece77996" containerName="registry-server" probeResult="failure" output=< Feb 19 16:39:21 crc kubenswrapper[4810]: timeout: failed to connect service ":50051" within 1s Feb 19 16:39:21 crc kubenswrapper[4810]: > Feb 19 16:39:26 crc kubenswrapper[4810]: I0219 16:39:26.439496 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w474k"] Feb 19 16:39:26 crc kubenswrapper[4810]: E0219 16:39:26.442438 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f034bea-4449-48b7-b4dc-359682644709" containerName="extract-utilities" Feb 19 16:39:26 crc kubenswrapper[4810]: I0219 16:39:26.442489 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f034bea-4449-48b7-b4dc-359682644709" containerName="extract-utilities" Feb 19 16:39:26 crc kubenswrapper[4810]: E0219 16:39:26.442526 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f034bea-4449-48b7-b4dc-359682644709" containerName="extract-content" Feb 19 16:39:26 crc kubenswrapper[4810]: I0219 16:39:26.442540 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f034bea-4449-48b7-b4dc-359682644709" containerName="extract-content" Feb 19 16:39:26 crc kubenswrapper[4810]: E0219 16:39:26.442580 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f034bea-4449-48b7-b4dc-359682644709" containerName="registry-server" Feb 19 16:39:26 crc kubenswrapper[4810]: I0219 16:39:26.442593 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f034bea-4449-48b7-b4dc-359682644709" containerName="registry-server" Feb 19 16:39:26 crc kubenswrapper[4810]: I0219 16:39:26.442975 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f034bea-4449-48b7-b4dc-359682644709" containerName="registry-server" Feb 19 16:39:26 crc kubenswrapper[4810]: I0219 16:39:26.445855 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w474k" Feb 19 16:39:26 crc kubenswrapper[4810]: I0219 16:39:26.501839 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w474k"] Feb 19 16:39:26 crc kubenswrapper[4810]: I0219 16:39:26.549988 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7403ee1-3e5c-4f08-a3ab-4eca75254405-catalog-content\") pod \"redhat-operators-w474k\" (UID: \"c7403ee1-3e5c-4f08-a3ab-4eca75254405\") " pod="openshift-marketplace/redhat-operators-w474k" Feb 19 16:39:26 crc kubenswrapper[4810]: I0219 16:39:26.550063 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7403ee1-3e5c-4f08-a3ab-4eca75254405-utilities\") pod \"redhat-operators-w474k\" (UID: \"c7403ee1-3e5c-4f08-a3ab-4eca75254405\") " pod="openshift-marketplace/redhat-operators-w474k" Feb 19 16:39:26 crc kubenswrapper[4810]: I0219 16:39:26.550164 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqsdk\" (UniqueName: \"kubernetes.io/projected/c7403ee1-3e5c-4f08-a3ab-4eca75254405-kube-api-access-bqsdk\") pod \"redhat-operators-w474k\" (UID: \"c7403ee1-3e5c-4f08-a3ab-4eca75254405\") " pod="openshift-marketplace/redhat-operators-w474k" Feb 19 16:39:26 crc kubenswrapper[4810]: I0219 16:39:26.652381 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7403ee1-3e5c-4f08-a3ab-4eca75254405-catalog-content\") pod \"redhat-operators-w474k\" (UID: \"c7403ee1-3e5c-4f08-a3ab-4eca75254405\") " pod="openshift-marketplace/redhat-operators-w474k" Feb 19 16:39:26 crc kubenswrapper[4810]: I0219 16:39:26.652451 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7403ee1-3e5c-4f08-a3ab-4eca75254405-utilities\") pod \"redhat-operators-w474k\" (UID: \"c7403ee1-3e5c-4f08-a3ab-4eca75254405\") " pod="openshift-marketplace/redhat-operators-w474k" Feb 19 16:39:26 crc kubenswrapper[4810]: I0219 16:39:26.652516 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqsdk\" (UniqueName: \"kubernetes.io/projected/c7403ee1-3e5c-4f08-a3ab-4eca75254405-kube-api-access-bqsdk\") pod \"redhat-operators-w474k\" (UID: \"c7403ee1-3e5c-4f08-a3ab-4eca75254405\") " pod="openshift-marketplace/redhat-operators-w474k" Feb 19 16:39:26 crc kubenswrapper[4810]: I0219 16:39:26.652985 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7403ee1-3e5c-4f08-a3ab-4eca75254405-catalog-content\") pod \"redhat-operators-w474k\" (UID: \"c7403ee1-3e5c-4f08-a3ab-4eca75254405\") " pod="openshift-marketplace/redhat-operators-w474k" Feb 19 16:39:26 crc kubenswrapper[4810]: I0219 16:39:26.653199 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7403ee1-3e5c-4f08-a3ab-4eca75254405-utilities\") pod \"redhat-operators-w474k\" (UID: \"c7403ee1-3e5c-4f08-a3ab-4eca75254405\") " pod="openshift-marketplace/redhat-operators-w474k" Feb 19 16:39:26 crc kubenswrapper[4810]: I0219 16:39:26.673436 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-bqsdk\" (UniqueName: \"kubernetes.io/projected/c7403ee1-3e5c-4f08-a3ab-4eca75254405-kube-api-access-bqsdk\") pod \"redhat-operators-w474k\" (UID: \"c7403ee1-3e5c-4f08-a3ab-4eca75254405\") " pod="openshift-marketplace/redhat-operators-w474k" Feb 19 16:39:26 crc kubenswrapper[4810]: I0219 16:39:26.794165 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w474k" Feb 19 16:39:27 crc kubenswrapper[4810]: I0219 16:39:27.260074 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w474k"] Feb 19 16:39:27 crc kubenswrapper[4810]: I0219 16:39:27.475972 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w474k" event={"ID":"c7403ee1-3e5c-4f08-a3ab-4eca75254405","Type":"ContainerStarted","Data":"442b7ce65bc1412aed55a2335f985bfa36080044176d83cf3f6041d9b930cf3e"} Feb 19 16:39:28 crc kubenswrapper[4810]: I0219 16:39:28.493046 4810 generic.go:334] "Generic (PLEG): container finished" podID="c7403ee1-3e5c-4f08-a3ab-4eca75254405" containerID="da5e11ed127d0033e661e20ce3da73f34f371a2138b894304f1b033ea38ce0ec" exitCode=0 Feb 19 16:39:28 crc kubenswrapper[4810]: I0219 16:39:28.493147 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w474k" event={"ID":"c7403ee1-3e5c-4f08-a3ab-4eca75254405","Type":"ContainerDied","Data":"da5e11ed127d0033e661e20ce3da73f34f371a2138b894304f1b033ea38ce0ec"} Feb 19 16:39:30 crc kubenswrapper[4810]: I0219 16:39:30.023660 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ch7bb" Feb 19 16:39:30 crc kubenswrapper[4810]: I0219 16:39:30.099675 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ch7bb" Feb 19 16:39:30 crc kubenswrapper[4810]: I0219 16:39:30.522124 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w474k" event={"ID":"c7403ee1-3e5c-4f08-a3ab-4eca75254405","Type":"ContainerStarted","Data":"5d843baf96eed557854d6793bfaf74426aa5aeab7dc682913f3b750090dfd60c"} Feb 19 16:39:31 crc kubenswrapper[4810]: I0219 16:39:31.806232 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ch7bb"] Feb 19 16:39:31 crc kubenswrapper[4810]: I0219 16:39:31.809156 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ch7bb" podUID="81d6df67-4f89-42b8-9e19-2812ece77996" containerName="registry-server" containerID="cri-o://7a641592bba10da8ad71aed1fef8b54b79dde8c98967b7fbae699b4f3ead5673" gracePeriod=2 Feb 19 16:39:33 crc kubenswrapper[4810]: I0219 16:39:33.333214 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ch7bb" Feb 19 16:39:33 crc kubenswrapper[4810]: I0219 16:39:33.423683 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbsd6\" (UniqueName: \"kubernetes.io/projected/81d6df67-4f89-42b8-9e19-2812ece77996-kube-api-access-fbsd6\") pod \"81d6df67-4f89-42b8-9e19-2812ece77996\" (UID: \"81d6df67-4f89-42b8-9e19-2812ece77996\") " Feb 19 16:39:33 crc kubenswrapper[4810]: I0219 16:39:33.424019 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81d6df67-4f89-42b8-9e19-2812ece77996-utilities\") pod \"81d6df67-4f89-42b8-9e19-2812ece77996\" (UID: \"81d6df67-4f89-42b8-9e19-2812ece77996\") " Feb 19 16:39:33 crc kubenswrapper[4810]: I0219 16:39:33.424119 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81d6df67-4f89-42b8-9e19-2812ece77996-catalog-content\") pod \"81d6df67-4f89-42b8-9e19-2812ece77996\" (UID: \"81d6df67-4f89-42b8-9e19-2812ece77996\") " Feb 19 16:39:33 crc kubenswrapper[4810]: I0219 16:39:33.424624 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81d6df67-4f89-42b8-9e19-2812ece77996-utilities" (OuterVolumeSpecName: "utilities") pod "81d6df67-4f89-42b8-9e19-2812ece77996" (UID: "81d6df67-4f89-42b8-9e19-2812ece77996"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:39:33 crc kubenswrapper[4810]: I0219 16:39:33.424986 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81d6df67-4f89-42b8-9e19-2812ece77996-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 16:39:33 crc kubenswrapper[4810]: I0219 16:39:33.431005 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81d6df67-4f89-42b8-9e19-2812ece77996-kube-api-access-fbsd6" (OuterVolumeSpecName: "kube-api-access-fbsd6") pod "81d6df67-4f89-42b8-9e19-2812ece77996" (UID: "81d6df67-4f89-42b8-9e19-2812ece77996"). InnerVolumeSpecName "kube-api-access-fbsd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:39:33 crc kubenswrapper[4810]: I0219 16:39:33.508729 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81d6df67-4f89-42b8-9e19-2812ece77996-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81d6df67-4f89-42b8-9e19-2812ece77996" (UID: "81d6df67-4f89-42b8-9e19-2812ece77996"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:39:33 crc kubenswrapper[4810]: I0219 16:39:33.527192 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81d6df67-4f89-42b8-9e19-2812ece77996-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 16:39:33 crc kubenswrapper[4810]: I0219 16:39:33.527217 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbsd6\" (UniqueName: \"kubernetes.io/projected/81d6df67-4f89-42b8-9e19-2812ece77996-kube-api-access-fbsd6\") on node \"crc\" DevicePath \"\"" Feb 19 16:39:33 crc kubenswrapper[4810]: I0219 16:39:33.555777 4810 generic.go:334] "Generic (PLEG): container finished" podID="81d6df67-4f89-42b8-9e19-2812ece77996" containerID="7a641592bba10da8ad71aed1fef8b54b79dde8c98967b7fbae699b4f3ead5673" exitCode=0 Feb 19 16:39:33 crc kubenswrapper[4810]: I0219 16:39:33.555855 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ch7bb" event={"ID":"81d6df67-4f89-42b8-9e19-2812ece77996","Type":"ContainerDied","Data":"7a641592bba10da8ad71aed1fef8b54b79dde8c98967b7fbae699b4f3ead5673"} Feb 19 16:39:33 crc kubenswrapper[4810]: I0219 16:39:33.555931 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ch7bb" event={"ID":"81d6df67-4f89-42b8-9e19-2812ece77996","Type":"ContainerDied","Data":"378c52d973ebcb5515fc8fc9204fa2b612ee44117d5bb74a714d7060313b49d8"} Feb 19 16:39:33 crc kubenswrapper[4810]: I0219 16:39:33.555969 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ch7bb" Feb 19 16:39:33 crc kubenswrapper[4810]: I0219 16:39:33.555983 4810 scope.go:117] "RemoveContainer" containerID="7a641592bba10da8ad71aed1fef8b54b79dde8c98967b7fbae699b4f3ead5673" Feb 19 16:39:33 crc kubenswrapper[4810]: I0219 16:39:33.579229 4810 scope.go:117] "RemoveContainer" containerID="d903890d8b4001a65581b9af763d7137528b1ef0a5e31781ad32e07dbd687c83" Feb 19 16:39:33 crc kubenswrapper[4810]: I0219 16:39:33.619068 4810 scope.go:117] "RemoveContainer" containerID="e2710dde94bb0d603558ff62e2c9e0cf51744ccb3f0aaf41d2db02d41c8d6da0" Feb 19 16:39:33 crc kubenswrapper[4810]: I0219 16:39:33.627239 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ch7bb"] Feb 19 16:39:33 crc kubenswrapper[4810]: I0219 16:39:33.641580 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ch7bb"] Feb 19 16:39:33 crc kubenswrapper[4810]: I0219 16:39:33.650466 4810 scope.go:117] "RemoveContainer" containerID="7a641592bba10da8ad71aed1fef8b54b79dde8c98967b7fbae699b4f3ead5673" Feb 19 16:39:33 crc kubenswrapper[4810]: E0219 16:39:33.650849 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a641592bba10da8ad71aed1fef8b54b79dde8c98967b7fbae699b4f3ead5673\": container with ID starting with 7a641592bba10da8ad71aed1fef8b54b79dde8c98967b7fbae699b4f3ead5673 not found: ID does not exist" containerID="7a641592bba10da8ad71aed1fef8b54b79dde8c98967b7fbae699b4f3ead5673" Feb 19 16:39:33 crc kubenswrapper[4810]: I0219 16:39:33.650888 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a641592bba10da8ad71aed1fef8b54b79dde8c98967b7fbae699b4f3ead5673"} err="failed to get container status 
\"7a641592bba10da8ad71aed1fef8b54b79dde8c98967b7fbae699b4f3ead5673\": rpc error: code = NotFound desc = could not find container \"7a641592bba10da8ad71aed1fef8b54b79dde8c98967b7fbae699b4f3ead5673\": container with ID starting with 7a641592bba10da8ad71aed1fef8b54b79dde8c98967b7fbae699b4f3ead5673 not found: ID does not exist" Feb 19 16:39:33 crc kubenswrapper[4810]: I0219 16:39:33.650922 4810 scope.go:117] "RemoveContainer" containerID="d903890d8b4001a65581b9af763d7137528b1ef0a5e31781ad32e07dbd687c83" Feb 19 16:39:33 crc kubenswrapper[4810]: E0219 16:39:33.651288 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d903890d8b4001a65581b9af763d7137528b1ef0a5e31781ad32e07dbd687c83\": container with ID starting with d903890d8b4001a65581b9af763d7137528b1ef0a5e31781ad32e07dbd687c83 not found: ID does not exist" containerID="d903890d8b4001a65581b9af763d7137528b1ef0a5e31781ad32e07dbd687c83" Feb 19 16:39:33 crc kubenswrapper[4810]: I0219 16:39:33.651305 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d903890d8b4001a65581b9af763d7137528b1ef0a5e31781ad32e07dbd687c83"} err="failed to get container status \"d903890d8b4001a65581b9af763d7137528b1ef0a5e31781ad32e07dbd687c83\": rpc error: code = NotFound desc = could not find container \"d903890d8b4001a65581b9af763d7137528b1ef0a5e31781ad32e07dbd687c83\": container with ID starting with d903890d8b4001a65581b9af763d7137528b1ef0a5e31781ad32e07dbd687c83 not found: ID does not exist" Feb 19 16:39:33 crc kubenswrapper[4810]: I0219 16:39:33.651317 4810 scope.go:117] "RemoveContainer" containerID="e2710dde94bb0d603558ff62e2c9e0cf51744ccb3f0aaf41d2db02d41c8d6da0" Feb 19 16:39:33 crc kubenswrapper[4810]: E0219 16:39:33.651760 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2710dde94bb0d603558ff62e2c9e0cf51744ccb3f0aaf41d2db02d41c8d6da0\": container with ID starting with e2710dde94bb0d603558ff62e2c9e0cf51744ccb3f0aaf41d2db02d41c8d6da0 not found: ID does not exist" containerID="e2710dde94bb0d603558ff62e2c9e0cf51744ccb3f0aaf41d2db02d41c8d6da0" Feb 19 16:39:33 crc kubenswrapper[4810]: I0219 16:39:33.651780 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2710dde94bb0d603558ff62e2c9e0cf51744ccb3f0aaf41d2db02d41c8d6da0"} err="failed to get container status \"e2710dde94bb0d603558ff62e2c9e0cf51744ccb3f0aaf41d2db02d41c8d6da0\": rpc error: code = NotFound desc = could not find container \"e2710dde94bb0d603558ff62e2c9e0cf51744ccb3f0aaf41d2db02d41c8d6da0\": container with ID starting with e2710dde94bb0d603558ff62e2c9e0cf51744ccb3f0aaf41d2db02d41c8d6da0 not found: ID does not exist" Feb 19 16:39:34 crc kubenswrapper[4810]: I0219 16:39:34.577149 4810 generic.go:334] "Generic (PLEG): container finished" podID="c7403ee1-3e5c-4f08-a3ab-4eca75254405" containerID="5d843baf96eed557854d6793bfaf74426aa5aeab7dc682913f3b750090dfd60c" exitCode=0 Feb 19 16:39:34 crc kubenswrapper[4810]: I0219 16:39:34.577433 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w474k" event={"ID":"c7403ee1-3e5c-4f08-a3ab-4eca75254405","Type":"ContainerDied","Data":"5d843baf96eed557854d6793bfaf74426aa5aeab7dc682913f3b750090dfd60c"} Feb 19 16:39:35 crc kubenswrapper[4810]: I0219 16:39:35.459714 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81d6df67-4f89-42b8-9e19-2812ece77996" 
path="/var/lib/kubelet/pods/81d6df67-4f89-42b8-9e19-2812ece77996/volumes" Feb 19 16:39:36 crc kubenswrapper[4810]: I0219 16:39:36.605405 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w474k" event={"ID":"c7403ee1-3e5c-4f08-a3ab-4eca75254405","Type":"ContainerStarted","Data":"a968f276f3edfffb95abada6e7316b27737144bb357510bb73c8efb6a9c6d836"} Feb 19 16:39:36 crc kubenswrapper[4810]: I0219 16:39:36.643752 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w474k" podStartSLOduration=4.141139902 podStartE2EDuration="10.643724776s" podCreationTimestamp="2026-02-19 16:39:26 +0000 UTC" firstStartedPulling="2026-02-19 16:39:28.495777708 +0000 UTC m=+5397.977807852" lastFinishedPulling="2026-02-19 16:39:34.998362562 +0000 UTC m=+5404.480392726" observedRunningTime="2026-02-19 16:39:36.634797245 +0000 UTC m=+5406.116827369" watchObservedRunningTime="2026-02-19 16:39:36.643724776 +0000 UTC m=+5406.125754930" Feb 19 16:39:36 crc kubenswrapper[4810]: I0219 16:39:36.794578 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w474k" Feb 19 16:39:36 crc kubenswrapper[4810]: I0219 16:39:36.794620 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w474k" Feb 19 16:39:37 crc kubenswrapper[4810]: I0219 16:39:37.864666 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w474k" podUID="c7403ee1-3e5c-4f08-a3ab-4eca75254405" containerName="registry-server" probeResult="failure" output=< Feb 19 16:39:37 crc kubenswrapper[4810]: timeout: failed to connect service ":50051" within 1s Feb 19 16:39:37 crc kubenswrapper[4810]: > Feb 19 16:39:46 crc kubenswrapper[4810]: I0219 16:39:46.874933 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w474k" Feb 19 16:39:46 crc kubenswrapper[4810]: I0219 16:39:46.957960 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w474k" Feb 19 16:39:47 crc kubenswrapper[4810]: I0219 16:39:47.132996 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w474k"] Feb 19 16:39:48 crc kubenswrapper[4810]: I0219 16:39:48.754437 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w474k" podUID="c7403ee1-3e5c-4f08-a3ab-4eca75254405" containerName="registry-server" containerID="cri-o://a968f276f3edfffb95abada6e7316b27737144bb357510bb73c8efb6a9c6d836" gracePeriod=2 Feb 19 16:39:49 crc kubenswrapper[4810]: I0219 16:39:49.327828 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w474k" Feb 19 16:39:49 crc kubenswrapper[4810]: I0219 16:39:49.445980 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7403ee1-3e5c-4f08-a3ab-4eca75254405-catalog-content\") pod \"c7403ee1-3e5c-4f08-a3ab-4eca75254405\" (UID: \"c7403ee1-3e5c-4f08-a3ab-4eca75254405\") " Feb 19 16:39:49 crc kubenswrapper[4810]: I0219 16:39:49.446303 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqsdk\" (UniqueName: \"kubernetes.io/projected/c7403ee1-3e5c-4f08-a3ab-4eca75254405-kube-api-access-bqsdk\") pod \"c7403ee1-3e5c-4f08-a3ab-4eca75254405\" (UID: \"c7403ee1-3e5c-4f08-a3ab-4eca75254405\") " Feb 19 16:39:49 crc kubenswrapper[4810]: I0219 16:39:49.447859 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7403ee1-3e5c-4f08-a3ab-4eca75254405-utilities\") pod \"c7403ee1-3e5c-4f08-a3ab-4eca75254405\" (UID: \"c7403ee1-3e5c-4f08-a3ab-4eca75254405\") " Feb 19 16:39:49 crc kubenswrapper[4810]: I0219 16:39:49.448894 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7403ee1-3e5c-4f08-a3ab-4eca75254405-utilities" (OuterVolumeSpecName: "utilities") pod "c7403ee1-3e5c-4f08-a3ab-4eca75254405" (UID: "c7403ee1-3e5c-4f08-a3ab-4eca75254405"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:39:49 crc kubenswrapper[4810]: I0219 16:39:49.449768 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7403ee1-3e5c-4f08-a3ab-4eca75254405-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 16:39:49 crc kubenswrapper[4810]: I0219 16:39:49.455520 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7403ee1-3e5c-4f08-a3ab-4eca75254405-kube-api-access-bqsdk" (OuterVolumeSpecName: "kube-api-access-bqsdk") pod "c7403ee1-3e5c-4f08-a3ab-4eca75254405" (UID: "c7403ee1-3e5c-4f08-a3ab-4eca75254405"). InnerVolumeSpecName "kube-api-access-bqsdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:39:49 crc kubenswrapper[4810]: I0219 16:39:49.552055 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqsdk\" (UniqueName: \"kubernetes.io/projected/c7403ee1-3e5c-4f08-a3ab-4eca75254405-kube-api-access-bqsdk\") on node \"crc\" DevicePath \"\"" Feb 19 16:39:49 crc kubenswrapper[4810]: I0219 16:39:49.584769 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7403ee1-3e5c-4f08-a3ab-4eca75254405-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7403ee1-3e5c-4f08-a3ab-4eca75254405" (UID: "c7403ee1-3e5c-4f08-a3ab-4eca75254405"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:39:49 crc kubenswrapper[4810]: I0219 16:39:49.653602 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7403ee1-3e5c-4f08-a3ab-4eca75254405-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 16:39:49 crc kubenswrapper[4810]: I0219 16:39:49.766604 4810 generic.go:334] "Generic (PLEG): container finished" podID="c7403ee1-3e5c-4f08-a3ab-4eca75254405" containerID="a968f276f3edfffb95abada6e7316b27737144bb357510bb73c8efb6a9c6d836" exitCode=0 Feb 19 16:39:49 crc kubenswrapper[4810]: I0219 16:39:49.766651 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w474k" event={"ID":"c7403ee1-3e5c-4f08-a3ab-4eca75254405","Type":"ContainerDied","Data":"a968f276f3edfffb95abada6e7316b27737144bb357510bb73c8efb6a9c6d836"} Feb 19 16:39:49 crc kubenswrapper[4810]: I0219 16:39:49.766690 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w474k" event={"ID":"c7403ee1-3e5c-4f08-a3ab-4eca75254405","Type":"ContainerDied","Data":"442b7ce65bc1412aed55a2335f985bfa36080044176d83cf3f6041d9b930cf3e"} Feb 19 16:39:49 crc kubenswrapper[4810]: I0219 16:39:49.766711 4810 scope.go:117] "RemoveContainer" containerID="a968f276f3edfffb95abada6e7316b27737144bb357510bb73c8efb6a9c6d836" Feb 19 16:39:49 crc kubenswrapper[4810]: I0219 16:39:49.766711 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w474k" Feb 19 16:39:49 crc kubenswrapper[4810]: I0219 16:39:49.794203 4810 scope.go:117] "RemoveContainer" containerID="5d843baf96eed557854d6793bfaf74426aa5aeab7dc682913f3b750090dfd60c" Feb 19 16:39:49 crc kubenswrapper[4810]: I0219 16:39:49.817377 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w474k"] Feb 19 16:39:49 crc kubenswrapper[4810]: I0219 16:39:49.828359 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w474k"] Feb 19 16:39:49 crc kubenswrapper[4810]: I0219 16:39:49.833791 4810 scope.go:117] "RemoveContainer" containerID="da5e11ed127d0033e661e20ce3da73f34f371a2138b894304f1b033ea38ce0ec" Feb 19 16:39:49 crc kubenswrapper[4810]: I0219 16:39:49.889028 4810 scope.go:117] "RemoveContainer" containerID="a968f276f3edfffb95abada6e7316b27737144bb357510bb73c8efb6a9c6d836" Feb 19 16:39:49 crc kubenswrapper[4810]: E0219 16:39:49.890646 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a968f276f3edfffb95abada6e7316b27737144bb357510bb73c8efb6a9c6d836\": container with ID starting with a968f276f3edfffb95abada6e7316b27737144bb357510bb73c8efb6a9c6d836 not found: ID does not exist" containerID="a968f276f3edfffb95abada6e7316b27737144bb357510bb73c8efb6a9c6d836" Feb 19 16:39:49 crc kubenswrapper[4810]: I0219 16:39:49.890687 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a968f276f3edfffb95abada6e7316b27737144bb357510bb73c8efb6a9c6d836"} err="failed to get container status \"a968f276f3edfffb95abada6e7316b27737144bb357510bb73c8efb6a9c6d836\": rpc error: code = NotFound desc = could not find container \"a968f276f3edfffb95abada6e7316b27737144bb357510bb73c8efb6a9c6d836\": container with ID starting with a968f276f3edfffb95abada6e7316b27737144bb357510bb73c8efb6a9c6d836 not found: ID does not exist" Feb 19 16:39:49 crc 
kubenswrapper[4810]: I0219 16:39:49.890715 4810 scope.go:117] "RemoveContainer" containerID="5d843baf96eed557854d6793bfaf74426aa5aeab7dc682913f3b750090dfd60c" Feb 19 16:39:49 crc kubenswrapper[4810]: E0219 16:39:49.891398 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d843baf96eed557854d6793bfaf74426aa5aeab7dc682913f3b750090dfd60c\": container with ID starting with 5d843baf96eed557854d6793bfaf74426aa5aeab7dc682913f3b750090dfd60c not found: ID does not exist" containerID="5d843baf96eed557854d6793bfaf74426aa5aeab7dc682913f3b750090dfd60c" Feb 19 16:39:49 crc kubenswrapper[4810]: I0219 16:39:49.891430 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d843baf96eed557854d6793bfaf74426aa5aeab7dc682913f3b750090dfd60c"} err="failed to get container status \"5d843baf96eed557854d6793bfaf74426aa5aeab7dc682913f3b750090dfd60c\": rpc error: code = NotFound desc = could not find container \"5d843baf96eed557854d6793bfaf74426aa5aeab7dc682913f3b750090dfd60c\": container with ID starting with 5d843baf96eed557854d6793bfaf74426aa5aeab7dc682913f3b750090dfd60c not found: ID does not exist" Feb 19 16:39:49 crc kubenswrapper[4810]: I0219 16:39:49.891449 4810 scope.go:117] "RemoveContainer" containerID="da5e11ed127d0033e661e20ce3da73f34f371a2138b894304f1b033ea38ce0ec" Feb 19 16:39:49 crc kubenswrapper[4810]: E0219 16:39:49.891910 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da5e11ed127d0033e661e20ce3da73f34f371a2138b894304f1b033ea38ce0ec\": container with ID starting with da5e11ed127d0033e661e20ce3da73f34f371a2138b894304f1b033ea38ce0ec not found: ID does not exist" containerID="da5e11ed127d0033e661e20ce3da73f34f371a2138b894304f1b033ea38ce0ec" Feb 19 16:39:49 crc kubenswrapper[4810]: I0219 16:39:49.891943 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da5e11ed127d0033e661e20ce3da73f34f371a2138b894304f1b033ea38ce0ec"} err="failed to get container status \"da5e11ed127d0033e661e20ce3da73f34f371a2138b894304f1b033ea38ce0ec\": rpc error: code = NotFound desc = could not find container \"da5e11ed127d0033e661e20ce3da73f34f371a2138b894304f1b033ea38ce0ec\": container with ID starting with da5e11ed127d0033e661e20ce3da73f34f371a2138b894304f1b033ea38ce0ec not found: ID does not exist" Feb 19 16:39:51 crc kubenswrapper[4810]: I0219 16:39:51.461236 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7403ee1-3e5c-4f08-a3ab-4eca75254405" path="/var/lib/kubelet/pods/c7403ee1-3e5c-4f08-a3ab-4eca75254405/volumes" Feb 19 16:41:02 crc kubenswrapper[4810]: I0219 16:41:02.696856 4810 generic.go:334] "Generic (PLEG): container finished" podID="a4c017a9-c049-4baa-acc0-e08a25437c90" containerID="7c1119346fc25e4a8eb25c191fe4eed4ab3589389debf84e048e3e376479d897" exitCode=0 Feb 19 16:41:02 crc kubenswrapper[4810]: I0219 16:41:02.696976 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"a4c017a9-c049-4baa-acc0-e08a25437c90","Type":"ContainerDied","Data":"7c1119346fc25e4a8eb25c191fe4eed4ab3589389debf84e048e3e376479d897"} Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.088408 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.150911 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a4c017a9-c049-4baa-acc0-e08a25437c90-test-operator-ephemeral-workdir\") pod \"a4c017a9-c049-4baa-acc0-e08a25437c90\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.150998 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a4c017a9-c049-4baa-acc0-e08a25437c90-test-operator-ephemeral-temporary\") pod \"a4c017a9-c049-4baa-acc0-e08a25437c90\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.151029 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"a4c017a9-c049-4baa-acc0-e08a25437c90\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.151111 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv5x5\" (UniqueName: \"kubernetes.io/projected/a4c017a9-c049-4baa-acc0-e08a25437c90-kube-api-access-tv5x5\") pod \"a4c017a9-c049-4baa-acc0-e08a25437c90\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.151217 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a4c017a9-c049-4baa-acc0-e08a25437c90-openstack-config\") pod \"a4c017a9-c049-4baa-acc0-e08a25437c90\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.151355 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4c017a9-c049-4baa-acc0-e08a25437c90-ssh-key\") pod \"a4c017a9-c049-4baa-acc0-e08a25437c90\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.151473 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a4c017a9-c049-4baa-acc0-e08a25437c90-ca-certs\") pod \"a4c017a9-c049-4baa-acc0-e08a25437c90\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.151500 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a4c017a9-c049-4baa-acc0-e08a25437c90-openstack-config-secret\") pod \"a4c017a9-c049-4baa-acc0-e08a25437c90\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.151525 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4c017a9-c049-4baa-acc0-e08a25437c90-config-data\") pod \"a4c017a9-c049-4baa-acc0-e08a25437c90\" (UID: \"a4c017a9-c049-4baa-acc0-e08a25437c90\") " Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.152582 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4c017a9-c049-4baa-acc0-e08a25437c90-test-operator-ephemeral-temporary" (OuterVolumeSpecName: 
"test-operator-ephemeral-temporary") pod "a4c017a9-c049-4baa-acc0-e08a25437c90" (UID: "a4c017a9-c049-4baa-acc0-e08a25437c90"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.152867 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4c017a9-c049-4baa-acc0-e08a25437c90-config-data" (OuterVolumeSpecName: "config-data") pod "a4c017a9-c049-4baa-acc0-e08a25437c90" (UID: "a4c017a9-c049-4baa-acc0-e08a25437c90"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.157993 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4c017a9-c049-4baa-acc0-e08a25437c90-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "a4c017a9-c049-4baa-acc0-e08a25437c90" (UID: "a4c017a9-c049-4baa-acc0-e08a25437c90"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.165892 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4c017a9-c049-4baa-acc0-e08a25437c90-kube-api-access-tv5x5" (OuterVolumeSpecName: "kube-api-access-tv5x5") pod "a4c017a9-c049-4baa-acc0-e08a25437c90" (UID: "a4c017a9-c049-4baa-acc0-e08a25437c90"). InnerVolumeSpecName "kube-api-access-tv5x5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.168663 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "test-operator-logs") pod "a4c017a9-c049-4baa-acc0-e08a25437c90" (UID: "a4c017a9-c049-4baa-acc0-e08a25437c90"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.194342 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c017a9-c049-4baa-acc0-e08a25437c90-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a4c017a9-c049-4baa-acc0-e08a25437c90" (UID: "a4c017a9-c049-4baa-acc0-e08a25437c90"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.206673 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c017a9-c049-4baa-acc0-e08a25437c90-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "a4c017a9-c049-4baa-acc0-e08a25437c90" (UID: "a4c017a9-c049-4baa-acc0-e08a25437c90"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.220547 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c017a9-c049-4baa-acc0-e08a25437c90-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "a4c017a9-c049-4baa-acc0-e08a25437c90" (UID: "a4c017a9-c049-4baa-acc0-e08a25437c90"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.243846 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4c017a9-c049-4baa-acc0-e08a25437c90-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "a4c017a9-c049-4baa-acc0-e08a25437c90" (UID: "a4c017a9-c049-4baa-acc0-e08a25437c90"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.253260 4810 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4c017a9-c049-4baa-acc0-e08a25437c90-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.253291 4810 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a4c017a9-c049-4baa-acc0-e08a25437c90-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.253304 4810 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a4c017a9-c049-4baa-acc0-e08a25437c90-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.253318 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4c017a9-c049-4baa-acc0-e08a25437c90-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.253349 4810 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a4c017a9-c049-4baa-acc0-e08a25437c90-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.253363 4810 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a4c017a9-c049-4baa-acc0-e08a25437c90-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.253407 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.253422 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv5x5\" (UniqueName: \"kubernetes.io/projected/a4c017a9-c049-4baa-acc0-e08a25437c90-kube-api-access-tv5x5\") on node \"crc\" DevicePath \"\"" Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.253434 4810 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a4c017a9-c049-4baa-acc0-e08a25437c90-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.282789 4810 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.354239 4810 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.726944 4810 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/tempest-tests-tempest" event={"ID":"a4c017a9-c049-4baa-acc0-e08a25437c90","Type":"ContainerDied","Data":"bc53d602c1ff8998f515238b989c9bf62ab660f688576459213d5fed2a632fb2"} Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.727009 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc53d602c1ff8998f515238b989c9bf62ab660f688576459213d5fed2a632fb2" Feb 19 16:41:04 crc kubenswrapper[4810]: I0219 16:41:04.727011 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 19 16:41:16 crc kubenswrapper[4810]: I0219 16:41:16.695134 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 19 16:41:16 crc kubenswrapper[4810]: E0219 16:41:16.697073 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81d6df67-4f89-42b8-9e19-2812ece77996" containerName="extract-utilities" Feb 19 16:41:16 crc kubenswrapper[4810]: I0219 16:41:16.697114 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="81d6df67-4f89-42b8-9e19-2812ece77996" containerName="extract-utilities" Feb 19 16:41:16 crc kubenswrapper[4810]: E0219 16:41:16.697136 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7403ee1-3e5c-4f08-a3ab-4eca75254405" containerName="registry-server" Feb 19 16:41:16 crc kubenswrapper[4810]: I0219 16:41:16.697146 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7403ee1-3e5c-4f08-a3ab-4eca75254405" containerName="registry-server" Feb 19 16:41:16 crc kubenswrapper[4810]: E0219 16:41:16.697161 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4c017a9-c049-4baa-acc0-e08a25437c90" containerName="tempest-tests-tempest-tests-runner" Feb 19 16:41:16 crc kubenswrapper[4810]: I0219 16:41:16.697169 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c017a9-c049-4baa-acc0-e08a25437c90" containerName="tempest-tests-tempest-tests-runner" Feb 19 16:41:16 crc kubenswrapper[4810]: E0219 16:41:16.697187 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7403ee1-3e5c-4f08-a3ab-4eca75254405" containerName="extract-utilities" Feb 19 16:41:16 crc kubenswrapper[4810]: I0219 16:41:16.697195 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7403ee1-3e5c-4f08-a3ab-4eca75254405" containerName="extract-utilities" Feb 19 16:41:16 crc kubenswrapper[4810]: E0219 16:41:16.697216 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7403ee1-3e5c-4f08-a3ab-4eca75254405" containerName="extract-content" Feb 19 16:41:16 crc kubenswrapper[4810]: I0219 16:41:16.697223 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7403ee1-3e5c-4f08-a3ab-4eca75254405" containerName="extract-content" Feb 19 16:41:16 crc kubenswrapper[4810]: E0219 16:41:16.697237 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81d6df67-4f89-42b8-9e19-2812ece77996" containerName="registry-server" Feb 19 16:41:16 crc kubenswrapper[4810]: I0219 16:41:16.697245 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="81d6df67-4f89-42b8-9e19-2812ece77996" containerName="registry-server" Feb 19 16:41:16 crc kubenswrapper[4810]: E0219 16:41:16.697281 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81d6df67-4f89-42b8-9e19-2812ece77996" containerName="extract-content" Feb 19 16:41:16 crc kubenswrapper[4810]: I0219 16:41:16.697290 4810 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="81d6df67-4f89-42b8-9e19-2812ece77996" containerName="extract-content" Feb 19 16:41:16 crc kubenswrapper[4810]: I0219 16:41:16.697523 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7403ee1-3e5c-4f08-a3ab-4eca75254405" containerName="registry-server" Feb 19 16:41:16 crc kubenswrapper[4810]: I0219 16:41:16.697539 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4c017a9-c049-4baa-acc0-e08a25437c90" containerName="tempest-tests-tempest-tests-runner" Feb 19 16:41:16 crc kubenswrapper[4810]: I0219 16:41:16.697554 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="81d6df67-4f89-42b8-9e19-2812ece77996" containerName="registry-server" Feb 19 16:41:16 crc kubenswrapper[4810]: I0219 16:41:16.698402 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 16:41:16 crc kubenswrapper[4810]: I0219 16:41:16.700802 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-hvstp" Feb 19 16:41:16 crc kubenswrapper[4810]: I0219 16:41:16.725423 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 19 16:41:16 crc kubenswrapper[4810]: I0219 16:41:16.834264 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdwwl\" (UniqueName: \"kubernetes.io/projected/d3094cb4-a81a-4f4f-b1d8-040c32bd3b2c-kube-api-access-qdwwl\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d3094cb4-a81a-4f4f-b1d8-040c32bd3b2c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 16:41:16 crc kubenswrapper[4810]: I0219 16:41:16.834596 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d3094cb4-a81a-4f4f-b1d8-040c32bd3b2c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 16:41:16 crc kubenswrapper[4810]: I0219 16:41:16.937097 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdwwl\" (UniqueName: \"kubernetes.io/projected/d3094cb4-a81a-4f4f-b1d8-040c32bd3b2c-kube-api-access-qdwwl\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d3094cb4-a81a-4f4f-b1d8-040c32bd3b2c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 16:41:16 crc kubenswrapper[4810]: I0219 16:41:16.937171 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d3094cb4-a81a-4f4f-b1d8-040c32bd3b2c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 16:41:16 crc kubenswrapper[4810]: I0219 16:41:16.937801 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d3094cb4-a81a-4f4f-b1d8-040c32bd3b2c\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 16:41:16 crc kubenswrapper[4810]: I0219 16:41:16.961034 
4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdwwl\" (UniqueName: \"kubernetes.io/projected/d3094cb4-a81a-4f4f-b1d8-040c32bd3b2c-kube-api-access-qdwwl\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d3094cb4-a81a-4f4f-b1d8-040c32bd3b2c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 16:41:16 crc kubenswrapper[4810]: I0219 16:41:16.994873 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d3094cb4-a81a-4f4f-b1d8-040c32bd3b2c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 16:41:17 crc kubenswrapper[4810]: I0219 16:41:17.027183 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 16:41:17 crc kubenswrapper[4810]: I0219 16:41:17.351117 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 19 16:41:17 crc kubenswrapper[4810]: I0219 16:41:17.876145 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"d3094cb4-a81a-4f4f-b1d8-040c32bd3b2c","Type":"ContainerStarted","Data":"faa4349e74e3218974b49ca21921c71165d9ca53eb17561d45afb4794f92a025"} Feb 19 16:41:18 crc kubenswrapper[4810]: I0219 16:41:18.890259 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"d3094cb4-a81a-4f4f-b1d8-040c32bd3b2c","Type":"ContainerStarted","Data":"7ba0abe443e2df1851b2a666811793ff90dd394a1196d027990cb52af7854375"} Feb 19 16:41:18 crc kubenswrapper[4810]: I0219 16:41:18.911963 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.952307276 podStartE2EDuration="2.911943688s" podCreationTimestamp="2026-02-19 16:41:16 +0000 UTC" firstStartedPulling="2026-02-19 16:41:17.352773059 +0000 UTC m=+5506.834803223" lastFinishedPulling="2026-02-19 16:41:18.312409501 +0000 UTC m=+5507.794439635" observedRunningTime="2026-02-19 16:41:18.907387558 +0000 UTC m=+5508.389417692" watchObservedRunningTime="2026-02-19 16:41:18.911943688 +0000 UTC m=+5508.393973822" Feb 19 16:41:19 crc kubenswrapper[4810]: I0219 16:41:19.537468 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 16:41:19 crc kubenswrapper[4810]: I0219 16:41:19.537531 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 16:41:43 crc kubenswrapper[4810]: I0219 16:41:43.847760 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mt7ln/must-gather-xtkm7"] Feb 19 16:41:43 crc kubenswrapper[4810]: I0219 16:41:43.849829 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mt7ln/must-gather-xtkm7" Feb 19 16:41:43 crc kubenswrapper[4810]: I0219 16:41:43.852347 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-mt7ln"/"default-dockercfg-6x4d7" Feb 19 16:41:43 crc kubenswrapper[4810]: I0219 16:41:43.852689 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-mt7ln"/"kube-root-ca.crt" Feb 19 16:41:43 crc kubenswrapper[4810]: I0219 16:41:43.855315 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-mt7ln"/"openshift-service-ca.crt" Feb 19 16:41:43 crc kubenswrapper[4810]: I0219 16:41:43.857934 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mt7ln/must-gather-xtkm7"] Feb 19 16:41:43 crc kubenswrapper[4810]: I0219 16:41:43.962492 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67dql\" (UniqueName: \"kubernetes.io/projected/2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994-kube-api-access-67dql\") pod \"must-gather-xtkm7\" (UID: \"2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994\") " pod="openshift-must-gather-mt7ln/must-gather-xtkm7" Feb 19 16:41:43 crc kubenswrapper[4810]: I0219 16:41:43.962839 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994-must-gather-output\") pod \"must-gather-xtkm7\" (UID: \"2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994\") " pod="openshift-must-gather-mt7ln/must-gather-xtkm7" Feb 19 16:41:44 crc kubenswrapper[4810]: I0219 16:41:44.065293 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67dql\" (UniqueName: \"kubernetes.io/projected/2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994-kube-api-access-67dql\") pod \"must-gather-xtkm7\" (UID: \"2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994\") " pod="openshift-must-gather-mt7ln/must-gather-xtkm7" Feb 19 16:41:44 crc kubenswrapper[4810]: I0219 16:41:44.065724 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994-must-gather-output\") pod \"must-gather-xtkm7\" (UID: \"2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994\") " pod="openshift-must-gather-mt7ln/must-gather-xtkm7" Feb 19 16:41:44 crc kubenswrapper[4810]: I0219 16:41:44.066094 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994-must-gather-output\") pod \"must-gather-xtkm7\" (UID: \"2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994\") " pod="openshift-must-gather-mt7ln/must-gather-xtkm7" Feb 19 16:41:44 crc kubenswrapper[4810]: I0219 16:41:44.586201 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67dql\" (UniqueName: \"kubernetes.io/projected/2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994-kube-api-access-67dql\") pod \"must-gather-xtkm7\" (UID: \"2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994\") " pod="openshift-must-gather-mt7ln/must-gather-xtkm7" Feb 19 16:41:44 crc kubenswrapper[4810]: I0219 16:41:44.771386 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mt7ln/must-gather-xtkm7" Feb 19 16:41:45 crc kubenswrapper[4810]: W0219 16:41:45.271560 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d8b3bd1_400c_4da2_a6d6_efa7d8bbd994.slice/crio-151c67b89f6c96a03e2c74c97afaae12904a2f90d765b36355e67b20a49b85e4 WatchSource:0}: Error finding container 151c67b89f6c96a03e2c74c97afaae12904a2f90d765b36355e67b20a49b85e4: Status 404 returned error can't find the container with id 151c67b89f6c96a03e2c74c97afaae12904a2f90d765b36355e67b20a49b85e4 Feb 19 16:41:45 crc kubenswrapper[4810]: I0219 16:41:45.277202 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mt7ln/must-gather-xtkm7"] Feb 19 16:41:46 crc kubenswrapper[4810]: I0219 16:41:46.225951 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mt7ln/must-gather-xtkm7" event={"ID":"2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994","Type":"ContainerStarted","Data":"151c67b89f6c96a03e2c74c97afaae12904a2f90d765b36355e67b20a49b85e4"} Feb 19 16:41:49 crc kubenswrapper[4810]: I0219 16:41:49.537267 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 16:41:49 crc kubenswrapper[4810]: I0219 16:41:49.537724 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 16:41:52 crc kubenswrapper[4810]: I0219 16:41:52.308707 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mt7ln/must-gather-xtkm7" event={"ID":"2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994","Type":"ContainerStarted","Data":"61e6a470548c19ba5e113ede1e2ed5b363e40d072f7a28ac5b76e32667d06ba8"} Feb 19 16:41:52 crc kubenswrapper[4810]: I0219 16:41:52.309591 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mt7ln/must-gather-xtkm7" event={"ID":"2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994","Type":"ContainerStarted","Data":"17449341bfb55cd42e3f6347fa4879ccfdfca14899e053d20de9d0995b28213d"} Feb 19 16:41:52 crc kubenswrapper[4810]: I0219 16:41:52.350048 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mt7ln/must-gather-xtkm7" podStartSLOduration=3.163753086 podStartE2EDuration="9.35002478s" podCreationTimestamp="2026-02-19 16:41:43 +0000 UTC" firstStartedPulling="2026-02-19 16:41:45.273974716 +0000 UTC m=+5534.756004840" lastFinishedPulling="2026-02-19 16:41:51.46024641 +0000 UTC m=+5540.942276534" observedRunningTime="2026-02-19 16:41:52.341909975 +0000 UTC m=+5541.823940109" watchObservedRunningTime="2026-02-19 16:41:52.35002478 +0000 UTC m=+5541.832054904" Feb 19 16:41:57 crc kubenswrapper[4810]: I0219 16:41:57.133712 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mt7ln/crc-debug-5z5rw"] Feb 19 16:41:57 crc kubenswrapper[4810]: I0219 16:41:57.136440 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mt7ln/crc-debug-5z5rw" Feb 19 16:41:57 crc kubenswrapper[4810]: I0219 16:41:57.298089 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80cd5f56-cd22-4f44-a387-03ac5132a14d-host\") pod \"crc-debug-5z5rw\" (UID: \"80cd5f56-cd22-4f44-a387-03ac5132a14d\") " pod="openshift-must-gather-mt7ln/crc-debug-5z5rw" Feb 19 16:41:57 crc kubenswrapper[4810]: I0219 16:41:57.298357 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwzqj\" (UniqueName: \"kubernetes.io/projected/80cd5f56-cd22-4f44-a387-03ac5132a14d-kube-api-access-xwzqj\") pod \"crc-debug-5z5rw\" (UID: \"80cd5f56-cd22-4f44-a387-03ac5132a14d\") " pod="openshift-must-gather-mt7ln/crc-debug-5z5rw" Feb 19 16:41:57 crc kubenswrapper[4810]: I0219 16:41:57.400840 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwzqj\" (UniqueName: \"kubernetes.io/projected/80cd5f56-cd22-4f44-a387-03ac5132a14d-kube-api-access-xwzqj\") pod \"crc-debug-5z5rw\" (UID: \"80cd5f56-cd22-4f44-a387-03ac5132a14d\") " pod="openshift-must-gather-mt7ln/crc-debug-5z5rw" Feb 19 16:41:57 crc kubenswrapper[4810]: I0219 16:41:57.401384 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80cd5f56-cd22-4f44-a387-03ac5132a14d-host\") pod \"crc-debug-5z5rw\" (UID: \"80cd5f56-cd22-4f44-a387-03ac5132a14d\") " pod="openshift-must-gather-mt7ln/crc-debug-5z5rw" Feb 19 16:41:57 crc kubenswrapper[4810]: I0219 16:41:57.401499 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80cd5f56-cd22-4f44-a387-03ac5132a14d-host\") pod \"crc-debug-5z5rw\" (UID: \"80cd5f56-cd22-4f44-a387-03ac5132a14d\") " pod="openshift-must-gather-mt7ln/crc-debug-5z5rw" Feb 19 16:41:57 crc kubenswrapper[4810]: I0219 16:41:57.421721 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwzqj\" (UniqueName: \"kubernetes.io/projected/80cd5f56-cd22-4f44-a387-03ac5132a14d-kube-api-access-xwzqj\") pod \"crc-debug-5z5rw\" (UID: \"80cd5f56-cd22-4f44-a387-03ac5132a14d\") " pod="openshift-must-gather-mt7ln/crc-debug-5z5rw" Feb 19 16:41:57 crc kubenswrapper[4810]: I0219 16:41:57.460531 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mt7ln/crc-debug-5z5rw" Feb 19 16:41:58 crc kubenswrapper[4810]: I0219 16:41:58.375379 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mt7ln/crc-debug-5z5rw" event={"ID":"80cd5f56-cd22-4f44-a387-03ac5132a14d","Type":"ContainerStarted","Data":"a39231f8e7be04f14e3096847bc871d04550c4e47e8f07c263cb25ad498189d6"} Feb 19 16:42:08 crc kubenswrapper[4810]: I0219 16:42:08.465664 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mt7ln/crc-debug-5z5rw" event={"ID":"80cd5f56-cd22-4f44-a387-03ac5132a14d","Type":"ContainerStarted","Data":"a07c763af809aeaa0f606507ff13030276f05d8d690e2a7ca0fe044711b225e6"} Feb 19 16:42:08 crc kubenswrapper[4810]: I0219 16:42:08.483958 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mt7ln/crc-debug-5z5rw" podStartSLOduration=1.6297250920000002 podStartE2EDuration="11.483941087s" podCreationTimestamp="2026-02-19 16:41:57 +0000 UTC" firstStartedPulling="2026-02-19 16:41:57.513288956 +0000 UTC m=+5546.995319080" lastFinishedPulling="2026-02-19 16:42:07.367504941 +0000 UTC m=+5556.849535075" observedRunningTime="2026-02-19 16:42:08.480492424 +0000 UTC m=+5557.962522558" watchObservedRunningTime="2026-02-19 16:42:08.483941087 +0000 UTC m=+5557.965971221" Feb 19 16:42:09 crc kubenswrapper[4810]: I0219 16:42:09.900072 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9lqkv"] Feb 19 16:42:09 crc kubenswrapper[4810]: I0219 16:42:09.902460 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9lqkv" Feb 19 16:42:09 crc kubenswrapper[4810]: I0219 16:42:09.918651 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9lqkv"] Feb 19 16:42:09 crc kubenswrapper[4810]: I0219 16:42:09.998969 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db10b4b7-0ea5-46ad-ad3f-8618c5c820bb-catalog-content\") pod \"redhat-marketplace-9lqkv\" (UID: \"db10b4b7-0ea5-46ad-ad3f-8618c5c820bb\") " pod="openshift-marketplace/redhat-marketplace-9lqkv" Feb 19 16:42:09 crc kubenswrapper[4810]: I0219 16:42:09.999050 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4k24\" (UniqueName: \"kubernetes.io/projected/db10b4b7-0ea5-46ad-ad3f-8618c5c820bb-kube-api-access-n4k24\") pod \"redhat-marketplace-9lqkv\" (UID: \"db10b4b7-0ea5-46ad-ad3f-8618c5c820bb\") " pod="openshift-marketplace/redhat-marketplace-9lqkv" Feb 19 16:42:09 crc kubenswrapper[4810]: I0219 16:42:09.999120 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db10b4b7-0ea5-46ad-ad3f-8618c5c820bb-utilities\") pod \"redhat-marketplace-9lqkv\" (UID: \"db10b4b7-0ea5-46ad-ad3f-8618c5c820bb\") " pod="openshift-marketplace/redhat-marketplace-9lqkv" Feb 19 16:42:10 crc kubenswrapper[4810]: I0219 16:42:10.100999 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db10b4b7-0ea5-46ad-ad3f-8618c5c820bb-catalog-content\") pod \"redhat-marketplace-9lqkv\" (UID: \"db10b4b7-0ea5-46ad-ad3f-8618c5c820bb\") " pod="openshift-marketplace/redhat-marketplace-9lqkv" Feb 19 16:42:10 crc 
kubenswrapper[4810]: I0219 16:42:10.101605 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db10b4b7-0ea5-46ad-ad3f-8618c5c820bb-catalog-content\") pod \"redhat-marketplace-9lqkv\" (UID: \"db10b4b7-0ea5-46ad-ad3f-8618c5c820bb\") " pod="openshift-marketplace/redhat-marketplace-9lqkv" Feb 19 16:42:10 crc kubenswrapper[4810]: I0219 16:42:10.101814 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4k24\" (UniqueName: \"kubernetes.io/projected/db10b4b7-0ea5-46ad-ad3f-8618c5c820bb-kube-api-access-n4k24\") pod \"redhat-marketplace-9lqkv\" (UID: \"db10b4b7-0ea5-46ad-ad3f-8618c5c820bb\") " pod="openshift-marketplace/redhat-marketplace-9lqkv" Feb 19 16:42:10 crc kubenswrapper[4810]: I0219 16:42:10.102220 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db10b4b7-0ea5-46ad-ad3f-8618c5c820bb-utilities\") pod \"redhat-marketplace-9lqkv\" (UID: \"db10b4b7-0ea5-46ad-ad3f-8618c5c820bb\") " pod="openshift-marketplace/redhat-marketplace-9lqkv" Feb 19 16:42:10 crc kubenswrapper[4810]: I0219 16:42:10.102601 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db10b4b7-0ea5-46ad-ad3f-8618c5c820bb-utilities\") pod \"redhat-marketplace-9lqkv\" (UID: \"db10b4b7-0ea5-46ad-ad3f-8618c5c820bb\") " pod="openshift-marketplace/redhat-marketplace-9lqkv" Feb 19 16:42:10 crc kubenswrapper[4810]: I0219 16:42:10.123978 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4k24\" (UniqueName: \"kubernetes.io/projected/db10b4b7-0ea5-46ad-ad3f-8618c5c820bb-kube-api-access-n4k24\") pod \"redhat-marketplace-9lqkv\" (UID: \"db10b4b7-0ea5-46ad-ad3f-8618c5c820bb\") " pod="openshift-marketplace/redhat-marketplace-9lqkv" Feb 19 16:42:10 crc kubenswrapper[4810]: I0219 16:42:10.218881 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9lqkv" Feb 19 16:42:10 crc kubenswrapper[4810]: I0219 16:42:10.714745 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9lqkv"] Feb 19 16:42:11 crc kubenswrapper[4810]: I0219 16:42:11.521084 4810 generic.go:334] "Generic (PLEG): container finished" podID="db10b4b7-0ea5-46ad-ad3f-8618c5c820bb" containerID="7f5494bd61ffab00fb7a4776204e698af55418ab9763f28dfbefa06a73f560d9" exitCode=0 Feb 19 16:42:11 crc kubenswrapper[4810]: I0219 16:42:11.521338 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9lqkv" event={"ID":"db10b4b7-0ea5-46ad-ad3f-8618c5c820bb","Type":"ContainerDied","Data":"7f5494bd61ffab00fb7a4776204e698af55418ab9763f28dfbefa06a73f560d9"} Feb 19 16:42:11 crc kubenswrapper[4810]: I0219 16:42:11.521369 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9lqkv" event={"ID":"db10b4b7-0ea5-46ad-ad3f-8618c5c820bb","Type":"ContainerStarted","Data":"f3e83be118c02d536c510ef889a7296ac60084c7ad5f21a05fb5606bf16edf72"} Feb 19 16:42:14 crc kubenswrapper[4810]: I0219 16:42:14.556700 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9lqkv" event={"ID":"db10b4b7-0ea5-46ad-ad3f-8618c5c820bb","Type":"ContainerStarted","Data":"a4fcdca7c10e1120d38e6f09a02741be0d99fe0d553a926c6932f130b77917ef"} Feb 19 16:42:15 crc kubenswrapper[4810]: I0219 16:42:15.567803 4810 generic.go:334] "Generic (PLEG): container finished" podID="db10b4b7-0ea5-46ad-ad3f-8618c5c820bb" containerID="a4fcdca7c10e1120d38e6f09a02741be0d99fe0d553a926c6932f130b77917ef" exitCode=0 Feb 19 16:42:15 crc kubenswrapper[4810]: I0219 16:42:15.567864 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9lqkv" event={"ID":"db10b4b7-0ea5-46ad-ad3f-8618c5c820bb","Type":"ContainerDied","Data":"a4fcdca7c10e1120d38e6f09a02741be0d99fe0d553a926c6932f130b77917ef"} Feb 19 16:42:16 crc kubenswrapper[4810]: I0219 16:42:16.583161 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9lqkv" event={"ID":"db10b4b7-0ea5-46ad-ad3f-8618c5c820bb","Type":"ContainerStarted","Data":"9a5cc97dac182042a875d1ceb6bf4e9c3ca19f53752e02d2e0116925aa456620"} Feb 19 16:42:16 crc kubenswrapper[4810]: I0219 16:42:16.610016 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9lqkv" podStartSLOduration=4.772299823 podStartE2EDuration="7.609995897s" podCreationTimestamp="2026-02-19 16:42:09 +0000 UTC" firstStartedPulling="2026-02-19 16:42:13.152859477 +0000 UTC m=+5562.634889601" lastFinishedPulling="2026-02-19 16:42:15.990555551 +0000 UTC m=+5565.472585675" observedRunningTime="2026-02-19 16:42:16.604427983 +0000 UTC m=+5566.086458117" watchObservedRunningTime="2026-02-19 16:42:16.609995897 +0000 UTC m=+5566.092026031" Feb 19 16:42:19 crc kubenswrapper[4810]: I0219 16:42:19.537223 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 16:42:19 crc kubenswrapper[4810]: I0219 16:42:19.538017 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" 
podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 16:42:19 crc kubenswrapper[4810]: I0219 16:42:19.538086 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 16:42:19 crc kubenswrapper[4810]: I0219 16:42:19.539195 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b8edda1f9342dacaf59cc4da7ca5a5e6fa7b1be00b10f0e3774c25fffdb24625"} pod="openshift-machine-config-operator/machine-config-daemon-t499d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 16:42:19 crc kubenswrapper[4810]: I0219 16:42:19.539283 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" containerID="cri-o://b8edda1f9342dacaf59cc4da7ca5a5e6fa7b1be00b10f0e3774c25fffdb24625" gracePeriod=600 Feb 19 16:42:20 crc kubenswrapper[4810]: I0219 16:42:20.219396 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9lqkv" Feb 19 16:42:20 crc kubenswrapper[4810]: I0219 16:42:20.219729 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9lqkv" Feb 19 16:42:20 crc kubenswrapper[4810]: I0219 16:42:20.270000 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9lqkv" Feb 19 16:42:20 crc kubenswrapper[4810]: I0219 16:42:20.621692 4810 generic.go:334] "Generic (PLEG): container finished" podID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerID="b8edda1f9342dacaf59cc4da7ca5a5e6fa7b1be00b10f0e3774c25fffdb24625" exitCode=0 Feb 19 16:42:20 crc kubenswrapper[4810]: I0219 16:42:20.621736 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerDied","Data":"b8edda1f9342dacaf59cc4da7ca5a5e6fa7b1be00b10f0e3774c25fffdb24625"} Feb 19 16:42:20 crc kubenswrapper[4810]: I0219 16:42:20.622014 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerStarted","Data":"63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e"} Feb 19 16:42:20 crc kubenswrapper[4810]: I0219 16:42:20.622055 4810 scope.go:117] "RemoveContainer" containerID="c54befe76a9fc3a2a84e7b2c94c4037774045172c204f2e9cb88741a51f735a3" Feb 19 16:42:30 crc kubenswrapper[4810]: I0219 16:42:30.277993 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9lqkv" Feb 19 16:42:30 crc kubenswrapper[4810]: I0219 16:42:30.356163 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9lqkv"] Feb 19 16:42:30 crc kubenswrapper[4810]: I0219 16:42:30.747651 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9lqkv" podUID="db10b4b7-0ea5-46ad-ad3f-8618c5c820bb" containerName="registry-server" 
containerID="cri-o://9a5cc97dac182042a875d1ceb6bf4e9c3ca19f53752e02d2e0116925aa456620" gracePeriod=2 Feb 19 16:42:31 crc kubenswrapper[4810]: I0219 16:42:31.313966 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9lqkv" Feb 19 16:42:31 crc kubenswrapper[4810]: I0219 16:42:31.448399 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db10b4b7-0ea5-46ad-ad3f-8618c5c820bb-catalog-content\") pod \"db10b4b7-0ea5-46ad-ad3f-8618c5c820bb\" (UID: \"db10b4b7-0ea5-46ad-ad3f-8618c5c820bb\") " Feb 19 16:42:31 crc kubenswrapper[4810]: I0219 16:42:31.448490 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db10b4b7-0ea5-46ad-ad3f-8618c5c820bb-utilities\") pod \"db10b4b7-0ea5-46ad-ad3f-8618c5c820bb\" (UID: \"db10b4b7-0ea5-46ad-ad3f-8618c5c820bb\") " Feb 19 16:42:31 crc kubenswrapper[4810]: I0219 16:42:31.448647 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4k24\" (UniqueName: \"kubernetes.io/projected/db10b4b7-0ea5-46ad-ad3f-8618c5c820bb-kube-api-access-n4k24\") pod \"db10b4b7-0ea5-46ad-ad3f-8618c5c820bb\" (UID: \"db10b4b7-0ea5-46ad-ad3f-8618c5c820bb\") " Feb 19 16:42:31 crc kubenswrapper[4810]: I0219 16:42:31.449544 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db10b4b7-0ea5-46ad-ad3f-8618c5c820bb-utilities" (OuterVolumeSpecName: "utilities") pod "db10b4b7-0ea5-46ad-ad3f-8618c5c820bb" (UID: "db10b4b7-0ea5-46ad-ad3f-8618c5c820bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:42:31 crc kubenswrapper[4810]: I0219 16:42:31.462140 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db10b4b7-0ea5-46ad-ad3f-8618c5c820bb-kube-api-access-n4k24" (OuterVolumeSpecName: "kube-api-access-n4k24") pod "db10b4b7-0ea5-46ad-ad3f-8618c5c820bb" (UID: "db10b4b7-0ea5-46ad-ad3f-8618c5c820bb"). InnerVolumeSpecName "kube-api-access-n4k24". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:42:31 crc kubenswrapper[4810]: I0219 16:42:31.478129 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db10b4b7-0ea5-46ad-ad3f-8618c5c820bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db10b4b7-0ea5-46ad-ad3f-8618c5c820bb" (UID: "db10b4b7-0ea5-46ad-ad3f-8618c5c820bb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:42:31 crc kubenswrapper[4810]: I0219 16:42:31.551692 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db10b4b7-0ea5-46ad-ad3f-8618c5c820bb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 16:42:31 crc kubenswrapper[4810]: I0219 16:42:31.551727 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db10b4b7-0ea5-46ad-ad3f-8618c5c820bb-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 16:42:31 crc kubenswrapper[4810]: I0219 16:42:31.551738 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4k24\" (UniqueName: \"kubernetes.io/projected/db10b4b7-0ea5-46ad-ad3f-8618c5c820bb-kube-api-access-n4k24\") on node \"crc\" DevicePath \"\"" Feb 19 16:42:31 crc kubenswrapper[4810]: I0219 16:42:31.757666 4810 generic.go:334] "Generic (PLEG): container finished" podID="db10b4b7-0ea5-46ad-ad3f-8618c5c820bb" containerID="9a5cc97dac182042a875d1ceb6bf4e9c3ca19f53752e02d2e0116925aa456620" exitCode=0 Feb 19 16:42:31 crc kubenswrapper[4810]: I0219 16:42:31.757707 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9lqkv" event={"ID":"db10b4b7-0ea5-46ad-ad3f-8618c5c820bb","Type":"ContainerDied","Data":"9a5cc97dac182042a875d1ceb6bf4e9c3ca19f53752e02d2e0116925aa456620"} Feb 19 16:42:31 crc kubenswrapper[4810]: I0219 16:42:31.757747 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9lqkv" event={"ID":"db10b4b7-0ea5-46ad-ad3f-8618c5c820bb","Type":"ContainerDied","Data":"f3e83be118c02d536c510ef889a7296ac60084c7ad5f21a05fb5606bf16edf72"} Feb 19 16:42:31 crc kubenswrapper[4810]: I0219 16:42:31.757765 4810 scope.go:117] "RemoveContainer" containerID="9a5cc97dac182042a875d1ceb6bf4e9c3ca19f53752e02d2e0116925aa456620" Feb 19 16:42:31 crc kubenswrapper[4810]: I0219 16:42:31.757723 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9lqkv" Feb 19 16:42:31 crc kubenswrapper[4810]: I0219 16:42:31.779883 4810 scope.go:117] "RemoveContainer" containerID="a4fcdca7c10e1120d38e6f09a02741be0d99fe0d553a926c6932f130b77917ef" Feb 19 16:42:31 crc kubenswrapper[4810]: I0219 16:42:31.799084 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9lqkv"] Feb 19 16:42:31 crc kubenswrapper[4810]: I0219 16:42:31.810150 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9lqkv"] Feb 19 16:42:31 crc kubenswrapper[4810]: I0219 16:42:31.810496 4810 scope.go:117] "RemoveContainer" containerID="7f5494bd61ffab00fb7a4776204e698af55418ab9763f28dfbefa06a73f560d9" Feb 19 16:42:31 crc kubenswrapper[4810]: I0219 16:42:31.861877 4810 scope.go:117] "RemoveContainer" containerID="9a5cc97dac182042a875d1ceb6bf4e9c3ca19f53752e02d2e0116925aa456620" Feb 19 16:42:31 crc kubenswrapper[4810]: E0219 16:42:31.862702 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a5cc97dac182042a875d1ceb6bf4e9c3ca19f53752e02d2e0116925aa456620\": container with ID starting with 9a5cc97dac182042a875d1ceb6bf4e9c3ca19f53752e02d2e0116925aa456620 not found: ID does not exist" containerID="9a5cc97dac182042a875d1ceb6bf4e9c3ca19f53752e02d2e0116925aa456620" Feb 19 16:42:31 crc kubenswrapper[4810]: I0219 16:42:31.862744 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a5cc97dac182042a875d1ceb6bf4e9c3ca19f53752e02d2e0116925aa456620"} err="failed to get container status \"9a5cc97dac182042a875d1ceb6bf4e9c3ca19f53752e02d2e0116925aa456620\": rpc error: code = NotFound desc = could not find container \"9a5cc97dac182042a875d1ceb6bf4e9c3ca19f53752e02d2e0116925aa456620\": container with ID starting with 9a5cc97dac182042a875d1ceb6bf4e9c3ca19f53752e02d2e0116925aa456620 not found: ID does not exist" Feb 19 16:42:31 crc kubenswrapper[4810]: I0219 16:42:31.862771 4810 scope.go:117] "RemoveContainer" containerID="a4fcdca7c10e1120d38e6f09a02741be0d99fe0d553a926c6932f130b77917ef" Feb 19 16:42:31 crc kubenswrapper[4810]: E0219 16:42:31.863107 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4fcdca7c10e1120d38e6f09a02741be0d99fe0d553a926c6932f130b77917ef\": container with ID starting with a4fcdca7c10e1120d38e6f09a02741be0d99fe0d553a926c6932f130b77917ef not found: ID does not exist" containerID="a4fcdca7c10e1120d38e6f09a02741be0d99fe0d553a926c6932f130b77917ef" Feb 19 16:42:31 crc kubenswrapper[4810]: I0219 16:42:31.863147 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4fcdca7c10e1120d38e6f09a02741be0d99fe0d553a926c6932f130b77917ef"} err="failed to get container status \"a4fcdca7c10e1120d38e6f09a02741be0d99fe0d553a926c6932f130b77917ef\": rpc error: code = NotFound desc = could not find container \"a4fcdca7c10e1120d38e6f09a02741be0d99fe0d553a926c6932f130b77917ef\": container with ID starting with a4fcdca7c10e1120d38e6f09a02741be0d99fe0d553a926c6932f130b77917ef not found: ID does not exist" Feb 19 16:42:31 crc kubenswrapper[4810]: I0219 16:42:31.863178 4810 scope.go:117] "RemoveContainer" containerID="7f5494bd61ffab00fb7a4776204e698af55418ab9763f28dfbefa06a73f560d9" Feb 19 16:42:31 crc kubenswrapper[4810]: E0219 16:42:31.863718 4810 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7f5494bd61ffab00fb7a4776204e698af55418ab9763f28dfbefa06a73f560d9\": container with ID starting with 7f5494bd61ffab00fb7a4776204e698af55418ab9763f28dfbefa06a73f560d9 not found: ID does not exist" containerID="7f5494bd61ffab00fb7a4776204e698af55418ab9763f28dfbefa06a73f560d9" Feb 19 16:42:31 crc kubenswrapper[4810]: I0219 16:42:31.863754 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f5494bd61ffab00fb7a4776204e698af55418ab9763f28dfbefa06a73f560d9"} err="failed to get container status \"7f5494bd61ffab00fb7a4776204e698af55418ab9763f28dfbefa06a73f560d9\": rpc error: code = NotFound desc = could not find container \"7f5494bd61ffab00fb7a4776204e698af55418ab9763f28dfbefa06a73f560d9\": container with ID starting with 7f5494bd61ffab00fb7a4776204e698af55418ab9763f28dfbefa06a73f560d9 not found: ID does not exist" Feb 19 16:42:33 crc kubenswrapper[4810]: I0219 16:42:33.456485 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db10b4b7-0ea5-46ad-ad3f-8618c5c820bb" path="/var/lib/kubelet/pods/db10b4b7-0ea5-46ad-ad3f-8618c5c820bb/volumes" Feb 19 16:42:51 crc kubenswrapper[4810]: I0219 16:42:51.970600 4810 generic.go:334] "Generic (PLEG): container finished" podID="80cd5f56-cd22-4f44-a387-03ac5132a14d" containerID="a07c763af809aeaa0f606507ff13030276f05d8d690e2a7ca0fe044711b225e6" exitCode=0 Feb 19 16:42:51 crc kubenswrapper[4810]: I0219 16:42:51.970698 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mt7ln/crc-debug-5z5rw" event={"ID":"80cd5f56-cd22-4f44-a387-03ac5132a14d","Type":"ContainerDied","Data":"a07c763af809aeaa0f606507ff13030276f05d8d690e2a7ca0fe044711b225e6"} Feb 19 16:42:53 crc kubenswrapper[4810]: I0219 16:42:53.099257 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mt7ln/crc-debug-5z5rw" Feb 19 16:42:53 crc kubenswrapper[4810]: I0219 16:42:53.138187 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mt7ln/crc-debug-5z5rw"] Feb 19 16:42:53 crc kubenswrapper[4810]: I0219 16:42:53.147191 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mt7ln/crc-debug-5z5rw"] Feb 19 16:42:53 crc kubenswrapper[4810]: I0219 16:42:53.228048 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80cd5f56-cd22-4f44-a387-03ac5132a14d-host\") pod \"80cd5f56-cd22-4f44-a387-03ac5132a14d\" (UID: \"80cd5f56-cd22-4f44-a387-03ac5132a14d\") " Feb 19 16:42:53 crc kubenswrapper[4810]: I0219 16:42:53.228108 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwzqj\" (UniqueName: \"kubernetes.io/projected/80cd5f56-cd22-4f44-a387-03ac5132a14d-kube-api-access-xwzqj\") pod \"80cd5f56-cd22-4f44-a387-03ac5132a14d\" (UID: \"80cd5f56-cd22-4f44-a387-03ac5132a14d\") " Feb 19 16:42:53 crc kubenswrapper[4810]: I0219 16:42:53.228213 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/80cd5f56-cd22-4f44-a387-03ac5132a14d-host" (OuterVolumeSpecName: "host") pod "80cd5f56-cd22-4f44-a387-03ac5132a14d" (UID: "80cd5f56-cd22-4f44-a387-03ac5132a14d"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 16:42:53 crc kubenswrapper[4810]: I0219 16:42:53.228946 4810 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80cd5f56-cd22-4f44-a387-03ac5132a14d-host\") on node \"crc\" DevicePath \"\"" Feb 19 16:42:53 crc kubenswrapper[4810]: I0219 16:42:53.239527 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80cd5f56-cd22-4f44-a387-03ac5132a14d-kube-api-access-xwzqj" (OuterVolumeSpecName: "kube-api-access-xwzqj") pod "80cd5f56-cd22-4f44-a387-03ac5132a14d" (UID: "80cd5f56-cd22-4f44-a387-03ac5132a14d"). InnerVolumeSpecName "kube-api-access-xwzqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:42:53 crc kubenswrapper[4810]: I0219 16:42:53.330987 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwzqj\" (UniqueName: \"kubernetes.io/projected/80cd5f56-cd22-4f44-a387-03ac5132a14d-kube-api-access-xwzqj\") on node \"crc\" DevicePath \"\"" Feb 19 16:42:53 crc kubenswrapper[4810]: I0219 16:42:53.454523 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80cd5f56-cd22-4f44-a387-03ac5132a14d" path="/var/lib/kubelet/pods/80cd5f56-cd22-4f44-a387-03ac5132a14d/volumes" Feb 19 16:42:53 crc kubenswrapper[4810]: I0219 16:42:53.992826 4810 scope.go:117] "RemoveContainer" containerID="a07c763af809aeaa0f606507ff13030276f05d8d690e2a7ca0fe044711b225e6" Feb 19 16:42:53 crc kubenswrapper[4810]: I0219 16:42:53.992879 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mt7ln/crc-debug-5z5rw" Feb 19 16:42:54 crc kubenswrapper[4810]: I0219 16:42:54.370023 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mt7ln/crc-debug-q6p74"] Feb 19 16:42:54 crc kubenswrapper[4810]: E0219 16:42:54.370925 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db10b4b7-0ea5-46ad-ad3f-8618c5c820bb" containerName="registry-server" Feb 19 16:42:54 crc kubenswrapper[4810]: I0219 16:42:54.370948 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="db10b4b7-0ea5-46ad-ad3f-8618c5c820bb" containerName="registry-server" Feb 19 16:42:54 crc kubenswrapper[4810]: E0219 16:42:54.370980 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db10b4b7-0ea5-46ad-ad3f-8618c5c820bb" containerName="extract-utilities" Feb 19 16:42:54 crc kubenswrapper[4810]: I0219 16:42:54.370993 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="db10b4b7-0ea5-46ad-ad3f-8618c5c820bb" containerName="extract-utilities" Feb 19 16:42:54 crc kubenswrapper[4810]: E0219 16:42:54.371013 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80cd5f56-cd22-4f44-a387-03ac5132a14d" containerName="container-00" Feb 19 16:42:54 crc kubenswrapper[4810]: I0219 16:42:54.371063 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="80cd5f56-cd22-4f44-a387-03ac5132a14d" containerName="container-00" Feb 19 16:42:54 crc kubenswrapper[4810]: E0219 16:42:54.371088 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db10b4b7-0ea5-46ad-ad3f-8618c5c820bb" containerName="extract-content" Feb 19 16:42:54 crc kubenswrapper[4810]: I0219 16:42:54.371097 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="db10b4b7-0ea5-46ad-ad3f-8618c5c820bb" containerName="extract-content" Feb 19 16:42:54 crc kubenswrapper[4810]: I0219 16:42:54.371427 4810 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="80cd5f56-cd22-4f44-a387-03ac5132a14d" containerName="container-00" Feb 19 16:42:54 crc kubenswrapper[4810]: I0219 16:42:54.371460 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="db10b4b7-0ea5-46ad-ad3f-8618c5c820bb" containerName="registry-server" Feb 19 16:42:54 crc kubenswrapper[4810]: I0219 16:42:54.372493 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mt7ln/crc-debug-q6p74" Feb 19 16:42:54 crc kubenswrapper[4810]: I0219 16:42:54.560866 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z28r\" (UniqueName: \"kubernetes.io/projected/0f51ddeb-2d6c-440a-9629-fd9894243a23-kube-api-access-6z28r\") pod \"crc-debug-q6p74\" (UID: \"0f51ddeb-2d6c-440a-9629-fd9894243a23\") " pod="openshift-must-gather-mt7ln/crc-debug-q6p74" Feb 19 16:42:54 crc kubenswrapper[4810]: I0219 16:42:54.561632 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f51ddeb-2d6c-440a-9629-fd9894243a23-host\") pod \"crc-debug-q6p74\" (UID: \"0f51ddeb-2d6c-440a-9629-fd9894243a23\") " pod="openshift-must-gather-mt7ln/crc-debug-q6p74" Feb 19 16:42:54 crc kubenswrapper[4810]: I0219 16:42:54.663444 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f51ddeb-2d6c-440a-9629-fd9894243a23-host\") pod \"crc-debug-q6p74\" (UID: \"0f51ddeb-2d6c-440a-9629-fd9894243a23\") " pod="openshift-must-gather-mt7ln/crc-debug-q6p74" Feb 19 16:42:54 crc kubenswrapper[4810]: I0219 16:42:54.663659 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z28r\" (UniqueName: \"kubernetes.io/projected/0f51ddeb-2d6c-440a-9629-fd9894243a23-kube-api-access-6z28r\") pod \"crc-debug-q6p74\" (UID: \"0f51ddeb-2d6c-440a-9629-fd9894243a23\") " pod="openshift-must-gather-mt7ln/crc-debug-q6p74" Feb 19 16:42:54 crc kubenswrapper[4810]: I0219 16:42:54.664113 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f51ddeb-2d6c-440a-9629-fd9894243a23-host\") pod \"crc-debug-q6p74\" (UID: \"0f51ddeb-2d6c-440a-9629-fd9894243a23\") " pod="openshift-must-gather-mt7ln/crc-debug-q6p74" Feb 19 16:42:54 crc kubenswrapper[4810]: I0219 16:42:54.682856 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z28r\" (UniqueName: \"kubernetes.io/projected/0f51ddeb-2d6c-440a-9629-fd9894243a23-kube-api-access-6z28r\") pod \"crc-debug-q6p74\" (UID: \"0f51ddeb-2d6c-440a-9629-fd9894243a23\") " pod="openshift-must-gather-mt7ln/crc-debug-q6p74" Feb 19 16:42:54 crc kubenswrapper[4810]: I0219 16:42:54.692672 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mt7ln/crc-debug-q6p74" Feb 19 16:42:55 crc kubenswrapper[4810]: I0219 16:42:55.007792 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mt7ln/crc-debug-q6p74" event={"ID":"0f51ddeb-2d6c-440a-9629-fd9894243a23","Type":"ContainerStarted","Data":"8eab70b763bfc660b31c0a60ab6d66eddd4b83e4bf6e5028c9b6b68f68641100"} Feb 19 16:42:55 crc kubenswrapper[4810]: I0219 16:42:55.007857 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mt7ln/crc-debug-q6p74" event={"ID":"0f51ddeb-2d6c-440a-9629-fd9894243a23","Type":"ContainerStarted","Data":"4496e95ad24f39e6c752124e7013fd290a20930dfd3410e4845de35db32a4353"} Feb 19 16:42:55 crc kubenswrapper[4810]: I0219 16:42:55.024767 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mt7ln/crc-debug-q6p74" podStartSLOduration=1.024754975 podStartE2EDuration="1.024754975s" podCreationTimestamp="2026-02-19 16:42:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 16:42:55.017688105 +0000 UTC m=+5604.499718229" watchObservedRunningTime="2026-02-19 16:42:55.024754975 +0000 UTC m=+5604.506785099" Feb 19 16:42:56 crc kubenswrapper[4810]: I0219 16:42:56.017004 4810 generic.go:334] "Generic (PLEG): container finished" podID="0f51ddeb-2d6c-440a-9629-fd9894243a23" containerID="8eab70b763bfc660b31c0a60ab6d66eddd4b83e4bf6e5028c9b6b68f68641100" exitCode=0 Feb 19 16:42:56 crc kubenswrapper[4810]: I0219 16:42:56.017182 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mt7ln/crc-debug-q6p74" event={"ID":"0f51ddeb-2d6c-440a-9629-fd9894243a23","Type":"ContainerDied","Data":"8eab70b763bfc660b31c0a60ab6d66eddd4b83e4bf6e5028c9b6b68f68641100"} Feb 19 16:42:57 crc kubenswrapper[4810]: I0219 16:42:57.623281 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mt7ln/crc-debug-q6p74" Feb 19 16:42:57 crc kubenswrapper[4810]: I0219 16:42:57.657158 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mt7ln/crc-debug-q6p74"] Feb 19 16:42:57 crc kubenswrapper[4810]: I0219 16:42:57.667850 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mt7ln/crc-debug-q6p74"] Feb 19 16:42:57 crc kubenswrapper[4810]: I0219 16:42:57.721723 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f51ddeb-2d6c-440a-9629-fd9894243a23-host\") pod \"0f51ddeb-2d6c-440a-9629-fd9894243a23\" (UID: \"0f51ddeb-2d6c-440a-9629-fd9894243a23\") " Feb 19 16:42:57 crc kubenswrapper[4810]: I0219 16:42:57.721861 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f51ddeb-2d6c-440a-9629-fd9894243a23-host" (OuterVolumeSpecName: "host") pod "0f51ddeb-2d6c-440a-9629-fd9894243a23" (UID: "0f51ddeb-2d6c-440a-9629-fd9894243a23"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 16:42:57 crc kubenswrapper[4810]: I0219 16:42:57.721999 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z28r\" (UniqueName: \"kubernetes.io/projected/0f51ddeb-2d6c-440a-9629-fd9894243a23-kube-api-access-6z28r\") pod \"0f51ddeb-2d6c-440a-9629-fd9894243a23\" (UID: \"0f51ddeb-2d6c-440a-9629-fd9894243a23\") " Feb 19 16:42:57 crc kubenswrapper[4810]: I0219 16:42:57.722403 4810 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f51ddeb-2d6c-440a-9629-fd9894243a23-host\") on node \"crc\" DevicePath \"\"" Feb 19 16:42:57 crc kubenswrapper[4810]: I0219 16:42:57.728529 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f51ddeb-2d6c-440a-9629-fd9894243a23-kube-api-access-6z28r" (OuterVolumeSpecName: "kube-api-access-6z28r") pod "0f51ddeb-2d6c-440a-9629-fd9894243a23" (UID: "0f51ddeb-2d6c-440a-9629-fd9894243a23"). InnerVolumeSpecName "kube-api-access-6z28r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:42:57 crc kubenswrapper[4810]: I0219 16:42:57.823735 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z28r\" (UniqueName: \"kubernetes.io/projected/0f51ddeb-2d6c-440a-9629-fd9894243a23-kube-api-access-6z28r\") on node \"crc\" DevicePath \"\"" Feb 19 16:42:58 crc kubenswrapper[4810]: I0219 16:42:58.034225 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4496e95ad24f39e6c752124e7013fd290a20930dfd3410e4845de35db32a4353" Feb 19 16:42:58 crc kubenswrapper[4810]: I0219 16:42:58.034277 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mt7ln/crc-debug-q6p74" Feb 19 16:42:58 crc kubenswrapper[4810]: I0219 16:42:58.827489 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mt7ln/crc-debug-kd2tv"] Feb 19 16:42:58 crc kubenswrapper[4810]: E0219 16:42:58.828104 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f51ddeb-2d6c-440a-9629-fd9894243a23" containerName="container-00" Feb 19 16:42:58 crc kubenswrapper[4810]: I0219 16:42:58.828115 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f51ddeb-2d6c-440a-9629-fd9894243a23" containerName="container-00" Feb 19 16:42:58 crc kubenswrapper[4810]: I0219 16:42:58.828328 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f51ddeb-2d6c-440a-9629-fd9894243a23" containerName="container-00" Feb 19 16:42:58 crc kubenswrapper[4810]: I0219 16:42:58.829006 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mt7ln/crc-debug-kd2tv" Feb 19 16:42:58 crc kubenswrapper[4810]: I0219 16:42:58.945230 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89d6f\" (UniqueName: \"kubernetes.io/projected/d35bc1c3-d664-4faa-89db-dbddd7c714f3-kube-api-access-89d6f\") pod \"crc-debug-kd2tv\" (UID: \"d35bc1c3-d664-4faa-89db-dbddd7c714f3\") " pod="openshift-must-gather-mt7ln/crc-debug-kd2tv" Feb 19 16:42:58 crc kubenswrapper[4810]: I0219 16:42:58.945296 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d35bc1c3-d664-4faa-89db-dbddd7c714f3-host\") pod \"crc-debug-kd2tv\" (UID: \"d35bc1c3-d664-4faa-89db-dbddd7c714f3\") " pod="openshift-must-gather-mt7ln/crc-debug-kd2tv" Feb 19 16:42:59 crc kubenswrapper[4810]: I0219 16:42:59.047924 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89d6f\" (UniqueName: \"kubernetes.io/projected/d35bc1c3-d664-4faa-89db-dbddd7c714f3-kube-api-access-89d6f\") pod \"crc-debug-kd2tv\" (UID: \"d35bc1c3-d664-4faa-89db-dbddd7c714f3\") " pod="openshift-must-gather-mt7ln/crc-debug-kd2tv" Feb 19 16:42:59 crc kubenswrapper[4810]: I0219 16:42:59.048055 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d35bc1c3-d664-4faa-89db-dbddd7c714f3-host\") pod \"crc-debug-kd2tv\" (UID: \"d35bc1c3-d664-4faa-89db-dbddd7c714f3\") " pod="openshift-must-gather-mt7ln/crc-debug-kd2tv" Feb 19 16:42:59 crc kubenswrapper[4810]: I0219 16:42:59.048382 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d35bc1c3-d664-4faa-89db-dbddd7c714f3-host\") pod \"crc-debug-kd2tv\" (UID: \"d35bc1c3-d664-4faa-89db-dbddd7c714f3\") " pod="openshift-must-gather-mt7ln/crc-debug-kd2tv" Feb 19 16:42:59 crc kubenswrapper[4810]: I0219 16:42:59.462206 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f51ddeb-2d6c-440a-9629-fd9894243a23" path="/var/lib/kubelet/pods/0f51ddeb-2d6c-440a-9629-fd9894243a23/volumes" Feb 19 16:42:59 crc kubenswrapper[4810]: I0219 16:42:59.485492 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89d6f\" (UniqueName: \"kubernetes.io/projected/d35bc1c3-d664-4faa-89db-dbddd7c714f3-kube-api-access-89d6f\") pod \"crc-debug-kd2tv\" (UID: \"d35bc1c3-d664-4faa-89db-dbddd7c714f3\") " pod="openshift-must-gather-mt7ln/crc-debug-kd2tv" Feb 19 16:42:59 crc kubenswrapper[4810]: I0219 16:42:59.754075 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mt7ln/crc-debug-kd2tv" Feb 19 16:43:00 crc kubenswrapper[4810]: I0219 16:43:00.055227 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mt7ln/crc-debug-kd2tv" event={"ID":"d35bc1c3-d664-4faa-89db-dbddd7c714f3","Type":"ContainerStarted","Data":"50ef314a010512b59b5e136c75c52a3caf286d85d31254cd01cdef5d379a63e2"} Feb 19 16:43:01 crc kubenswrapper[4810]: I0219 16:43:01.099057 4810 generic.go:334] "Generic (PLEG): container finished" podID="d35bc1c3-d664-4faa-89db-dbddd7c714f3" containerID="8ccf7e7a2c8e880d2be04530d96a194520a2d060cbaad4b63481170b69382a6c" exitCode=0 Feb 19 16:43:01 crc kubenswrapper[4810]: I0219 16:43:01.099122 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mt7ln/crc-debug-kd2tv" event={"ID":"d35bc1c3-d664-4faa-89db-dbddd7c714f3","Type":"ContainerDied","Data":"8ccf7e7a2c8e880d2be04530d96a194520a2d060cbaad4b63481170b69382a6c"} Feb 19 16:43:01 crc kubenswrapper[4810]: I0219 16:43:01.151157 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mt7ln/crc-debug-kd2tv"] Feb 19 16:43:01 crc kubenswrapper[4810]: I0219 16:43:01.165439 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mt7ln/crc-debug-kd2tv"] Feb 19 16:43:02 crc kubenswrapper[4810]: I0219 16:43:02.221958 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mt7ln/crc-debug-kd2tv" Feb 19 16:43:02 crc kubenswrapper[4810]: I0219 16:43:02.321434 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89d6f\" (UniqueName: \"kubernetes.io/projected/d35bc1c3-d664-4faa-89db-dbddd7c714f3-kube-api-access-89d6f\") pod \"d35bc1c3-d664-4faa-89db-dbddd7c714f3\" (UID: \"d35bc1c3-d664-4faa-89db-dbddd7c714f3\") " Feb 19 16:43:02 crc kubenswrapper[4810]: I0219 16:43:02.321609 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d35bc1c3-d664-4faa-89db-dbddd7c714f3-host\") pod \"d35bc1c3-d664-4faa-89db-dbddd7c714f3\" (UID: \"d35bc1c3-d664-4faa-89db-dbddd7c714f3\") " Feb 19 16:43:02 crc kubenswrapper[4810]: I0219 16:43:02.321672 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d35bc1c3-d664-4faa-89db-dbddd7c714f3-host" (OuterVolumeSpecName: "host") pod "d35bc1c3-d664-4faa-89db-dbddd7c714f3" (UID: "d35bc1c3-d664-4faa-89db-dbddd7c714f3"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 16:43:02 crc kubenswrapper[4810]: I0219 16:43:02.322190 4810 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d35bc1c3-d664-4faa-89db-dbddd7c714f3-host\") on node \"crc\" DevicePath \"\"" Feb 19 16:43:02 crc kubenswrapper[4810]: I0219 16:43:02.340003 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d35bc1c3-d664-4faa-89db-dbddd7c714f3-kube-api-access-89d6f" (OuterVolumeSpecName: "kube-api-access-89d6f") pod "d35bc1c3-d664-4faa-89db-dbddd7c714f3" (UID: "d35bc1c3-d664-4faa-89db-dbddd7c714f3"). InnerVolumeSpecName "kube-api-access-89d6f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:43:02 crc kubenswrapper[4810]: I0219 16:43:02.423656 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89d6f\" (UniqueName: \"kubernetes.io/projected/d35bc1c3-d664-4faa-89db-dbddd7c714f3-kube-api-access-89d6f\") on node \"crc\" DevicePath \"\"" Feb 19 16:43:03 crc kubenswrapper[4810]: I0219 16:43:03.122565 4810 scope.go:117] "RemoveContainer" containerID="8ccf7e7a2c8e880d2be04530d96a194520a2d060cbaad4b63481170b69382a6c" Feb 19 16:43:03 crc kubenswrapper[4810]: I0219 16:43:03.122881 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mt7ln/crc-debug-kd2tv" Feb 19 16:43:03 crc kubenswrapper[4810]: I0219 16:43:03.457318 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d35bc1c3-d664-4faa-89db-dbddd7c714f3" path="/var/lib/kubelet/pods/d35bc1c3-d664-4faa-89db-dbddd7c714f3/volumes" Feb 19 16:43:44 crc kubenswrapper[4810]: I0219 16:43:44.165600 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6b886df68b-htd57_8d391303-b5ee-4f63-8035-12f123f35e65/barbican-api/0.log" Feb 19 16:43:44 crc kubenswrapper[4810]: I0219 16:43:44.348357 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6b886df68b-htd57_8d391303-b5ee-4f63-8035-12f123f35e65/barbican-api-log/0.log" Feb 19 16:43:44 crc kubenswrapper[4810]: I0219 16:43:44.406299 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-75f99f68b4-d7hj4_c008ffcd-bb96-47dd-a311-fdc58f6d8918/barbican-keystone-listener/0.log" Feb 19 16:43:44 crc kubenswrapper[4810]: I0219 16:43:44.485388 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-75f99f68b4-d7hj4_c008ffcd-bb96-47dd-a311-fdc58f6d8918/barbican-keystone-listener-log/0.log" Feb 19 16:43:44 crc kubenswrapper[4810]: I0219 16:43:44.616611 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-58f8775989-n9rgr_f277c31b-ff97-4f3b-aec3-c5cfe9293d60/barbican-worker/0.log" Feb 19 16:43:44 crc kubenswrapper[4810]: I0219 16:43:44.626209 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-58f8775989-n9rgr_f277c31b-ff97-4f3b-aec3-c5cfe9293d60/barbican-worker-log/0.log" Feb 19 16:43:44 crc kubenswrapper[4810]: I0219 16:43:44.789066 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx_c4a9ca21-e1c7-490d-8078-14407b530301/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:43:44 crc kubenswrapper[4810]: I0219 16:43:44.890027 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7d91a4d-5b61-404e-a58b-cb426722f883/ceilometer-central-agent/0.log" Feb 19 16:43:44 crc kubenswrapper[4810]: I0219 16:43:44.971969 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7d91a4d-5b61-404e-a58b-cb426722f883/ceilometer-notification-agent/0.log" Feb 19 16:43:45 crc kubenswrapper[4810]: I0219 16:43:45.016286 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7d91a4d-5b61-404e-a58b-cb426722f883/proxy-httpd/0.log" Feb 19 16:43:45 crc kubenswrapper[4810]: I0219 16:43:45.074909 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7d91a4d-5b61-404e-a58b-cb426722f883/sg-core/0.log" Feb 19 16:43:45 crc 
kubenswrapper[4810]: I0219 16:43:45.294143 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_1723b820-73ac-49f3-8716-283bf2c05925/cinder-api-log/0.log" Feb 19 16:43:45 crc kubenswrapper[4810]: I0219 16:43:45.543932 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_f66b86b2-b164-4380-8a89-bb0cf5f833ef/probe/0.log" Feb 19 16:43:45 crc kubenswrapper[4810]: I0219 16:43:45.645580 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_f66b86b2-b164-4380-8a89-bb0cf5f833ef/cinder-backup/0.log" Feb 19 16:43:45 crc kubenswrapper[4810]: I0219 16:43:45.696243 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_1723b820-73ac-49f3-8716-283bf2c05925/cinder-api/0.log" Feb 19 16:43:45 crc kubenswrapper[4810]: I0219 16:43:45.797193 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_48d5e3e9-853c-4988-8746-a6f74e1fe209/cinder-scheduler/0.log" Feb 19 16:43:45 crc kubenswrapper[4810]: I0219 16:43:45.806439 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_48d5e3e9-853c-4988-8746-a6f74e1fe209/probe/0.log" Feb 19 16:43:46 crc kubenswrapper[4810]: I0219 16:43:46.032529 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_20a46eb8-508d-45be-bf13-31aed23d1582/cinder-volume/0.log" Feb 19 16:43:46 crc kubenswrapper[4810]: I0219 16:43:46.292621 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_74a12495-8d82-4296-9328-430af6d923b2/probe/0.log" Feb 19 16:43:46 crc kubenswrapper[4810]: I0219 16:43:46.297924 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_20a46eb8-508d-45be-bf13-31aed23d1582/probe/0.log" Feb 19 16:43:46 crc kubenswrapper[4810]: I0219 16:43:46.504491 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_74a12495-8d82-4296-9328-430af6d923b2/cinder-volume/0.log" Feb 19 16:43:46 crc kubenswrapper[4810]: I0219 16:43:46.554562 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-6q498_2cff3a3e-0543-4fec-8f5b-5421be276386/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:43:46 crc kubenswrapper[4810]: I0219 16:43:46.722562 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-cm25d_7e1f4472-242a-40a0-a574-9c3119fdb705/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:43:46 crc kubenswrapper[4810]: I0219 16:43:46.756171 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-685d6df875-6hghq_7c074feb-2f7c-4f84-9ea8-5a9062e6b10a/init/0.log" Feb 19 16:43:46 crc kubenswrapper[4810]: I0219 16:43:46.945728 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-685d6df875-6hghq_7c074feb-2f7c-4f84-9ea8-5a9062e6b10a/init/0.log" Feb 19 16:43:47 crc kubenswrapper[4810]: I0219 16:43:47.001353 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-flmfl_e6255c5c-26d4-421f-9156-1bdd2f5adcc6/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:43:47 crc kubenswrapper[4810]: I0219 16:43:47.139313 4810 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-685d6df875-6hghq_7c074feb-2f7c-4f84-9ea8-5a9062e6b10a/dnsmasq-dns/0.log" Feb 19 16:43:47 crc kubenswrapper[4810]: I0219 16:43:47.211830 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_41a4af93-6f80-4097-a964-2e3f3055fd3b/glance-httpd/0.log" Feb 19 16:43:47 crc kubenswrapper[4810]: I0219 16:43:47.242838 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_41a4af93-6f80-4097-a964-2e3f3055fd3b/glance-log/0.log" Feb 19 16:43:47 crc kubenswrapper[4810]: I0219 16:43:47.416381 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4/glance-log/0.log" Feb 19 16:43:47 crc kubenswrapper[4810]: I0219 16:43:47.436184 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4/glance-httpd/0.log" Feb 19 16:43:47 crc kubenswrapper[4810]: I0219 16:43:47.757877 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5f56498b8d-9gwmf_737d6629-747f-4d16-a545-d0070c20fe5d/horizon/0.log" Feb 19 16:43:47 crc kubenswrapper[4810]: I0219 16:43:47.758140 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2_31bd8fe5-f0b6-4463-a545-bdeb0c33b182/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:43:47 crc kubenswrapper[4810]: I0219 16:43:47.979490 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-s8kk5_12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:43:48 crc kubenswrapper[4810]: I0219 16:43:48.189621 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29525281-26qqv_8984eff3-6c82-4e2f-8bd6-1e820a450874/keystone-cron/0.log" Feb 19 16:43:48 crc kubenswrapper[4810]: I0219 16:43:48.370293 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5f56498b8d-9gwmf_737d6629-747f-4d16-a545-d0070c20fe5d/horizon-log/0.log" Feb 19 16:43:48 crc kubenswrapper[4810]: I0219 16:43:48.461191 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_9358dbee-2e5b-432d-98e0-6945d2e0d44b/kube-state-metrics/0.log" Feb 19 16:43:48 crc kubenswrapper[4810]: I0219 16:43:48.495473 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6cd8bf58f4-ktsjk_95165d88-ea72-4785-8c1a-eea4d54466fb/keystone-api/0.log" Feb 19 16:43:48 crc kubenswrapper[4810]: I0219 16:43:48.631150 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-mxd44_b0d687e9-21b0-4abe-b7ec-4fb050926f6c/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:43:49 crc kubenswrapper[4810]: I0219 16:43:49.111053 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs_6650a3db-fdc1-4342-b8a8-cb91376e75c5/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:43:49 crc kubenswrapper[4810]: I0219 16:43:49.215997 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6dfcf65577-bd5w2_6528bdfd-3389-4776-826e-164fc5117682/neutron-api/0.log" Feb 19 16:43:49 crc kubenswrapper[4810]: I0219 16:43:49.281760 4810 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_neutron-6dfcf65577-bd5w2_6528bdfd-3389-4776-826e-164fc5117682/neutron-httpd/0.log" Feb 19 16:43:49 crc kubenswrapper[4810]: I0219 16:43:49.361267 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c/setup-container/0.log" Feb 19 16:43:49 crc kubenswrapper[4810]: I0219 16:43:49.596293 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c/rabbitmq/0.log" Feb 19 16:43:49 crc kubenswrapper[4810]: I0219 16:43:49.610751 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c/setup-container/0.log" Feb 19 16:43:50 crc kubenswrapper[4810]: I0219 16:43:50.119349 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_65e6588c-3b7f-4719-beb6-90229629820f/nova-cell0-conductor-conductor/0.log" Feb 19 16:43:50 crc kubenswrapper[4810]: I0219 16:43:50.466979 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_f93aa728-7924-4a75-ad48-cc174764cf3e/nova-cell1-conductor-conductor/0.log" Feb 19 16:43:50 crc kubenswrapper[4810]: I0219 16:43:50.723180 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_5a7915d4-6c3f-4bc7-b21d-7d51b675640f/nova-cell1-novncproxy-novncproxy/0.log" Feb 19 16:43:51 crc kubenswrapper[4810]: I0219 16:43:51.045736 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-nv8wh_cc5014f8-e5aa-47ad-8787-c187b0f7f0e1/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:43:51 crc kubenswrapper[4810]: I0219 16:43:51.232173 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6397af05-d030-46c2-8a0f-a90beb9b2502/nova-api-log/0.log" Feb 19 16:43:51 crc kubenswrapper[4810]: I0219 16:43:51.295832 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f36ad344-e946-4221-892d-3ffe8fbdd59b/nova-metadata-log/0.log" Feb 19 16:43:51 crc kubenswrapper[4810]: I0219 16:43:51.559643 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6397af05-d030-46c2-8a0f-a90beb9b2502/nova-api-api/0.log" Feb 19 16:43:52 crc kubenswrapper[4810]: I0219 16:43:52.250230 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_30d11a24-9722-4e7a-9be5-f2bd00128167/mysql-bootstrap/0.log" Feb 19 16:43:52 crc kubenswrapper[4810]: I0219 16:43:52.269364 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d70a0a1b-ed2d-46f1-aeb9-a335de9b06d4/nova-scheduler-scheduler/0.log" Feb 19 16:43:52 crc kubenswrapper[4810]: I0219 16:43:52.461534 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_30d11a24-9722-4e7a-9be5-f2bd00128167/mysql-bootstrap/0.log" Feb 19 16:43:52 crc kubenswrapper[4810]: I0219 16:43:52.485698 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_30d11a24-9722-4e7a-9be5-f2bd00128167/galera/0.log" Feb 19 16:43:52 crc kubenswrapper[4810]: I0219 16:43:52.642418 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c0ffb8ce-a356-4416-b96c-49db30ff1947/mysql-bootstrap/0.log" Feb 19 16:43:52 crc kubenswrapper[4810]: I0219 
16:43:52.854205 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c0ffb8ce-a356-4416-b96c-49db30ff1947/mysql-bootstrap/0.log" Feb 19 16:43:52 crc kubenswrapper[4810]: I0219 16:43:52.873287 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c0ffb8ce-a356-4416-b96c-49db30ff1947/galera/0.log" Feb 19 16:43:53 crc kubenswrapper[4810]: I0219 16:43:53.678597 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f36ad344-e946-4221-892d-3ffe8fbdd59b/nova-metadata-metadata/0.log" Feb 19 16:43:53 crc kubenswrapper[4810]: I0219 16:43:53.795911 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_ca8eb29b-bb26-446f-8a22-5da13ff9d5fa/openstackclient/0.log" Feb 19 16:43:53 crc kubenswrapper[4810]: I0219 16:43:53.864414 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-tbt28_c03aad2b-8ca1-4310-8c11-3287fafcd66f/openstack-network-exporter/0.log" Feb 19 16:43:54 crc kubenswrapper[4810]: I0219 16:43:54.061829 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5t6ds_542da555-4f39-4dff-b378-5306135244db/ovsdb-server-init/0.log" Feb 19 16:43:54 crc kubenswrapper[4810]: I0219 16:43:54.197281 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5t6ds_542da555-4f39-4dff-b378-5306135244db/ovsdb-server-init/0.log" Feb 19 16:43:54 crc kubenswrapper[4810]: I0219 16:43:54.212656 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5t6ds_542da555-4f39-4dff-b378-5306135244db/ovsdb-server/0.log" Feb 19 16:43:54 crc kubenswrapper[4810]: I0219 16:43:54.387857 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-s5488_4a4fa57b-aa00-4866-b31e-df29f7f86480/ovn-controller/0.log" Feb 19 16:43:54 crc kubenswrapper[4810]: I0219 16:43:54.484762 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5t6ds_542da555-4f39-4dff-b378-5306135244db/ovs-vswitchd/0.log" Feb 19 16:43:54 crc kubenswrapper[4810]: I0219 16:43:54.529591 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-5gjxx_4defb710-c07f-4e63-9baf-45f51085abdc/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:43:54 crc kubenswrapper[4810]: I0219 16:43:54.651116 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_22facf67-088b-410b-986a-c9e09b3d8feb/openstack-network-exporter/0.log" Feb 19 16:43:54 crc kubenswrapper[4810]: I0219 16:43:54.771536 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_22facf67-088b-410b-986a-c9e09b3d8feb/ovn-northd/0.log" Feb 19 16:43:54 crc kubenswrapper[4810]: I0219 16:43:54.876372 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_bdffb5e6-13bb-4c08-ad3c-52d8ded85431/openstack-network-exporter/0.log" Feb 19 16:43:54 crc kubenswrapper[4810]: I0219 16:43:54.915030 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_bdffb5e6-13bb-4c08-ad3c-52d8ded85431/ovsdbserver-nb/0.log" Feb 19 16:43:54 crc kubenswrapper[4810]: I0219 16:43:54.994872 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5b985124-01b7-430c-b5ea-b9fd095e5f5e/openstack-network-exporter/0.log" Feb 19 16:43:55 crc kubenswrapper[4810]: I0219 
16:43:55.087536 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5b985124-01b7-430c-b5ea-b9fd095e5f5e/ovsdbserver-sb/0.log" Feb 19 16:43:55 crc kubenswrapper[4810]: I0219 16:43:55.367431 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7555d68ddd-xqj8c_565eac29-daec-4b40-bcb7-751696560c3a/placement-api/0.log" Feb 19 16:43:55 crc kubenswrapper[4810]: I0219 16:43:55.404938 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bf65af35-1e80-49a0-ada2-3bd027193193/init-config-reloader/0.log" Feb 19 16:43:55 crc kubenswrapper[4810]: I0219 16:43:55.556210 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bf65af35-1e80-49a0-ada2-3bd027193193/config-reloader/0.log" Feb 19 16:43:55 crc kubenswrapper[4810]: I0219 16:43:55.558354 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7555d68ddd-xqj8c_565eac29-daec-4b40-bcb7-751696560c3a/placement-log/0.log" Feb 19 16:43:55 crc kubenswrapper[4810]: I0219 16:43:55.609984 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bf65af35-1e80-49a0-ada2-3bd027193193/init-config-reloader/0.log" Feb 19 16:43:55 crc kubenswrapper[4810]: I0219 16:43:55.647169 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bf65af35-1e80-49a0-ada2-3bd027193193/prometheus/0.log" Feb 19 16:43:55 crc kubenswrapper[4810]: I0219 16:43:55.732395 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bf65af35-1e80-49a0-ada2-3bd027193193/thanos-sidecar/0.log" Feb 19 16:43:55 crc kubenswrapper[4810]: I0219 16:43:55.854760 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_03247cdb-4055-4d47-b433-848e363768ab/setup-container/0.log" Feb 19 16:43:56 crc kubenswrapper[4810]: I0219 16:43:56.019895 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_03247cdb-4055-4d47-b433-848e363768ab/setup-container/0.log" Feb 19 16:43:56 crc kubenswrapper[4810]: I0219 16:43:56.039510 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_03247cdb-4055-4d47-b433-848e363768ab/rabbitmq/0.log" Feb 19 16:43:56 crc kubenswrapper[4810]: I0219 16:43:56.081437 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b86448c3-669a-4132-b8ab-4db06347fa10/setup-container/0.log" Feb 19 16:43:56 crc kubenswrapper[4810]: I0219 16:43:56.258761 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b86448c3-669a-4132-b8ab-4db06347fa10/setup-container/0.log" Feb 19 16:43:56 crc kubenswrapper[4810]: I0219 16:43:56.320720 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-ps669_69d67433-38d6-4368-a621-254a97b0c619/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:43:56 crc kubenswrapper[4810]: I0219 16:43:56.340722 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b86448c3-669a-4132-b8ab-4db06347fa10/rabbitmq/0.log" Feb 19 16:43:56 crc kubenswrapper[4810]: I0219 16:43:56.505594 4810 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-x748x_32dc9563-791b-421e-a807-41cc1e775b3a/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:43:56 crc kubenswrapper[4810]: I0219 16:43:56.676088 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-484qb_8c05e8c7-82f6-4ef1-a576-3c84e70dc570/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:43:56 crc kubenswrapper[4810]: I0219 16:43:56.707851 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-rf56v_e77512a1-b460-4008-9e59-5b38f3e9f925/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:43:56 crc kubenswrapper[4810]: I0219 16:43:56.875856 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-gw579_e3132ed5-687d-4cd1-a539-35c4766a27c1/ssh-known-hosts-edpm-deployment/0.log" Feb 19 16:43:57 crc kubenswrapper[4810]: I0219 16:43:57.135788 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-78bc5d479f-k79xx_9190a865-226b-487c-b0f9-2573f50f0eab/proxy-server/0.log" Feb 19 16:43:57 crc kubenswrapper[4810]: I0219 16:43:57.229552 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-78bc5d479f-k79xx_9190a865-226b-487c-b0f9-2573f50f0eab/proxy-httpd/0.log" Feb 19 16:43:57 crc kubenswrapper[4810]: I0219 16:43:57.245288 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-hrdll_6c36f3e5-f790-4eda-9486-174f8624dad1/swift-ring-rebalance/0.log" Feb 19 16:43:57 crc kubenswrapper[4810]: I0219 16:43:57.482506 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/account-auditor/0.log" Feb 19 16:43:57 crc kubenswrapper[4810]: I0219 16:43:57.483118 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/account-reaper/0.log" Feb 19 16:43:57 crc kubenswrapper[4810]: I0219 16:43:57.501962 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/account-replicator/0.log" Feb 19 16:43:57 crc kubenswrapper[4810]: I0219 16:43:57.657954 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/account-server/0.log" Feb 19 16:43:57 crc kubenswrapper[4810]: I0219 16:43:57.671673 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/container-server/0.log" Feb 19 16:43:57 crc kubenswrapper[4810]: I0219 16:43:57.724810 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/container-auditor/0.log" Feb 19 16:43:57 crc kubenswrapper[4810]: I0219 16:43:57.725938 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/container-replicator/0.log" Feb 19 16:43:57 crc kubenswrapper[4810]: I0219 16:43:57.839712 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/container-updater/0.log" Feb 19 16:43:57 crc kubenswrapper[4810]: I0219 16:43:57.890934 4810 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/object-auditor/0.log" Feb 19 16:43:57 crc kubenswrapper[4810]: I0219 16:43:57.908426 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/object-expirer/0.log" Feb 19 16:43:58 crc kubenswrapper[4810]: I0219 16:43:58.011349 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/object-server/0.log" Feb 19 16:43:58 crc kubenswrapper[4810]: I0219 16:43:58.021842 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/object-replicator/0.log" Feb 19 16:43:58 crc kubenswrapper[4810]: I0219 16:43:58.114670 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/rsync/0.log" Feb 19 16:43:58 crc kubenswrapper[4810]: I0219 16:43:58.119135 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/object-updater/0.log" Feb 19 16:43:58 crc kubenswrapper[4810]: I0219 16:43:58.216922 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/swift-recon-cron/0.log" Feb 19 16:43:58 crc kubenswrapper[4810]: I0219 16:43:58.328820 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc_f7ca8c9a-db61-400f-9319-21590462f929/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:43:58 crc kubenswrapper[4810]: I0219 16:43:58.429705 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_a4c017a9-c049-4baa-acc0-e08a25437c90/tempest-tests-tempest-tests-runner/0.log" Feb 19 16:43:58 crc kubenswrapper[4810]: I0219 16:43:58.490663 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_d3094cb4-a81a-4f4f-b1d8-040c32bd3b2c/test-operator-logs-container/0.log" Feb 19 16:43:58 crc kubenswrapper[4810]: I0219 16:43:58.659313 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7_412dc62a-d25e-4820-947b-582e310ddff1/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:43:59 crc kubenswrapper[4810]: I0219 16:43:59.411692 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_79f3ef20-3f3d-4fa2-8888-36d421303dfd/watcher-applier/0.log" Feb 19 16:44:00 crc kubenswrapper[4810]: I0219 16:44:00.055261 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_c11a7f60-4839-44aa-8615-98de657221f4/watcher-api-log/0.log" Feb 19 16:44:02 crc kubenswrapper[4810]: I0219 16:44:02.692664 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d/watcher-decision-engine/0.log" Feb 19 16:44:03 crc kubenswrapper[4810]: I0219 16:44:03.576850 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_c11a7f60-4839-44aa-8615-98de657221f4/watcher-api/0.log" Feb 19 16:44:03 crc kubenswrapper[4810]: I0219 16:44:03.668653 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_eb773d46-7b9f-4ca4-b33c-9b800b9eafd7/memcached/0.log" Feb 19 16:44:19 crc kubenswrapper[4810]: 
I0219 16:44:19.537229 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 16:44:19 crc kubenswrapper[4810]: I0219 16:44:19.537869 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 16:44:30 crc kubenswrapper[4810]: I0219 16:44:30.315291 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4_124e176a-b011-4a5c-8e7c-ca027d881aea/util/0.log" Feb 19 16:44:30 crc kubenswrapper[4810]: I0219 16:44:30.500679 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4_124e176a-b011-4a5c-8e7c-ca027d881aea/pull/0.log" Feb 19 16:44:30 crc kubenswrapper[4810]: I0219 16:44:30.538302 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4_124e176a-b011-4a5c-8e7c-ca027d881aea/util/0.log" Feb 19 16:44:30 crc kubenswrapper[4810]: I0219 16:44:30.548557 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4_124e176a-b011-4a5c-8e7c-ca027d881aea/pull/0.log" Feb 19 16:44:30 crc kubenswrapper[4810]: I0219 16:44:30.912397 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4_124e176a-b011-4a5c-8e7c-ca027d881aea/util/0.log" Feb 19 16:44:30 crc kubenswrapper[4810]: I0219 16:44:30.930180 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4_124e176a-b011-4a5c-8e7c-ca027d881aea/extract/0.log" Feb 19 16:44:31 crc kubenswrapper[4810]: I0219 16:44:31.146216 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4_124e176a-b011-4a5c-8e7c-ca027d881aea/pull/0.log" Feb 19 16:44:31 crc kubenswrapper[4810]: I0219 16:44:31.494368 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-z5fb9_52bb990c-eff0-4673-be27-d55d433bef0d/manager/0.log" Feb 19 16:44:31 crc kubenswrapper[4810]: I0219 16:44:31.943074 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-qz68t_2106e7b5-bb83-464a-a43f-943f22b55078/manager/0.log" Feb 19 16:44:32 crc kubenswrapper[4810]: I0219 16:44:32.155079 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-ffm66_e2942952-ce19-4053-91da-05623c954167/manager/0.log" Feb 19 16:44:32 crc kubenswrapper[4810]: I0219 16:44:32.361606 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-gnmlp_f0ab3643-d267-4902-af1f-cbcbdd7e5e41/manager/0.log" Feb 19 16:44:33 crc 
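
The prober lines above show an HTTP liveness probe failing at the transport layer: the GET to http://127.0.0.1:8798/health is refused because nothing is listening while the machine-config-daemon container is down, and a refused connection counts as a probe failure just as a bad status code would. A minimal stand-in for that check (the kubelet treats 2xx/3xx responses as success for HTTP probes):

package main

import (
	"fmt"
	"net/http"
	"time"
)

// httpProbe performs one HTTP liveness check: any transport error
// (here: connection refused) or a status outside 200-399 is a failure.
func httpProbe(url string, timeout time.Duration) (ok bool, detail string) {
	client := &http.Client{Timeout: timeout}
	resp, err := client.Get(url)
	if err != nil {
		return false, fmt.Sprintf("Get %q: %v", url, err)
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		return true, resp.Status
	}
	return false, resp.Status
}

func main() {
	// The endpoint from the log above; with nothing listening, this
	// prints a failure like the prober.go:107 record.
	ok, detail := httpProbe("http://127.0.0.1:8798/health", time.Second)
	fmt.Printf("probeResult ok=%v output=%q\n", ok, detail)
}
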
kubenswrapper[4810]: I0219 16:44:33.120404 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-2kkhl_4898d4eb-d474-44bc-9a38-e36f300d132f/manager/0.log" Feb 19 16:44:33 crc kubenswrapper[4810]: I0219 16:44:33.249257 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-nnps5_602535d1-0abe-471e-8409-31319af7bd4b/manager/0.log" Feb 19 16:44:33 crc kubenswrapper[4810]: I0219 16:44:33.537035 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-mkfsc_2126b31b-0444-43e4-a250-837f37d476aa/manager/0.log" Feb 19 16:44:33 crc kubenswrapper[4810]: I0219 16:44:33.772912 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-vc7cw_942f40af-0498-4865-99da-bdcd068ef449/manager/0.log" Feb 19 16:44:33 crc kubenswrapper[4810]: I0219 16:44:33.789650 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-jxmt5_1217b757-0f1c-4c4e-9abe-55875992915d/manager/0.log" Feb 19 16:44:34 crc kubenswrapper[4810]: I0219 16:44:34.040047 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-6fqmd_e4a54646-39cf-4e42-9367-487ea4f7d8a4/manager/0.log" Feb 19 16:44:34 crc kubenswrapper[4810]: I0219 16:44:34.090830 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-jjqv2_4aeb7cba-c1db-4dd6-92f7-dae7bd2e3f65/manager/0.log" Feb 19 16:44:34 crc kubenswrapper[4810]: I0219 16:44:34.395668 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-l67cq_fef4ad67-ccc8-4b32-bfb8-38dd6aa8e07e/manager/0.log" Feb 19 16:44:34 crc kubenswrapper[4810]: I0219 16:44:34.493875 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt_c677bdd0-7248-4b02-9ab4-035c034a976a/manager/0.log" Feb 19 16:44:34 crc kubenswrapper[4810]: I0219 16:44:34.892755 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-69cffcd4f6-27gzn_e84ef702-2f13-42e9-ae2b-6f1465b67ff3/operator/0.log" Feb 19 16:44:35 crc kubenswrapper[4810]: I0219 16:44:35.601951 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-gkft8_09f49ae7-b6fb-4ca5-9238-8bcf8d15ea95/registry-server/0.log" Feb 19 16:44:35 crc kubenswrapper[4810]: I0219 16:44:35.885228 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-5xnwd_e163eac0-ea1f-4002-9469-844240d7a44c/manager/0.log" Feb 19 16:44:36 crc kubenswrapper[4810]: I0219 16:44:36.124159 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-7tzvr_9f5779a5-4cda-40dc-831d-950f97eae317/manager/0.log" Feb 19 16:44:36 crc kubenswrapper[4810]: I0219 16:44:36.290686 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-k98c8_64ed590e-59b6-44c8-baee-324162d099b8/operator/0.log" Feb 19 16:44:36 crc kubenswrapper[4810]: I0219 16:44:36.477454 
4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-t44nb_aa5063d7-2358-4149-a3b9-ef2ce138faf4/manager/0.log" Feb 19 16:44:36 crc kubenswrapper[4810]: I0219 16:44:36.938579 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-pw9kt_69b7e96d-bce6-4653-998e-3bf5d159ae5a/manager/0.log" Feb 19 16:44:36 crc kubenswrapper[4810]: I0219 16:44:36.971307 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-px9zx_eaed166e-39b5-45ca-8a65-a22710d5fe37/manager/0.log" Feb 19 16:44:37 crc kubenswrapper[4810]: I0219 16:44:37.266188 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-vcbwg_3cff7e0c-86c7-4029-aefe-a7f7e8e2d76d/manager/0.log" Feb 19 16:44:37 crc kubenswrapper[4810]: I0219 16:44:37.389431 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-798847869b-dlmvg_9c5af548-c722-4e6b-9309-1420838257e0/manager/0.log" Feb 19 16:44:37 crc kubenswrapper[4810]: I0219 16:44:37.561418 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6d464797d7-lrlqc_a6f83f3c-26f4-472f-9fcd-ae8049f1819a/manager/0.log" Feb 19 16:44:42 crc kubenswrapper[4810]: I0219 16:44:42.930704 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-mzslt_91002269-9fe0-44d2-9dbd-9e4cf58274bf/manager/0.log" Feb 19 16:44:49 crc kubenswrapper[4810]: I0219 16:44:49.538129 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 16:44:49 crc kubenswrapper[4810]: I0219 16:44:49.538868 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 16:45:00 crc kubenswrapper[4810]: I0219 16:45:00.145268 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525325-x9zxt"] Feb 19 16:45:00 crc kubenswrapper[4810]: E0219 16:45:00.146172 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d35bc1c3-d664-4faa-89db-dbddd7c714f3" containerName="container-00" Feb 19 16:45:00 crc kubenswrapper[4810]: I0219 16:45:00.146184 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d35bc1c3-d664-4faa-89db-dbddd7c714f3" containerName="container-00" Feb 19 16:45:00 crc kubenswrapper[4810]: I0219 16:45:00.146409 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="d35bc1c3-d664-4faa-89db-dbddd7c714f3" containerName="container-00" Feb 19 16:45:00 crc kubenswrapper[4810]: I0219 16:45:00.147155 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525325-x9zxt" Feb 19 16:45:00 crc kubenswrapper[4810]: I0219 16:45:00.148927 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 16:45:00 crc kubenswrapper[4810]: I0219 16:45:00.149591 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 16:45:00 crc kubenswrapper[4810]: I0219 16:45:00.176614 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525325-x9zxt"] Feb 19 16:45:00 crc kubenswrapper[4810]: I0219 16:45:00.251865 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/642d0068-2155-46d5-85c6-d4f70d142f81-secret-volume\") pod \"collect-profiles-29525325-x9zxt\" (UID: \"642d0068-2155-46d5-85c6-d4f70d142f81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525325-x9zxt" Feb 19 16:45:00 crc kubenswrapper[4810]: I0219 16:45:00.251955 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/642d0068-2155-46d5-85c6-d4f70d142f81-config-volume\") pod \"collect-profiles-29525325-x9zxt\" (UID: \"642d0068-2155-46d5-85c6-d4f70d142f81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525325-x9zxt" Feb 19 16:45:00 crc kubenswrapper[4810]: I0219 16:45:00.252186 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gj6b\" (UniqueName: \"kubernetes.io/projected/642d0068-2155-46d5-85c6-d4f70d142f81-kube-api-access-2gj6b\") pod \"collect-profiles-29525325-x9zxt\" (UID: \"642d0068-2155-46d5-85c6-d4f70d142f81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525325-x9zxt" Feb 19 16:45:00 crc kubenswrapper[4810]: I0219 16:45:00.353499 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/642d0068-2155-46d5-85c6-d4f70d142f81-config-volume\") pod \"collect-profiles-29525325-x9zxt\" (UID: \"642d0068-2155-46d5-85c6-d4f70d142f81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525325-x9zxt" Feb 19 16:45:00 crc kubenswrapper[4810]: I0219 16:45:00.353673 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gj6b\" (UniqueName: \"kubernetes.io/projected/642d0068-2155-46d5-85c6-d4f70d142f81-kube-api-access-2gj6b\") pod \"collect-profiles-29525325-x9zxt\" (UID: \"642d0068-2155-46d5-85c6-d4f70d142f81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525325-x9zxt" Feb 19 16:45:00 crc kubenswrapper[4810]: I0219 16:45:00.353722 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/642d0068-2155-46d5-85c6-d4f70d142f81-secret-volume\") pod \"collect-profiles-29525325-x9zxt\" (UID: \"642d0068-2155-46d5-85c6-d4f70d142f81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525325-x9zxt" Feb 19 16:45:00 crc kubenswrapper[4810]: I0219 16:45:00.354649 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/642d0068-2155-46d5-85c6-d4f70d142f81-config-volume\") pod 
\"collect-profiles-29525325-x9zxt\" (UID: \"642d0068-2155-46d5-85c6-d4f70d142f81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525325-x9zxt" Feb 19 16:45:00 crc kubenswrapper[4810]: I0219 16:45:00.372983 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/642d0068-2155-46d5-85c6-d4f70d142f81-secret-volume\") pod \"collect-profiles-29525325-x9zxt\" (UID: \"642d0068-2155-46d5-85c6-d4f70d142f81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525325-x9zxt" Feb 19 16:45:00 crc kubenswrapper[4810]: I0219 16:45:00.374667 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gj6b\" (UniqueName: \"kubernetes.io/projected/642d0068-2155-46d5-85c6-d4f70d142f81-kube-api-access-2gj6b\") pod \"collect-profiles-29525325-x9zxt\" (UID: \"642d0068-2155-46d5-85c6-d4f70d142f81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525325-x9zxt" Feb 19 16:45:00 crc kubenswrapper[4810]: I0219 16:45:00.464830 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525325-x9zxt" Feb 19 16:45:00 crc kubenswrapper[4810]: I0219 16:45:00.924036 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525325-x9zxt"] Feb 19 16:45:00 crc kubenswrapper[4810]: I0219 16:45:00.978529 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-895xv_64dc0d58-11d4-456b-97ab-a4d3ec28225b/control-plane-machine-set-operator/0.log" Feb 19 16:45:01 crc kubenswrapper[4810]: I0219 16:45:01.183860 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-l66cb_9a7776ca-1a56-4eca-9e44-ba1b7b15510f/kube-rbac-proxy/0.log" Feb 19 16:45:01 crc kubenswrapper[4810]: I0219 16:45:01.306435 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-l66cb_9a7776ca-1a56-4eca-9e44-ba1b7b15510f/machine-api-operator/0.log" Feb 19 16:45:01 crc kubenswrapper[4810]: I0219 16:45:01.382631 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525325-x9zxt" event={"ID":"642d0068-2155-46d5-85c6-d4f70d142f81","Type":"ContainerStarted","Data":"0bfef661b9df9370b288185a7c4e0a21adced38cc06cfc5bcc7c15675f009750"} Feb 19 16:45:01 crc kubenswrapper[4810]: I0219 16:45:01.382685 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525325-x9zxt" event={"ID":"642d0068-2155-46d5-85c6-d4f70d142f81","Type":"ContainerStarted","Data":"5d6972445d499d27164899c98a607acd2c10498b607f1c6b6cf8700155a05b83"} Feb 19 16:45:02 crc kubenswrapper[4810]: I0219 16:45:02.397446 4810 generic.go:334] "Generic (PLEG): container finished" podID="642d0068-2155-46d5-85c6-d4f70d142f81" containerID="0bfef661b9df9370b288185a7c4e0a21adced38cc06cfc5bcc7c15675f009750" exitCode=0 Feb 19 16:45:02 crc kubenswrapper[4810]: I0219 16:45:02.397499 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525325-x9zxt" event={"ID":"642d0068-2155-46d5-85c6-d4f70d142f81","Type":"ContainerDied","Data":"0bfef661b9df9370b288185a7c4e0a21adced38cc06cfc5bcc7c15675f009750"} Feb 19 16:45:02 crc kubenswrapper[4810]: I0219 16:45:02.764342 4810 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525325-x9zxt" Feb 19 16:45:02 crc kubenswrapper[4810]: I0219 16:45:02.807433 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/642d0068-2155-46d5-85c6-d4f70d142f81-config-volume\") pod \"642d0068-2155-46d5-85c6-d4f70d142f81\" (UID: \"642d0068-2155-46d5-85c6-d4f70d142f81\") " Feb 19 16:45:02 crc kubenswrapper[4810]: I0219 16:45:02.807634 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/642d0068-2155-46d5-85c6-d4f70d142f81-secret-volume\") pod \"642d0068-2155-46d5-85c6-d4f70d142f81\" (UID: \"642d0068-2155-46d5-85c6-d4f70d142f81\") " Feb 19 16:45:02 crc kubenswrapper[4810]: I0219 16:45:02.807736 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gj6b\" (UniqueName: \"kubernetes.io/projected/642d0068-2155-46d5-85c6-d4f70d142f81-kube-api-access-2gj6b\") pod \"642d0068-2155-46d5-85c6-d4f70d142f81\" (UID: \"642d0068-2155-46d5-85c6-d4f70d142f81\") " Feb 19 16:45:02 crc kubenswrapper[4810]: I0219 16:45:02.809776 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/642d0068-2155-46d5-85c6-d4f70d142f81-config-volume" (OuterVolumeSpecName: "config-volume") pod "642d0068-2155-46d5-85c6-d4f70d142f81" (UID: "642d0068-2155-46d5-85c6-d4f70d142f81"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 16:45:02 crc kubenswrapper[4810]: I0219 16:45:02.814774 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/642d0068-2155-46d5-85c6-d4f70d142f81-kube-api-access-2gj6b" (OuterVolumeSpecName: "kube-api-access-2gj6b") pod "642d0068-2155-46d5-85c6-d4f70d142f81" (UID: "642d0068-2155-46d5-85c6-d4f70d142f81"). InnerVolumeSpecName "kube-api-access-2gj6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:45:02 crc kubenswrapper[4810]: I0219 16:45:02.815071 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/642d0068-2155-46d5-85c6-d4f70d142f81-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "642d0068-2155-46d5-85c6-d4f70d142f81" (UID: "642d0068-2155-46d5-85c6-d4f70d142f81"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 16:45:02 crc kubenswrapper[4810]: I0219 16:45:02.909738 4810 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/642d0068-2155-46d5-85c6-d4f70d142f81-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 16:45:02 crc kubenswrapper[4810]: I0219 16:45:02.909946 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gj6b\" (UniqueName: \"kubernetes.io/projected/642d0068-2155-46d5-85c6-d4f70d142f81-kube-api-access-2gj6b\") on node \"crc\" DevicePath \"\"" Feb 19 16:45:02 crc kubenswrapper[4810]: I0219 16:45:02.910039 4810 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/642d0068-2155-46d5-85c6-d4f70d142f81-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 16:45:03 crc kubenswrapper[4810]: I0219 16:45:03.409270 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525325-x9zxt" event={"ID":"642d0068-2155-46d5-85c6-d4f70d142f81","Type":"ContainerDied","Data":"5d6972445d499d27164899c98a607acd2c10498b607f1c6b6cf8700155a05b83"} Feb 19 16:45:03 crc kubenswrapper[4810]: I0219 16:45:03.409348 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d6972445d499d27164899c98a607acd2c10498b607f1c6b6cf8700155a05b83" Feb 19 16:45:03 crc kubenswrapper[4810]: I0219 16:45:03.410738 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525325-x9zxt" Feb 19 16:45:03 crc kubenswrapper[4810]: I0219 16:45:03.842324 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525280-gjtpd"] Feb 19 16:45:03 crc kubenswrapper[4810]: I0219 16:45:03.855514 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525280-gjtpd"] Feb 19 16:45:05 crc kubenswrapper[4810]: I0219 16:45:05.453789 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1327b7dc-e5ad-463c-8ca9-89b735b1fec2" path="/var/lib/kubelet/pods/1327b7dc-e5ad-463c-8ca9-89b735b1fec2/volumes" Feb 19 16:45:17 crc kubenswrapper[4810]: I0219 16:45:17.208869 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-fwh4x_02206e32-6f49-407e-a02b-ce61e3daabf6/cert-manager-controller/0.log" Feb 19 16:45:17 crc kubenswrapper[4810]: I0219 16:45:17.409375 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-x4csq_54e755f0-9c2f-4d47-9979-b7b92996bab6/cert-manager-cainjector/0.log" Feb 19 16:45:17 crc kubenswrapper[4810]: I0219 16:45:17.461288 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-7lspv_1224bbe4-6d8e-410e-8990-3813efdd2003/cert-manager-webhook/0.log" Feb 19 16:45:19 crc kubenswrapper[4810]: I0219 16:45:19.537977 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 16:45:19 crc kubenswrapper[4810]: I0219 16:45:19.538429 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" 
podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 16:45:19 crc kubenswrapper[4810]: I0219 16:45:19.538487 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 16:45:19 crc kubenswrapper[4810]: I0219 16:45:19.539477 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e"} pod="openshift-machine-config-operator/machine-config-daemon-t499d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 16:45:19 crc kubenswrapper[4810]: I0219 16:45:19.539539 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" containerID="cri-o://63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e" gracePeriod=600 Feb 19 16:45:19 crc kubenswrapper[4810]: E0219 16:45:19.677192 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:45:20 crc kubenswrapper[4810]: I0219 16:45:20.594161 4810 generic.go:334] "Generic (PLEG): container finished" podID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerID="63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e" exitCode=0 Feb 19 16:45:20 crc kubenswrapper[4810]: I0219 16:45:20.594308 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerDied","Data":"63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e"} Feb 19 16:45:20 crc kubenswrapper[4810]: I0219 16:45:20.594629 4810 scope.go:117] "RemoveContainer" containerID="b8edda1f9342dacaf59cc4da7ca5a5e6fa7b1be00b10f0e3774c25fffdb24625" Feb 19 16:45:20 crc kubenswrapper[4810]: I0219 16:45:20.595771 4810 scope.go:117] "RemoveContainer" containerID="63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e" Feb 19 16:45:20 crc kubenswrapper[4810]: E0219 16:45:20.596365 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:45:33 crc kubenswrapper[4810]: I0219 16:45:33.026120 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-kdwwx_35fc682a-0cc9-4922-a2f2-60da1ddb1eb9/nmstate-console-plugin/0.log" Feb 19 16:45:33 crc kubenswrapper[4810]: I0219 16:45:33.195262 4810 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-handler-bhhvv_c0eb0835-6df5-4a21-b309-f178a032d027/nmstate-handler/0.log" Feb 19 16:45:33 crc kubenswrapper[4810]: I0219 16:45:33.229831 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-4g952_ce589619-7c2f-43db-ae4f-fb43be7b07f4/kube-rbac-proxy/0.log" Feb 19 16:45:33 crc kubenswrapper[4810]: I0219 16:45:33.278707 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-4g952_ce589619-7c2f-43db-ae4f-fb43be7b07f4/nmstate-metrics/0.log" Feb 19 16:45:33 crc kubenswrapper[4810]: I0219 16:45:33.405273 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-p4hwg_f8300a06-7526-4da5-89a6-7fff8ff284c9/nmstate-operator/0.log" Feb 19 16:45:33 crc kubenswrapper[4810]: I0219 16:45:33.520889 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-ckvvq_db05e782-a3d7-4cbe-be3f-f6226d894864/nmstate-webhook/0.log" Feb 19 16:45:34 crc kubenswrapper[4810]: I0219 16:45:34.441648 4810 scope.go:117] "RemoveContainer" containerID="63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e" Feb 19 16:45:34 crc kubenswrapper[4810]: E0219 16:45:34.442275 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:45:45 crc kubenswrapper[4810]: I0219 16:45:45.059103 4810 scope.go:117] "RemoveContainer" containerID="4e99f6d1c84e426443c4e13972b553f91aa0857b582f33dd75b9fc978d8acc56" Feb 19 16:45:49 crc kubenswrapper[4810]: I0219 16:45:49.439422 4810 scope.go:117] "RemoveContainer" containerID="63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e" Feb 19 16:45:49 crc kubenswrapper[4810]: E0219 16:45:49.440312 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:45:51 crc kubenswrapper[4810]: I0219 16:45:51.603096 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-6jkkz_1656f52d-7771-4bbb-9642-b296d16b791e/prometheus-operator/0.log" Feb 19 16:45:51 crc kubenswrapper[4810]: I0219 16:45:51.645007 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6d6bf9b6-qpvdt_d5debcf2-9629-4bb2-9133-f4b81748ff7d/prometheus-operator-admission-webhook/0.log" Feb 19 16:45:51 crc kubenswrapper[4810]: I0219 16:45:51.823137 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6d6bf9b6-vrfvr_408628c0-0b2c-48f9-b849-ee1b124499e1/prometheus-operator-admission-webhook/0.log" Feb 19 16:45:51 crc kubenswrapper[4810]: I0219 16:45:51.878546 4810 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-dk9c4_8bdf030e-92d8-45dc-ab6c-a7b241444677/operator/0.log" Feb 19 16:45:52 crc kubenswrapper[4810]: I0219 16:45:52.020801 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-2fdxm_c5968625-c396-4ae0-9846-c2ceb6baf655/perses-operator/0.log" Feb 19 16:46:04 crc kubenswrapper[4810]: I0219 16:46:04.439933 4810 scope.go:117] "RemoveContainer" containerID="63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e" Feb 19 16:46:04 crc kubenswrapper[4810]: E0219 16:46:04.440650 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:46:09 crc kubenswrapper[4810]: I0219 16:46:09.801640 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-jngcz_781d467e-8522-43a3-a552-1ceebc40cddd/kube-rbac-proxy/0.log" Feb 19 16:46:09 crc kubenswrapper[4810]: I0219 16:46:09.890410 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-jngcz_781d467e-8522-43a3-a552-1ceebc40cddd/controller/0.log" Feb 19 16:46:10 crc kubenswrapper[4810]: I0219 16:46:10.093012 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/cp-frr-files/0.log" Feb 19 16:46:10 crc kubenswrapper[4810]: I0219 16:46:10.233088 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/cp-frr-files/0.log" Feb 19 16:46:10 crc kubenswrapper[4810]: I0219 16:46:10.239319 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/cp-reloader/0.log" Feb 19 16:46:10 crc kubenswrapper[4810]: I0219 16:46:10.286643 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/cp-metrics/0.log" Feb 19 16:46:10 crc kubenswrapper[4810]: I0219 16:46:10.294547 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/cp-reloader/0.log" Feb 19 16:46:10 crc kubenswrapper[4810]: I0219 16:46:10.502676 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/cp-reloader/0.log" Feb 19 16:46:10 crc kubenswrapper[4810]: I0219 16:46:10.510653 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/cp-frr-files/0.log" Feb 19 16:46:10 crc kubenswrapper[4810]: I0219 16:46:10.557055 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/cp-metrics/0.log" Feb 19 16:46:10 crc kubenswrapper[4810]: I0219 16:46:10.562967 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/cp-metrics/0.log" Feb 19 16:46:10 crc kubenswrapper[4810]: I0219 16:46:10.716723 4810 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/cp-frr-files/0.log" Feb 19 16:46:10 crc kubenswrapper[4810]: I0219 16:46:10.766750 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/cp-reloader/0.log" Feb 19 16:46:10 crc kubenswrapper[4810]: I0219 16:46:10.777231 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/controller/0.log" Feb 19 16:46:10 crc kubenswrapper[4810]: I0219 16:46:10.811270 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/cp-metrics/0.log" Feb 19 16:46:10 crc kubenswrapper[4810]: I0219 16:46:10.947828 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/kube-rbac-proxy/0.log" Feb 19 16:46:10 crc kubenswrapper[4810]: I0219 16:46:10.950030 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/frr-metrics/0.log" Feb 19 16:46:11 crc kubenswrapper[4810]: I0219 16:46:11.048007 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/kube-rbac-proxy-frr/0.log" Feb 19 16:46:11 crc kubenswrapper[4810]: I0219 16:46:11.181988 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/reloader/0.log" Feb 19 16:46:11 crc kubenswrapper[4810]: I0219 16:46:11.287061 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-cwj24_1ee9f8f3-05a8-4648-b48d-4975285346d7/frr-k8s-webhook-server/0.log" Feb 19 16:46:12 crc kubenswrapper[4810]: I0219 16:46:12.103941 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-75f48c59dc-m5vm8_f26047c7-b8cc-4ce2-8a48-4b380ab225c0/manager/0.log" Feb 19 16:46:12 crc kubenswrapper[4810]: I0219 16:46:12.210423 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-595d5f7545-vfb4c_3d62866f-b047-419d-8eb0-848b0df84e63/webhook-server/0.log" Feb 19 16:46:12 crc kubenswrapper[4810]: I0219 16:46:12.530388 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-hllgd_c9d97974-67d2-42e5-89fe-b6db106a47c4/kube-rbac-proxy/0.log" Feb 19 16:46:12 crc kubenswrapper[4810]: I0219 16:46:12.533096 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/frr/0.log" Feb 19 16:46:12 crc kubenswrapper[4810]: I0219 16:46:12.751554 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-hllgd_c9d97974-67d2-42e5-89fe-b6db106a47c4/speaker/0.log" Feb 19 16:46:18 crc kubenswrapper[4810]: I0219 16:46:18.439622 4810 scope.go:117] "RemoveContainer" containerID="63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e" Feb 19 16:46:18 crc kubenswrapper[4810]: E0219 16:46:18.440562 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:46:29 crc kubenswrapper[4810]: I0219 16:46:29.168456 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z_e54510d9-7d24-47bb-a55e-b50e7cff9fba/util/0.log" Feb 19 16:46:29 crc kubenswrapper[4810]: I0219 16:46:29.384183 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z_e54510d9-7d24-47bb-a55e-b50e7cff9fba/util/0.log" Feb 19 16:46:29 crc kubenswrapper[4810]: I0219 16:46:29.390832 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z_e54510d9-7d24-47bb-a55e-b50e7cff9fba/pull/0.log" Feb 19 16:46:29 crc kubenswrapper[4810]: I0219 16:46:29.408915 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z_e54510d9-7d24-47bb-a55e-b50e7cff9fba/pull/0.log" Feb 19 16:46:29 crc kubenswrapper[4810]: I0219 16:46:29.439127 4810 scope.go:117] "RemoveContainer" containerID="63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e" Feb 19 16:46:29 crc kubenswrapper[4810]: E0219 16:46:29.439389 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:46:29 crc kubenswrapper[4810]: I0219 16:46:29.585362 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z_e54510d9-7d24-47bb-a55e-b50e7cff9fba/extract/0.log" Feb 19 16:46:29 crc kubenswrapper[4810]: I0219 16:46:29.588470 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z_e54510d9-7d24-47bb-a55e-b50e7cff9fba/util/0.log" Feb 19 16:46:29 crc kubenswrapper[4810]: I0219 16:46:29.618618 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z_e54510d9-7d24-47bb-a55e-b50e7cff9fba/pull/0.log" Feb 19 16:46:29 crc kubenswrapper[4810]: I0219 16:46:29.787084 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb_a861f8a3-be34-4fc0-96cb-42502d0a3bab/util/0.log" Feb 19 16:46:29 crc kubenswrapper[4810]: I0219 16:46:29.947857 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb_a861f8a3-be34-4fc0-96cb-42502d0a3bab/pull/0.log" Feb 19 16:46:29 crc kubenswrapper[4810]: I0219 16:46:29.957679 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb_a861f8a3-be34-4fc0-96cb-42502d0a3bab/util/0.log" Feb 19 16:46:29 crc kubenswrapper[4810]: I0219 16:46:29.968725 4810 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb_a861f8a3-be34-4fc0-96cb-42502d0a3bab/pull/0.log" Feb 19 16:46:30 crc kubenswrapper[4810]: I0219 16:46:30.111150 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb_a861f8a3-be34-4fc0-96cb-42502d0a3bab/util/0.log" Feb 19 16:46:30 crc kubenswrapper[4810]: I0219 16:46:30.121607 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb_a861f8a3-be34-4fc0-96cb-42502d0a3bab/pull/0.log" Feb 19 16:46:30 crc kubenswrapper[4810]: I0219 16:46:30.139817 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb_a861f8a3-be34-4fc0-96cb-42502d0a3bab/extract/0.log" Feb 19 16:46:30 crc kubenswrapper[4810]: I0219 16:46:30.304372 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5n9gc_6dde1ea5-68be-4851-8816-3c7302dc2579/extract-utilities/0.log" Feb 19 16:46:30 crc kubenswrapper[4810]: I0219 16:46:30.488626 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5n9gc_6dde1ea5-68be-4851-8816-3c7302dc2579/extract-utilities/0.log" Feb 19 16:46:30 crc kubenswrapper[4810]: I0219 16:46:30.488645 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5n9gc_6dde1ea5-68be-4851-8816-3c7302dc2579/extract-content/0.log" Feb 19 16:46:30 crc kubenswrapper[4810]: I0219 16:46:30.519407 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5n9gc_6dde1ea5-68be-4851-8816-3c7302dc2579/extract-content/0.log" Feb 19 16:46:30 crc kubenswrapper[4810]: I0219 16:46:30.637741 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5n9gc_6dde1ea5-68be-4851-8816-3c7302dc2579/extract-content/0.log" Feb 19 16:46:30 crc kubenswrapper[4810]: I0219 16:46:30.642699 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5n9gc_6dde1ea5-68be-4851-8816-3c7302dc2579/extract-utilities/0.log" Feb 19 16:46:30 crc kubenswrapper[4810]: I0219 16:46:30.818973 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5n9gc_6dde1ea5-68be-4851-8816-3c7302dc2579/registry-server/0.log" Feb 19 16:46:30 crc kubenswrapper[4810]: I0219 16:46:30.854425 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-78p9d_4070bcdc-bd83-4c82-920b-8cd10671c498/extract-utilities/0.log" Feb 19 16:46:30 crc kubenswrapper[4810]: I0219 16:46:30.993104 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-78p9d_4070bcdc-bd83-4c82-920b-8cd10671c498/extract-content/0.log" Feb 19 16:46:31 crc kubenswrapper[4810]: I0219 16:46:31.029534 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-78p9d_4070bcdc-bd83-4c82-920b-8cd10671c498/extract-utilities/0.log" Feb 19 16:46:31 crc kubenswrapper[4810]: I0219 16:46:31.040889 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-78p9d_4070bcdc-bd83-4c82-920b-8cd10671c498/extract-content/0.log" Feb 19 16:46:31 
crc kubenswrapper[4810]: I0219 16:46:31.234770 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-78p9d_4070bcdc-bd83-4c82-920b-8cd10671c498/extract-utilities/0.log" Feb 19 16:46:31 crc kubenswrapper[4810]: I0219 16:46:31.253473 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-78p9d_4070bcdc-bd83-4c82-920b-8cd10671c498/extract-content/0.log" Feb 19 16:46:31 crc kubenswrapper[4810]: I0219 16:46:31.441776 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578_815191f4-9d3a-4003-a32f-de4f76c9c15f/util/0.log" Feb 19 16:46:31 crc kubenswrapper[4810]: I0219 16:46:31.663453 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578_815191f4-9d3a-4003-a32f-de4f76c9c15f/util/0.log" Feb 19 16:46:31 crc kubenswrapper[4810]: I0219 16:46:31.760914 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-78p9d_4070bcdc-bd83-4c82-920b-8cd10671c498/registry-server/0.log" Feb 19 16:46:31 crc kubenswrapper[4810]: I0219 16:46:31.764860 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578_815191f4-9d3a-4003-a32f-de4f76c9c15f/pull/0.log" Feb 19 16:46:31 crc kubenswrapper[4810]: I0219 16:46:31.765105 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578_815191f4-9d3a-4003-a32f-de4f76c9c15f/pull/0.log" Feb 19 16:46:31 crc kubenswrapper[4810]: I0219 16:46:31.953625 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578_815191f4-9d3a-4003-a32f-de4f76c9c15f/util/0.log" Feb 19 16:46:31 crc kubenswrapper[4810]: I0219 16:46:31.975966 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578_815191f4-9d3a-4003-a32f-de4f76c9c15f/pull/0.log" Feb 19 16:46:31 crc kubenswrapper[4810]: I0219 16:46:31.994823 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578_815191f4-9d3a-4003-a32f-de4f76c9c15f/extract/0.log" Feb 19 16:46:32 crc kubenswrapper[4810]: I0219 16:46:32.138863 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-sm9wk_41d27e40-a89e-4fd6-8106-824c5a257f25/marketplace-operator/0.log" Feb 19 16:46:32 crc kubenswrapper[4810]: I0219 16:46:32.214994 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tgtg8_b3c2bc60-712d-4ef6-b461-ad683f51f2e4/extract-utilities/0.log" Feb 19 16:46:32 crc kubenswrapper[4810]: I0219 16:46:32.499800 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tgtg8_b3c2bc60-712d-4ef6-b461-ad683f51f2e4/extract-content/0.log" Feb 19 16:46:32 crc kubenswrapper[4810]: I0219 16:46:32.628882 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tgtg8_b3c2bc60-712d-4ef6-b461-ad683f51f2e4/extract-utilities/0.log" Feb 19 16:46:32 crc kubenswrapper[4810]: I0219 16:46:32.719997 4810 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tgtg8_b3c2bc60-712d-4ef6-b461-ad683f51f2e4/extract-content/0.log" Feb 19 16:46:32 crc kubenswrapper[4810]: I0219 16:46:32.869589 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tgtg8_b3c2bc60-712d-4ef6-b461-ad683f51f2e4/extract-content/0.log" Feb 19 16:46:32 crc kubenswrapper[4810]: I0219 16:46:32.870223 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tgtg8_b3c2bc60-712d-4ef6-b461-ad683f51f2e4/extract-utilities/0.log" Feb 19 16:46:33 crc kubenswrapper[4810]: I0219 16:46:33.075299 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tgtg8_b3c2bc60-712d-4ef6-b461-ad683f51f2e4/registry-server/0.log" Feb 19 16:46:33 crc kubenswrapper[4810]: I0219 16:46:33.122208 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bwbpv_05b0324a-36c1-419b-8bdd-e41ad42a6a3f/extract-utilities/0.log" Feb 19 16:46:33 crc kubenswrapper[4810]: I0219 16:46:33.314710 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bwbpv_05b0324a-36c1-419b-8bdd-e41ad42a6a3f/extract-utilities/0.log" Feb 19 16:46:33 crc kubenswrapper[4810]: I0219 16:46:33.359700 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bwbpv_05b0324a-36c1-419b-8bdd-e41ad42a6a3f/extract-content/0.log" Feb 19 16:46:33 crc kubenswrapper[4810]: I0219 16:46:33.403062 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bwbpv_05b0324a-36c1-419b-8bdd-e41ad42a6a3f/extract-content/0.log" Feb 19 16:46:33 crc kubenswrapper[4810]: I0219 16:46:33.509398 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bwbpv_05b0324a-36c1-419b-8bdd-e41ad42a6a3f/extract-utilities/0.log" Feb 19 16:46:33 crc kubenswrapper[4810]: I0219 16:46:33.530802 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bwbpv_05b0324a-36c1-419b-8bdd-e41ad42a6a3f/extract-content/0.log" Feb 19 16:46:33 crc kubenswrapper[4810]: I0219 16:46:33.840626 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bwbpv_05b0324a-36c1-419b-8bdd-e41ad42a6a3f/registry-server/0.log" Feb 19 16:46:41 crc kubenswrapper[4810]: I0219 16:46:41.454148 4810 scope.go:117] "RemoveContainer" containerID="63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e" Feb 19 16:46:41 crc kubenswrapper[4810]: E0219 16:46:41.457832 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:46:48 crc kubenswrapper[4810]: I0219 16:46:48.229146 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6d6bf9b6-qpvdt_d5debcf2-9629-4bb2-9133-f4b81748ff7d/prometheus-operator-admission-webhook/0.log" Feb 19 16:46:48 crc kubenswrapper[4810]: I0219 16:46:48.310057 4810 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6d6bf9b6-vrfvr_408628c0-0b2c-48f9-b849-ee1b124499e1/prometheus-operator-admission-webhook/0.log" Feb 19 16:46:48 crc kubenswrapper[4810]: I0219 16:46:48.328237 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-6jkkz_1656f52d-7771-4bbb-9642-b296d16b791e/prometheus-operator/0.log" Feb 19 16:46:48 crc kubenswrapper[4810]: I0219 16:46:48.416589 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-dk9c4_8bdf030e-92d8-45dc-ab6c-a7b241444677/operator/0.log" Feb 19 16:46:48 crc kubenswrapper[4810]: I0219 16:46:48.501763 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-2fdxm_c5968625-c396-4ae0-9846-c2ceb6baf655/perses-operator/0.log" Feb 19 16:46:53 crc kubenswrapper[4810]: I0219 16:46:53.439581 4810 scope.go:117] "RemoveContainer" containerID="63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e" Feb 19 16:46:53 crc kubenswrapper[4810]: E0219 16:46:53.440452 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:47:06 crc kubenswrapper[4810]: I0219 16:47:06.439140 4810 scope.go:117] "RemoveContainer" containerID="63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e" Feb 19 16:47:06 crc kubenswrapper[4810]: E0219 16:47:06.439896 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:47:18 crc kubenswrapper[4810]: I0219 16:47:18.440051 4810 scope.go:117] "RemoveContainer" containerID="63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e" Feb 19 16:47:18 crc kubenswrapper[4810]: E0219 16:47:18.441001 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:47:32 crc kubenswrapper[4810]: I0219 16:47:32.439463 4810 scope.go:117] "RemoveContainer" containerID="63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e" Feb 19 16:47:32 crc kubenswrapper[4810]: E0219 16:47:32.440584 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:47:44 crc kubenswrapper[4810]: I0219 16:47:44.439591 4810 scope.go:117] "RemoveContainer" containerID="63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e" Feb 19 16:47:44 crc kubenswrapper[4810]: E0219 16:47:44.440638 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:47:58 crc kubenswrapper[4810]: I0219 16:47:58.439372 4810 scope.go:117] "RemoveContainer" containerID="63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e" Feb 19 16:47:58 crc kubenswrapper[4810]: E0219 16:47:58.440214 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:48:13 crc kubenswrapper[4810]: I0219 16:48:13.446809 4810 scope.go:117] "RemoveContainer" containerID="63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e" Feb 19 16:48:13 crc kubenswrapper[4810]: E0219 16:48:13.447706 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:48:26 crc kubenswrapper[4810]: I0219 16:48:26.440317 4810 scope.go:117] "RemoveContainer" containerID="63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e" Feb 19 16:48:26 crc kubenswrapper[4810]: E0219 16:48:26.441406 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:48:37 crc kubenswrapper[4810]: I0219 16:48:37.439509 4810 scope.go:117] "RemoveContainer" containerID="63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e" Feb 19 16:48:37 crc kubenswrapper[4810]: E0219 16:48:37.440748 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:48:48 crc kubenswrapper[4810]: I0219 16:48:48.440406 4810 
scope.go:117] "RemoveContainer" containerID="63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e" Feb 19 16:48:48 crc kubenswrapper[4810]: E0219 16:48:48.442108 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:48:49 crc kubenswrapper[4810]: I0219 16:48:49.017886 4810 generic.go:334] "Generic (PLEG): container finished" podID="2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994" containerID="17449341bfb55cd42e3f6347fa4879ccfdfca14899e053d20de9d0995b28213d" exitCode=0 Feb 19 16:48:49 crc kubenswrapper[4810]: I0219 16:48:49.017963 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mt7ln/must-gather-xtkm7" event={"ID":"2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994","Type":"ContainerDied","Data":"17449341bfb55cd42e3f6347fa4879ccfdfca14899e053d20de9d0995b28213d"} Feb 19 16:48:49 crc kubenswrapper[4810]: I0219 16:48:49.018976 4810 scope.go:117] "RemoveContainer" containerID="17449341bfb55cd42e3f6347fa4879ccfdfca14899e053d20de9d0995b28213d" Feb 19 16:48:49 crc kubenswrapper[4810]: I0219 16:48:49.421578 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mt7ln_must-gather-xtkm7_2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994/gather/0.log" Feb 19 16:48:57 crc kubenswrapper[4810]: I0219 16:48:57.724007 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mt7ln/must-gather-xtkm7"] Feb 19 16:48:57 crc kubenswrapper[4810]: I0219 16:48:57.724653 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-mt7ln/must-gather-xtkm7" podUID="2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994" containerName="copy" containerID="cri-o://61e6a470548c19ba5e113ede1e2ed5b363e40d072f7a28ac5b76e32667d06ba8" gracePeriod=2 Feb 19 16:48:57 crc kubenswrapper[4810]: I0219 16:48:57.737082 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mt7ln/must-gather-xtkm7"] Feb 19 16:48:58 crc kubenswrapper[4810]: I0219 16:48:58.148258 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mt7ln_must-gather-xtkm7_2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994/copy/0.log" Feb 19 16:48:58 crc kubenswrapper[4810]: I0219 16:48:58.149737 4810 generic.go:334] "Generic (PLEG): container finished" podID="2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994" containerID="61e6a470548c19ba5e113ede1e2ed5b363e40d072f7a28ac5b76e32667d06ba8" exitCode=143 Feb 19 16:48:58 crc kubenswrapper[4810]: I0219 16:48:58.149796 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="151c67b89f6c96a03e2c74c97afaae12904a2f90d765b36355e67b20a49b85e4" Feb 19 16:48:58 crc kubenswrapper[4810]: I0219 16:48:58.170726 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mt7ln_must-gather-xtkm7_2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994/copy/0.log" Feb 19 16:48:58 crc kubenswrapper[4810]: I0219 16:48:58.171343 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mt7ln/must-gather-xtkm7" Feb 19 16:48:58 crc kubenswrapper[4810]: I0219 16:48:58.253538 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994-must-gather-output\") pod \"2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994\" (UID: \"2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994\") " Feb 19 16:48:58 crc kubenswrapper[4810]: I0219 16:48:58.253763 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67dql\" (UniqueName: \"kubernetes.io/projected/2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994-kube-api-access-67dql\") pod \"2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994\" (UID: \"2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994\") " Feb 19 16:48:58 crc kubenswrapper[4810]: I0219 16:48:58.260238 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994-kube-api-access-67dql" (OuterVolumeSpecName: "kube-api-access-67dql") pod "2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994" (UID: "2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994"). InnerVolumeSpecName "kube-api-access-67dql". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:48:58 crc kubenswrapper[4810]: I0219 16:48:58.357006 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67dql\" (UniqueName: \"kubernetes.io/projected/2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994-kube-api-access-67dql\") on node \"crc\" DevicePath \"\"" Feb 19 16:48:58 crc kubenswrapper[4810]: I0219 16:48:58.450847 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994" (UID: "2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:48:58 crc kubenswrapper[4810]: I0219 16:48:58.458642 4810 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 19 16:48:59 crc kubenswrapper[4810]: I0219 16:48:59.159765 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mt7ln/must-gather-xtkm7" Feb 19 16:48:59 crc kubenswrapper[4810]: I0219 16:48:59.449855 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994" path="/var/lib/kubelet/pods/2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994/volumes" Feb 19 16:49:03 crc kubenswrapper[4810]: I0219 16:49:03.440021 4810 scope.go:117] "RemoveContainer" containerID="63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e" Feb 19 16:49:03 crc kubenswrapper[4810]: E0219 16:49:03.440775 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:49:12 crc kubenswrapper[4810]: I0219 16:49:12.089859 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wtwq9"] Feb 19 16:49:12 crc kubenswrapper[4810]: E0219 16:49:12.091146 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994" containerName="gather" Feb 19 16:49:12 crc kubenswrapper[4810]: I0219 16:49:12.091169 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994" containerName="gather" Feb 19 16:49:12 crc kubenswrapper[4810]: E0219 16:49:12.091228 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="642d0068-2155-46d5-85c6-d4f70d142f81" containerName="collect-profiles" Feb 19 16:49:12 crc kubenswrapper[4810]: I0219 16:49:12.091241 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="642d0068-2155-46d5-85c6-d4f70d142f81" containerName="collect-profiles" Feb 19 16:49:12 crc kubenswrapper[4810]: E0219 16:49:12.091272 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994" containerName="copy" Feb 19 16:49:12 crc kubenswrapper[4810]: I0219 16:49:12.091285 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994" containerName="copy" Feb 19 16:49:12 crc kubenswrapper[4810]: I0219 16:49:12.091685 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994" containerName="copy" Feb 19 16:49:12 crc kubenswrapper[4810]: I0219 16:49:12.091712 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d8b3bd1-400c-4da2-a6d6-efa7d8bbd994" containerName="gather" Feb 19 16:49:12 crc kubenswrapper[4810]: I0219 16:49:12.091742 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="642d0068-2155-46d5-85c6-d4f70d142f81" containerName="collect-profiles" Feb 19 16:49:12 crc kubenswrapper[4810]: I0219 16:49:12.094275 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wtwq9" Feb 19 16:49:12 crc kubenswrapper[4810]: I0219 16:49:12.128685 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wtwq9"] Feb 19 16:49:12 crc kubenswrapper[4810]: I0219 16:49:12.192314 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1-catalog-content\") pod \"community-operators-wtwq9\" (UID: \"7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1\") " pod="openshift-marketplace/community-operators-wtwq9" Feb 19 16:49:12 crc kubenswrapper[4810]: I0219 16:49:12.192659 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z52wb\" (UniqueName: \"kubernetes.io/projected/7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1-kube-api-access-z52wb\") pod \"community-operators-wtwq9\" (UID: \"7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1\") " pod="openshift-marketplace/community-operators-wtwq9" Feb 19 16:49:12 crc kubenswrapper[4810]: I0219 16:49:12.192826 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1-utilities\") pod \"community-operators-wtwq9\" (UID: \"7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1\") " pod="openshift-marketplace/community-operators-wtwq9" Feb 19 16:49:12 crc kubenswrapper[4810]: I0219 16:49:12.295076 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z52wb\" (UniqueName: \"kubernetes.io/projected/7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1-kube-api-access-z52wb\") pod \"community-operators-wtwq9\" (UID: \"7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1\") " pod="openshift-marketplace/community-operators-wtwq9" Feb 19 16:49:12 crc kubenswrapper[4810]: I0219 16:49:12.295221 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1-utilities\") pod \"community-operators-wtwq9\" (UID: \"7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1\") " pod="openshift-marketplace/community-operators-wtwq9" Feb 19 16:49:12 crc kubenswrapper[4810]: I0219 16:49:12.295286 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1-catalog-content\") pod \"community-operators-wtwq9\" (UID: \"7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1\") " pod="openshift-marketplace/community-operators-wtwq9" Feb 19 16:49:12 crc kubenswrapper[4810]: I0219 16:49:12.295935 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1-catalog-content\") pod \"community-operators-wtwq9\" (UID: \"7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1\") " pod="openshift-marketplace/community-operators-wtwq9" Feb 19 16:49:12 crc kubenswrapper[4810]: I0219 16:49:12.296230 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1-utilities\") pod \"community-operators-wtwq9\" (UID: \"7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1\") " pod="openshift-marketplace/community-operators-wtwq9" Feb 19 16:49:12 crc kubenswrapper[4810]: I0219 16:49:12.319228 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-z52wb\" (UniqueName: \"kubernetes.io/projected/7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1-kube-api-access-z52wb\") pod \"community-operators-wtwq9\" (UID: \"7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1\") " pod="openshift-marketplace/community-operators-wtwq9" Feb 19 16:49:12 crc kubenswrapper[4810]: I0219 16:49:12.431676 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wtwq9" Feb 19 16:49:13 crc kubenswrapper[4810]: I0219 16:49:13.018779 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wtwq9"] Feb 19 16:49:13 crc kubenswrapper[4810]: I0219 16:49:13.332629 4810 generic.go:334] "Generic (PLEG): container finished" podID="7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1" containerID="3875e687f2e4bee41b307a420054102f3b0c17cfc118c7ff51e926bdf160d8f6" exitCode=0 Feb 19 16:49:13 crc kubenswrapper[4810]: I0219 16:49:13.332702 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wtwq9" event={"ID":"7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1","Type":"ContainerDied","Data":"3875e687f2e4bee41b307a420054102f3b0c17cfc118c7ff51e926bdf160d8f6"} Feb 19 16:49:13 crc kubenswrapper[4810]: I0219 16:49:13.332766 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wtwq9" event={"ID":"7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1","Type":"ContainerStarted","Data":"e4ff98de98e21960026f1b09a68015f3b8c43e525f6b3c17a4c5640b9e54426f"} Feb 19 16:49:13 crc kubenswrapper[4810]: I0219 16:49:13.335635 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 16:49:14 crc kubenswrapper[4810]: I0219 16:49:14.439948 4810 scope.go:117] "RemoveContainer" containerID="63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e" Feb 19 16:49:14 crc kubenswrapper[4810]: E0219 16:49:14.440300 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:49:18 crc kubenswrapper[4810]: I0219 16:49:18.408010 4810 generic.go:334] "Generic (PLEG): container finished" podID="7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1" containerID="f7b8c338c99efffac00e60cf312b5fdc213c42aa45d098f512cfebbd165281ab" exitCode=0 Feb 19 16:49:18 crc kubenswrapper[4810]: I0219 16:49:18.408656 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wtwq9" event={"ID":"7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1","Type":"ContainerDied","Data":"f7b8c338c99efffac00e60cf312b5fdc213c42aa45d098f512cfebbd165281ab"} Feb 19 16:49:19 crc kubenswrapper[4810]: I0219 16:49:19.422185 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wtwq9" event={"ID":"7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1","Type":"ContainerStarted","Data":"56a97f096bb3adbb5160b221a59b41089911f3a603720ba682acc8f52f5df050"} Feb 19 16:49:19 crc kubenswrapper[4810]: I0219 16:49:19.454419 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wtwq9" podStartSLOduration=1.94934166 
podStartE2EDuration="7.454395566s" podCreationTimestamp="2026-02-19 16:49:12 +0000 UTC" firstStartedPulling="2026-02-19 16:49:13.335147406 +0000 UTC m=+5982.817177560" lastFinishedPulling="2026-02-19 16:49:18.840201342 +0000 UTC m=+5988.322231466" observedRunningTime="2026-02-19 16:49:19.449352482 +0000 UTC m=+5988.931382646" watchObservedRunningTime="2026-02-19 16:49:19.454395566 +0000 UTC m=+5988.936425700" Feb 19 16:49:22 crc kubenswrapper[4810]: I0219 16:49:22.432370 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wtwq9" Feb 19 16:49:22 crc kubenswrapper[4810]: I0219 16:49:22.433038 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wtwq9" Feb 19 16:49:22 crc kubenswrapper[4810]: I0219 16:49:22.505689 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wtwq9" Feb 19 16:49:28 crc kubenswrapper[4810]: I0219 16:49:28.439720 4810 scope.go:117] "RemoveContainer" containerID="63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e" Feb 19 16:49:28 crc kubenswrapper[4810]: E0219 16:49:28.440595 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:49:32 crc kubenswrapper[4810]: I0219 16:49:32.528662 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wtwq9" Feb 19 16:49:32 crc kubenswrapper[4810]: I0219 16:49:32.628956 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wtwq9"] Feb 19 16:49:32 crc kubenswrapper[4810]: I0219 16:49:32.671656 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-78p9d"] Feb 19 16:49:32 crc kubenswrapper[4810]: I0219 16:49:32.671999 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-78p9d" podUID="4070bcdc-bd83-4c82-920b-8cd10671c498" containerName="registry-server" containerID="cri-o://2d183229f511b7d5a8cb0fb396e7514676855b732318632722a576d32921f68f" gracePeriod=2 Feb 19 16:49:33 crc kubenswrapper[4810]: I0219 16:49:33.132097 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-78p9d" Feb 19 16:49:33 crc kubenswrapper[4810]: I0219 16:49:33.243955 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4070bcdc-bd83-4c82-920b-8cd10671c498-utilities\") pod \"4070bcdc-bd83-4c82-920b-8cd10671c498\" (UID: \"4070bcdc-bd83-4c82-920b-8cd10671c498\") " Feb 19 16:49:33 crc kubenswrapper[4810]: I0219 16:49:33.244030 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88slz\" (UniqueName: \"kubernetes.io/projected/4070bcdc-bd83-4c82-920b-8cd10671c498-kube-api-access-88slz\") pod \"4070bcdc-bd83-4c82-920b-8cd10671c498\" (UID: \"4070bcdc-bd83-4c82-920b-8cd10671c498\") " Feb 19 16:49:33 crc kubenswrapper[4810]: I0219 16:49:33.244105 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4070bcdc-bd83-4c82-920b-8cd10671c498-catalog-content\") pod \"4070bcdc-bd83-4c82-920b-8cd10671c498\" (UID: \"4070bcdc-bd83-4c82-920b-8cd10671c498\") " Feb 19 16:49:33 crc kubenswrapper[4810]: I0219 16:49:33.244582 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4070bcdc-bd83-4c82-920b-8cd10671c498-utilities" (OuterVolumeSpecName: "utilities") pod "4070bcdc-bd83-4c82-920b-8cd10671c498" (UID: "4070bcdc-bd83-4c82-920b-8cd10671c498"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:49:33 crc kubenswrapper[4810]: I0219 16:49:33.250426 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4070bcdc-bd83-4c82-920b-8cd10671c498-kube-api-access-88slz" (OuterVolumeSpecName: "kube-api-access-88slz") pod "4070bcdc-bd83-4c82-920b-8cd10671c498" (UID: "4070bcdc-bd83-4c82-920b-8cd10671c498"). InnerVolumeSpecName "kube-api-access-88slz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:49:33 crc kubenswrapper[4810]: I0219 16:49:33.300538 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4070bcdc-bd83-4c82-920b-8cd10671c498-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4070bcdc-bd83-4c82-920b-8cd10671c498" (UID: "4070bcdc-bd83-4c82-920b-8cd10671c498"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:49:33 crc kubenswrapper[4810]: I0219 16:49:33.347184 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4070bcdc-bd83-4c82-920b-8cd10671c498-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 16:49:33 crc kubenswrapper[4810]: I0219 16:49:33.347234 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4070bcdc-bd83-4c82-920b-8cd10671c498-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 16:49:33 crc kubenswrapper[4810]: I0219 16:49:33.347249 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88slz\" (UniqueName: \"kubernetes.io/projected/4070bcdc-bd83-4c82-920b-8cd10671c498-kube-api-access-88slz\") on node \"crc\" DevicePath \"\"" Feb 19 16:49:33 crc kubenswrapper[4810]: I0219 16:49:33.602492 4810 generic.go:334] "Generic (PLEG): container finished" podID="4070bcdc-bd83-4c82-920b-8cd10671c498" containerID="2d183229f511b7d5a8cb0fb396e7514676855b732318632722a576d32921f68f" exitCode=0 Feb 19 16:49:33 crc kubenswrapper[4810]: I0219 16:49:33.602552 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78p9d" event={"ID":"4070bcdc-bd83-4c82-920b-8cd10671c498","Type":"ContainerDied","Data":"2d183229f511b7d5a8cb0fb396e7514676855b732318632722a576d32921f68f"} Feb 19 16:49:33 crc kubenswrapper[4810]: I0219 16:49:33.602589 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78p9d" event={"ID":"4070bcdc-bd83-4c82-920b-8cd10671c498","Type":"ContainerDied","Data":"0c95ccc80153fc2aa4312b67f7d6c9a87f9473946f133fded513db482a269be4"} Feb 19 16:49:33 crc kubenswrapper[4810]: I0219 16:49:33.602590 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-78p9d" Feb 19 16:49:33 crc kubenswrapper[4810]: I0219 16:49:33.602611 4810 scope.go:117] "RemoveContainer" containerID="2d183229f511b7d5a8cb0fb396e7514676855b732318632722a576d32921f68f" Feb 19 16:49:33 crc kubenswrapper[4810]: I0219 16:49:33.624068 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-78p9d"] Feb 19 16:49:33 crc kubenswrapper[4810]: I0219 16:49:33.626836 4810 scope.go:117] "RemoveContainer" containerID="702b9f8e53344f3c21f1cd920ff8a37a750c3a8370aa348e1d772d912bfa2ac0" Feb 19 16:49:33 crc kubenswrapper[4810]: I0219 16:49:33.632949 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-78p9d"] Feb 19 16:49:33 crc kubenswrapper[4810]: I0219 16:49:33.652336 4810 scope.go:117] "RemoveContainer" containerID="ab74f7f85dc91d920ea16df4b6ea4adb1452fa6de4e5f98251385b261184b652" Feb 19 16:49:33 crc kubenswrapper[4810]: I0219 16:49:33.719677 4810 scope.go:117] "RemoveContainer" containerID="2d183229f511b7d5a8cb0fb396e7514676855b732318632722a576d32921f68f" Feb 19 16:49:33 crc kubenswrapper[4810]: E0219 16:49:33.720164 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d183229f511b7d5a8cb0fb396e7514676855b732318632722a576d32921f68f\": container with ID starting with 2d183229f511b7d5a8cb0fb396e7514676855b732318632722a576d32921f68f not found: ID does not exist" containerID="2d183229f511b7d5a8cb0fb396e7514676855b732318632722a576d32921f68f" Feb 19 16:49:33 crc kubenswrapper[4810]: I0219 16:49:33.720206 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d183229f511b7d5a8cb0fb396e7514676855b732318632722a576d32921f68f"} err="failed to get container status \"2d183229f511b7d5a8cb0fb396e7514676855b732318632722a576d32921f68f\": rpc error: code = NotFound desc = could not find container \"2d183229f511b7d5a8cb0fb396e7514676855b732318632722a576d32921f68f\": container with ID starting with 2d183229f511b7d5a8cb0fb396e7514676855b732318632722a576d32921f68f not found: ID does not exist" Feb 19 16:49:33 crc kubenswrapper[4810]: I0219 16:49:33.720231 4810 scope.go:117] "RemoveContainer" containerID="702b9f8e53344f3c21f1cd920ff8a37a750c3a8370aa348e1d772d912bfa2ac0" Feb 19 16:49:33 crc kubenswrapper[4810]: E0219 16:49:33.720678 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"702b9f8e53344f3c21f1cd920ff8a37a750c3a8370aa348e1d772d912bfa2ac0\": container with ID starting with 702b9f8e53344f3c21f1cd920ff8a37a750c3a8370aa348e1d772d912bfa2ac0 not found: ID does not exist" containerID="702b9f8e53344f3c21f1cd920ff8a37a750c3a8370aa348e1d772d912bfa2ac0" Feb 19 16:49:33 crc kubenswrapper[4810]: I0219 16:49:33.720705 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"702b9f8e53344f3c21f1cd920ff8a37a750c3a8370aa348e1d772d912bfa2ac0"} err="failed to get container status \"702b9f8e53344f3c21f1cd920ff8a37a750c3a8370aa348e1d772d912bfa2ac0\": rpc error: code = NotFound desc = could not find container \"702b9f8e53344f3c21f1cd920ff8a37a750c3a8370aa348e1d772d912bfa2ac0\": container with ID starting with 702b9f8e53344f3c21f1cd920ff8a37a750c3a8370aa348e1d772d912bfa2ac0 not found: ID does not exist" Feb 19 16:49:33 crc kubenswrapper[4810]: I0219 16:49:33.720728 4810 scope.go:117] "RemoveContainer" 
containerID="ab74f7f85dc91d920ea16df4b6ea4adb1452fa6de4e5f98251385b261184b652" Feb 19 16:49:33 crc kubenswrapper[4810]: E0219 16:49:33.721028 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab74f7f85dc91d920ea16df4b6ea4adb1452fa6de4e5f98251385b261184b652\": container with ID starting with ab74f7f85dc91d920ea16df4b6ea4adb1452fa6de4e5f98251385b261184b652 not found: ID does not exist" containerID="ab74f7f85dc91d920ea16df4b6ea4adb1452fa6de4e5f98251385b261184b652" Feb 19 16:49:33 crc kubenswrapper[4810]: I0219 16:49:33.721086 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab74f7f85dc91d920ea16df4b6ea4adb1452fa6de4e5f98251385b261184b652"} err="failed to get container status \"ab74f7f85dc91d920ea16df4b6ea4adb1452fa6de4e5f98251385b261184b652\": rpc error: code = NotFound desc = could not find container \"ab74f7f85dc91d920ea16df4b6ea4adb1452fa6de4e5f98251385b261184b652\": container with ID starting with ab74f7f85dc91d920ea16df4b6ea4adb1452fa6de4e5f98251385b261184b652 not found: ID does not exist" Feb 19 16:49:35 crc kubenswrapper[4810]: I0219 16:49:35.455732 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4070bcdc-bd83-4c82-920b-8cd10671c498" path="/var/lib/kubelet/pods/4070bcdc-bd83-4c82-920b-8cd10671c498/volumes" Feb 19 16:49:42 crc kubenswrapper[4810]: I0219 16:49:42.441902 4810 scope.go:117] "RemoveContainer" containerID="63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e" Feb 19 16:49:42 crc kubenswrapper[4810]: E0219 16:49:42.443687 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:49:45 crc kubenswrapper[4810]: I0219 16:49:45.204919 4810 scope.go:117] "RemoveContainer" containerID="61e6a470548c19ba5e113ede1e2ed5b363e40d072f7a28ac5b76e32667d06ba8" Feb 19 16:49:45 crc kubenswrapper[4810]: I0219 16:49:45.236319 4810 scope.go:117] "RemoveContainer" containerID="17449341bfb55cd42e3f6347fa4879ccfdfca14899e053d20de9d0995b28213d" Feb 19 16:49:45 crc kubenswrapper[4810]: I0219 16:49:45.342022 4810 scope.go:117] "RemoveContainer" containerID="8eab70b763bfc660b31c0a60ab6d66eddd4b83e4bf6e5028c9b6b68f68641100" Feb 19 16:49:56 crc kubenswrapper[4810]: I0219 16:49:56.273726 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6xmxg"] Feb 19 16:49:56 crc kubenswrapper[4810]: E0219 16:49:56.275202 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4070bcdc-bd83-4c82-920b-8cd10671c498" containerName="extract-content" Feb 19 16:49:56 crc kubenswrapper[4810]: I0219 16:49:56.275224 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="4070bcdc-bd83-4c82-920b-8cd10671c498" containerName="extract-content" Feb 19 16:49:56 crc kubenswrapper[4810]: E0219 16:49:56.275272 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4070bcdc-bd83-4c82-920b-8cd10671c498" containerName="registry-server" Feb 19 16:49:56 crc kubenswrapper[4810]: I0219 16:49:56.275284 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="4070bcdc-bd83-4c82-920b-8cd10671c498" 
containerName="registry-server" Feb 19 16:49:56 crc kubenswrapper[4810]: E0219 16:49:56.275319 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4070bcdc-bd83-4c82-920b-8cd10671c498" containerName="extract-utilities" Feb 19 16:49:56 crc kubenswrapper[4810]: I0219 16:49:56.275354 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="4070bcdc-bd83-4c82-920b-8cd10671c498" containerName="extract-utilities" Feb 19 16:49:56 crc kubenswrapper[4810]: I0219 16:49:56.275711 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="4070bcdc-bd83-4c82-920b-8cd10671c498" containerName="registry-server" Feb 19 16:49:56 crc kubenswrapper[4810]: I0219 16:49:56.278541 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6xmxg" Feb 19 16:49:56 crc kubenswrapper[4810]: I0219 16:49:56.296536 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6xmxg"] Feb 19 16:49:56 crc kubenswrapper[4810]: I0219 16:49:56.328352 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz44j\" (UniqueName: \"kubernetes.io/projected/e82d7377-faf1-409d-bc2e-581cbdcd1f6d-kube-api-access-lz44j\") pod \"redhat-operators-6xmxg\" (UID: \"e82d7377-faf1-409d-bc2e-581cbdcd1f6d\") " pod="openshift-marketplace/redhat-operators-6xmxg" Feb 19 16:49:56 crc kubenswrapper[4810]: I0219 16:49:56.328426 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e82d7377-faf1-409d-bc2e-581cbdcd1f6d-catalog-content\") pod \"redhat-operators-6xmxg\" (UID: \"e82d7377-faf1-409d-bc2e-581cbdcd1f6d\") " pod="openshift-marketplace/redhat-operators-6xmxg" Feb 19 16:49:56 crc kubenswrapper[4810]: I0219 16:49:56.328575 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e82d7377-faf1-409d-bc2e-581cbdcd1f6d-utilities\") pod \"redhat-operators-6xmxg\" (UID: \"e82d7377-faf1-409d-bc2e-581cbdcd1f6d\") " pod="openshift-marketplace/redhat-operators-6xmxg" Feb 19 16:49:56 crc kubenswrapper[4810]: I0219 16:49:56.431403 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e82d7377-faf1-409d-bc2e-581cbdcd1f6d-utilities\") pod \"redhat-operators-6xmxg\" (UID: \"e82d7377-faf1-409d-bc2e-581cbdcd1f6d\") " pod="openshift-marketplace/redhat-operators-6xmxg" Feb 19 16:49:56 crc kubenswrapper[4810]: I0219 16:49:56.432024 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e82d7377-faf1-409d-bc2e-581cbdcd1f6d-utilities\") pod \"redhat-operators-6xmxg\" (UID: \"e82d7377-faf1-409d-bc2e-581cbdcd1f6d\") " pod="openshift-marketplace/redhat-operators-6xmxg" Feb 19 16:49:56 crc kubenswrapper[4810]: I0219 16:49:56.432222 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz44j\" (UniqueName: \"kubernetes.io/projected/e82d7377-faf1-409d-bc2e-581cbdcd1f6d-kube-api-access-lz44j\") pod \"redhat-operators-6xmxg\" (UID: \"e82d7377-faf1-409d-bc2e-581cbdcd1f6d\") " pod="openshift-marketplace/redhat-operators-6xmxg" Feb 19 16:49:56 crc kubenswrapper[4810]: I0219 16:49:56.432463 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/e82d7377-faf1-409d-bc2e-581cbdcd1f6d-catalog-content\") pod \"redhat-operators-6xmxg\" (UID: \"e82d7377-faf1-409d-bc2e-581cbdcd1f6d\") " pod="openshift-marketplace/redhat-operators-6xmxg" Feb 19 16:49:56 crc kubenswrapper[4810]: I0219 16:49:56.432846 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e82d7377-faf1-409d-bc2e-581cbdcd1f6d-catalog-content\") pod \"redhat-operators-6xmxg\" (UID: \"e82d7377-faf1-409d-bc2e-581cbdcd1f6d\") " pod="openshift-marketplace/redhat-operators-6xmxg" Feb 19 16:49:56 crc kubenswrapper[4810]: I0219 16:49:56.463987 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz44j\" (UniqueName: \"kubernetes.io/projected/e82d7377-faf1-409d-bc2e-581cbdcd1f6d-kube-api-access-lz44j\") pod \"redhat-operators-6xmxg\" (UID: \"e82d7377-faf1-409d-bc2e-581cbdcd1f6d\") " pod="openshift-marketplace/redhat-operators-6xmxg" Feb 19 16:49:56 crc kubenswrapper[4810]: I0219 16:49:56.618624 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6xmxg" Feb 19 16:49:57 crc kubenswrapper[4810]: I0219 16:49:57.198458 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6xmxg"] Feb 19 16:49:57 crc kubenswrapper[4810]: I0219 16:49:57.439362 4810 scope.go:117] "RemoveContainer" containerID="63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e" Feb 19 16:49:57 crc kubenswrapper[4810]: E0219 16:49:57.439953 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:49:57 crc kubenswrapper[4810]: I0219 16:49:57.929607 4810 generic.go:334] "Generic (PLEG): container finished" podID="e82d7377-faf1-409d-bc2e-581cbdcd1f6d" containerID="af6b1b41bd819502addcb6561dbf8e4c1088ea0e95e1b6e7bc74408aef9ce7ec" exitCode=0 Feb 19 16:49:57 crc kubenswrapper[4810]: I0219 16:49:57.929668 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6xmxg" event={"ID":"e82d7377-faf1-409d-bc2e-581cbdcd1f6d","Type":"ContainerDied","Data":"af6b1b41bd819502addcb6561dbf8e4c1088ea0e95e1b6e7bc74408aef9ce7ec"} Feb 19 16:49:57 crc kubenswrapper[4810]: I0219 16:49:57.929712 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6xmxg" event={"ID":"e82d7377-faf1-409d-bc2e-581cbdcd1f6d","Type":"ContainerStarted","Data":"e208e8dc67c8ea6085bc7423038bf7a34783c4f8f871e552d86a48225914d4f3"} Feb 19 16:49:59 crc kubenswrapper[4810]: I0219 16:49:59.958534 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6xmxg" event={"ID":"e82d7377-faf1-409d-bc2e-581cbdcd1f6d","Type":"ContainerStarted","Data":"583d135e5a359966cee557b1b2d0e1e575c864596a3a4658d92283edd29d9c07"} Feb 19 16:50:02 crc kubenswrapper[4810]: I0219 16:50:02.993896 4810 generic.go:334] "Generic (PLEG): container finished" podID="e82d7377-faf1-409d-bc2e-581cbdcd1f6d" containerID="583d135e5a359966cee557b1b2d0e1e575c864596a3a4658d92283edd29d9c07" exitCode=0 Feb 19 16:50:02 crc kubenswrapper[4810]: 
I0219 16:50:02.993979 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6xmxg" event={"ID":"e82d7377-faf1-409d-bc2e-581cbdcd1f6d","Type":"ContainerDied","Data":"583d135e5a359966cee557b1b2d0e1e575c864596a3a4658d92283edd29d9c07"} Feb 19 16:50:04 crc kubenswrapper[4810]: I0219 16:50:04.006441 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6xmxg" event={"ID":"e82d7377-faf1-409d-bc2e-581cbdcd1f6d","Type":"ContainerStarted","Data":"30f6fb85d62122e3a21c79cc5e4b78d9d36ebef448e6e43c507f3bfef300031e"} Feb 19 16:50:04 crc kubenswrapper[4810]: I0219 16:50:04.038042 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6xmxg" podStartSLOduration=2.27834756 podStartE2EDuration="8.038024961s" podCreationTimestamp="2026-02-19 16:49:56 +0000 UTC" firstStartedPulling="2026-02-19 16:49:57.932340337 +0000 UTC m=+6027.414370461" lastFinishedPulling="2026-02-19 16:50:03.692017708 +0000 UTC m=+6033.174047862" observedRunningTime="2026-02-19 16:50:04.032899034 +0000 UTC m=+6033.514929158" watchObservedRunningTime="2026-02-19 16:50:04.038024961 +0000 UTC m=+6033.520055085" Feb 19 16:50:06 crc kubenswrapper[4810]: I0219 16:50:06.618965 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6xmxg" Feb 19 16:50:06 crc kubenswrapper[4810]: I0219 16:50:06.619676 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6xmxg" Feb 19 16:50:07 crc kubenswrapper[4810]: I0219 16:50:07.694945 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6xmxg" podUID="e82d7377-faf1-409d-bc2e-581cbdcd1f6d" containerName="registry-server" probeResult="failure" output=< Feb 19 16:50:07 crc kubenswrapper[4810]: timeout: failed to connect service ":50051" within 1s Feb 19 16:50:07 crc kubenswrapper[4810]: > Feb 19 16:50:09 crc kubenswrapper[4810]: I0219 16:50:09.440118 4810 scope.go:117] "RemoveContainer" containerID="63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e" Feb 19 16:50:09 crc kubenswrapper[4810]: E0219 16:50:09.440456 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:50:16 crc kubenswrapper[4810]: I0219 16:50:16.713710 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6xmxg" Feb 19 16:50:16 crc kubenswrapper[4810]: I0219 16:50:16.801820 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6xmxg" Feb 19 16:50:16 crc kubenswrapper[4810]: I0219 16:50:16.976157 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6xmxg"] Feb 19 16:50:18 crc kubenswrapper[4810]: I0219 16:50:18.176391 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6xmxg" podUID="e82d7377-faf1-409d-bc2e-581cbdcd1f6d" containerName="registry-server" 
containerID="cri-o://30f6fb85d62122e3a21c79cc5e4b78d9d36ebef448e6e43c507f3bfef300031e" gracePeriod=2 Feb 19 16:50:18 crc kubenswrapper[4810]: I0219 16:50:18.703897 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6xmxg" Feb 19 16:50:18 crc kubenswrapper[4810]: I0219 16:50:18.768276 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e82d7377-faf1-409d-bc2e-581cbdcd1f6d-utilities\") pod \"e82d7377-faf1-409d-bc2e-581cbdcd1f6d\" (UID: \"e82d7377-faf1-409d-bc2e-581cbdcd1f6d\") " Feb 19 16:50:18 crc kubenswrapper[4810]: I0219 16:50:18.768920 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e82d7377-faf1-409d-bc2e-581cbdcd1f6d-catalog-content\") pod \"e82d7377-faf1-409d-bc2e-581cbdcd1f6d\" (UID: \"e82d7377-faf1-409d-bc2e-581cbdcd1f6d\") " Feb 19 16:50:18 crc kubenswrapper[4810]: I0219 16:50:18.769412 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz44j\" (UniqueName: \"kubernetes.io/projected/e82d7377-faf1-409d-bc2e-581cbdcd1f6d-kube-api-access-lz44j\") pod \"e82d7377-faf1-409d-bc2e-581cbdcd1f6d\" (UID: \"e82d7377-faf1-409d-bc2e-581cbdcd1f6d\") " Feb 19 16:50:18 crc kubenswrapper[4810]: I0219 16:50:18.769746 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e82d7377-faf1-409d-bc2e-581cbdcd1f6d-utilities" (OuterVolumeSpecName: "utilities") pod "e82d7377-faf1-409d-bc2e-581cbdcd1f6d" (UID: "e82d7377-faf1-409d-bc2e-581cbdcd1f6d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:50:18 crc kubenswrapper[4810]: I0219 16:50:18.771423 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e82d7377-faf1-409d-bc2e-581cbdcd1f6d-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 16:50:18 crc kubenswrapper[4810]: I0219 16:50:18.775863 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e82d7377-faf1-409d-bc2e-581cbdcd1f6d-kube-api-access-lz44j" (OuterVolumeSpecName: "kube-api-access-lz44j") pod "e82d7377-faf1-409d-bc2e-581cbdcd1f6d" (UID: "e82d7377-faf1-409d-bc2e-581cbdcd1f6d"). InnerVolumeSpecName "kube-api-access-lz44j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:50:18 crc kubenswrapper[4810]: I0219 16:50:18.873681 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz44j\" (UniqueName: \"kubernetes.io/projected/e82d7377-faf1-409d-bc2e-581cbdcd1f6d-kube-api-access-lz44j\") on node \"crc\" DevicePath \"\"" Feb 19 16:50:18 crc kubenswrapper[4810]: I0219 16:50:18.916655 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e82d7377-faf1-409d-bc2e-581cbdcd1f6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e82d7377-faf1-409d-bc2e-581cbdcd1f6d" (UID: "e82d7377-faf1-409d-bc2e-581cbdcd1f6d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:50:18 crc kubenswrapper[4810]: I0219 16:50:18.975408 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e82d7377-faf1-409d-bc2e-581cbdcd1f6d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 16:50:19 crc kubenswrapper[4810]: I0219 16:50:19.193768 4810 generic.go:334] "Generic (PLEG): container finished" podID="e82d7377-faf1-409d-bc2e-581cbdcd1f6d" containerID="30f6fb85d62122e3a21c79cc5e4b78d9d36ebef448e6e43c507f3bfef300031e" exitCode=0 Feb 19 16:50:19 crc kubenswrapper[4810]: I0219 16:50:19.193839 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6xmxg" event={"ID":"e82d7377-faf1-409d-bc2e-581cbdcd1f6d","Type":"ContainerDied","Data":"30f6fb85d62122e3a21c79cc5e4b78d9d36ebef448e6e43c507f3bfef300031e"} Feb 19 16:50:19 crc kubenswrapper[4810]: I0219 16:50:19.193883 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6xmxg" Feb 19 16:50:19 crc kubenswrapper[4810]: I0219 16:50:19.194320 4810 scope.go:117] "RemoveContainer" containerID="30f6fb85d62122e3a21c79cc5e4b78d9d36ebef448e6e43c507f3bfef300031e" Feb 19 16:50:19 crc kubenswrapper[4810]: I0219 16:50:19.194298 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6xmxg" event={"ID":"e82d7377-faf1-409d-bc2e-581cbdcd1f6d","Type":"ContainerDied","Data":"e208e8dc67c8ea6085bc7423038bf7a34783c4f8f871e552d86a48225914d4f3"} Feb 19 16:50:19 crc kubenswrapper[4810]: I0219 16:50:19.229467 4810 scope.go:117] "RemoveContainer" containerID="583d135e5a359966cee557b1b2d0e1e575c864596a3a4658d92283edd29d9c07" Feb 19 16:50:19 crc kubenswrapper[4810]: I0219 16:50:19.271805 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6xmxg"] Feb 19 16:50:19 crc kubenswrapper[4810]: I0219 16:50:19.278522 4810 scope.go:117] "RemoveContainer" containerID="af6b1b41bd819502addcb6561dbf8e4c1088ea0e95e1b6e7bc74408aef9ce7ec" Feb 19 16:50:19 crc kubenswrapper[4810]: I0219 16:50:19.289298 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6xmxg"] Feb 19 16:50:19 crc kubenswrapper[4810]: I0219 16:50:19.337046 4810 scope.go:117] "RemoveContainer" containerID="30f6fb85d62122e3a21c79cc5e4b78d9d36ebef448e6e43c507f3bfef300031e" Feb 19 16:50:19 crc kubenswrapper[4810]: E0219 16:50:19.337720 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30f6fb85d62122e3a21c79cc5e4b78d9d36ebef448e6e43c507f3bfef300031e\": container with ID starting with 30f6fb85d62122e3a21c79cc5e4b78d9d36ebef448e6e43c507f3bfef300031e not found: ID does not exist" containerID="30f6fb85d62122e3a21c79cc5e4b78d9d36ebef448e6e43c507f3bfef300031e" Feb 19 16:50:19 crc kubenswrapper[4810]: I0219 16:50:19.337781 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30f6fb85d62122e3a21c79cc5e4b78d9d36ebef448e6e43c507f3bfef300031e"} err="failed to get container status \"30f6fb85d62122e3a21c79cc5e4b78d9d36ebef448e6e43c507f3bfef300031e\": rpc error: code = NotFound desc = could not find container \"30f6fb85d62122e3a21c79cc5e4b78d9d36ebef448e6e43c507f3bfef300031e\": container with ID starting with 30f6fb85d62122e3a21c79cc5e4b78d9d36ebef448e6e43c507f3bfef300031e not found: ID does not exist" Feb 19 16:50:19 crc 
kubenswrapper[4810]: I0219 16:50:19.337815 4810 scope.go:117] "RemoveContainer" containerID="583d135e5a359966cee557b1b2d0e1e575c864596a3a4658d92283edd29d9c07" Feb 19 16:50:19 crc kubenswrapper[4810]: E0219 16:50:19.339979 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"583d135e5a359966cee557b1b2d0e1e575c864596a3a4658d92283edd29d9c07\": container with ID starting with 583d135e5a359966cee557b1b2d0e1e575c864596a3a4658d92283edd29d9c07 not found: ID does not exist" containerID="583d135e5a359966cee557b1b2d0e1e575c864596a3a4658d92283edd29d9c07" Feb 19 16:50:19 crc kubenswrapper[4810]: I0219 16:50:19.340020 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"583d135e5a359966cee557b1b2d0e1e575c864596a3a4658d92283edd29d9c07"} err="failed to get container status \"583d135e5a359966cee557b1b2d0e1e575c864596a3a4658d92283edd29d9c07\": rpc error: code = NotFound desc = could not find container \"583d135e5a359966cee557b1b2d0e1e575c864596a3a4658d92283edd29d9c07\": container with ID starting with 583d135e5a359966cee557b1b2d0e1e575c864596a3a4658d92283edd29d9c07 not found: ID does not exist" Feb 19 16:50:19 crc kubenswrapper[4810]: I0219 16:50:19.340069 4810 scope.go:117] "RemoveContainer" containerID="af6b1b41bd819502addcb6561dbf8e4c1088ea0e95e1b6e7bc74408aef9ce7ec" Feb 19 16:50:19 crc kubenswrapper[4810]: E0219 16:50:19.340701 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af6b1b41bd819502addcb6561dbf8e4c1088ea0e95e1b6e7bc74408aef9ce7ec\": container with ID starting with af6b1b41bd819502addcb6561dbf8e4c1088ea0e95e1b6e7bc74408aef9ce7ec not found: ID does not exist" containerID="af6b1b41bd819502addcb6561dbf8e4c1088ea0e95e1b6e7bc74408aef9ce7ec" Feb 19 16:50:19 crc kubenswrapper[4810]: I0219 16:50:19.340785 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af6b1b41bd819502addcb6561dbf8e4c1088ea0e95e1b6e7bc74408aef9ce7ec"} err="failed to get container status \"af6b1b41bd819502addcb6561dbf8e4c1088ea0e95e1b6e7bc74408aef9ce7ec\": rpc error: code = NotFound desc = could not find container \"af6b1b41bd819502addcb6561dbf8e4c1088ea0e95e1b6e7bc74408aef9ce7ec\": container with ID starting with af6b1b41bd819502addcb6561dbf8e4c1088ea0e95e1b6e7bc74408aef9ce7ec not found: ID does not exist" Feb 19 16:50:19 crc kubenswrapper[4810]: I0219 16:50:19.457249 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e82d7377-faf1-409d-bc2e-581cbdcd1f6d" path="/var/lib/kubelet/pods/e82d7377-faf1-409d-bc2e-581cbdcd1f6d/volumes" Feb 19 16:50:20 crc kubenswrapper[4810]: I0219 16:50:20.443921 4810 scope.go:117] "RemoveContainer" containerID="63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e" Feb 19 16:50:21 crc kubenswrapper[4810]: I0219 16:50:21.233390 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerStarted","Data":"30b913c55f9740186a8530347003d7b1c641faf95ecc5adefecfaffe54fb5ed2"} Feb 19 16:52:26 crc kubenswrapper[4810]: I0219 16:52:26.306192 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-944q7/must-gather-452lm"] Feb 19 16:52:26 crc kubenswrapper[4810]: E0219 16:52:26.307132 4810 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e82d7377-faf1-409d-bc2e-581cbdcd1f6d" containerName="extract-content" Feb 19 16:52:26 crc kubenswrapper[4810]: I0219 16:52:26.307147 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82d7377-faf1-409d-bc2e-581cbdcd1f6d" containerName="extract-content" Feb 19 16:52:26 crc kubenswrapper[4810]: E0219 16:52:26.307178 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e82d7377-faf1-409d-bc2e-581cbdcd1f6d" containerName="registry-server" Feb 19 16:52:26 crc kubenswrapper[4810]: I0219 16:52:26.307189 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82d7377-faf1-409d-bc2e-581cbdcd1f6d" containerName="registry-server" Feb 19 16:52:26 crc kubenswrapper[4810]: E0219 16:52:26.307204 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e82d7377-faf1-409d-bc2e-581cbdcd1f6d" containerName="extract-utilities" Feb 19 16:52:26 crc kubenswrapper[4810]: I0219 16:52:26.307212 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82d7377-faf1-409d-bc2e-581cbdcd1f6d" containerName="extract-utilities" Feb 19 16:52:26 crc kubenswrapper[4810]: I0219 16:52:26.307466 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e82d7377-faf1-409d-bc2e-581cbdcd1f6d" containerName="registry-server" Feb 19 16:52:26 crc kubenswrapper[4810]: I0219 16:52:26.308813 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-944q7/must-gather-452lm" Feb 19 16:52:26 crc kubenswrapper[4810]: I0219 16:52:26.329422 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-944q7"/"default-dockercfg-4gktv" Feb 19 16:52:26 crc kubenswrapper[4810]: I0219 16:52:26.329725 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-944q7"/"openshift-service-ca.crt" Feb 19 16:52:26 crc kubenswrapper[4810]: I0219 16:52:26.330088 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-944q7"/"kube-root-ca.crt" Feb 19 16:52:26 crc kubenswrapper[4810]: I0219 16:52:26.385744 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-944q7/must-gather-452lm"] Feb 19 16:52:26 crc kubenswrapper[4810]: I0219 16:52:26.410564 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6rmt\" (UniqueName: \"kubernetes.io/projected/0adbd447-568e-48c8-ab76-3d2f20e3f315-kube-api-access-f6rmt\") pod \"must-gather-452lm\" (UID: \"0adbd447-568e-48c8-ab76-3d2f20e3f315\") " pod="openshift-must-gather-944q7/must-gather-452lm" Feb 19 16:52:26 crc kubenswrapper[4810]: I0219 16:52:26.410703 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0adbd447-568e-48c8-ab76-3d2f20e3f315-must-gather-output\") pod \"must-gather-452lm\" (UID: \"0adbd447-568e-48c8-ab76-3d2f20e3f315\") " pod="openshift-must-gather-944q7/must-gather-452lm" Feb 19 16:52:26 crc kubenswrapper[4810]: I0219 16:52:26.512822 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0adbd447-568e-48c8-ab76-3d2f20e3f315-must-gather-output\") pod \"must-gather-452lm\" (UID: \"0adbd447-568e-48c8-ab76-3d2f20e3f315\") " pod="openshift-must-gather-944q7/must-gather-452lm" Feb 19 16:52:26 crc kubenswrapper[4810]: I0219 16:52:26.512966 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-f6rmt\" (UniqueName: \"kubernetes.io/projected/0adbd447-568e-48c8-ab76-3d2f20e3f315-kube-api-access-f6rmt\") pod \"must-gather-452lm\" (UID: \"0adbd447-568e-48c8-ab76-3d2f20e3f315\") " pod="openshift-must-gather-944q7/must-gather-452lm" Feb 19 16:52:26 crc kubenswrapper[4810]: I0219 16:52:26.513473 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0adbd447-568e-48c8-ab76-3d2f20e3f315-must-gather-output\") pod \"must-gather-452lm\" (UID: \"0adbd447-568e-48c8-ab76-3d2f20e3f315\") " pod="openshift-must-gather-944q7/must-gather-452lm" Feb 19 16:52:26 crc kubenswrapper[4810]: I0219 16:52:26.532182 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6rmt\" (UniqueName: \"kubernetes.io/projected/0adbd447-568e-48c8-ab76-3d2f20e3f315-kube-api-access-f6rmt\") pod \"must-gather-452lm\" (UID: \"0adbd447-568e-48c8-ab76-3d2f20e3f315\") " pod="openshift-must-gather-944q7/must-gather-452lm" Feb 19 16:52:26 crc kubenswrapper[4810]: I0219 16:52:26.674427 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-944q7/must-gather-452lm" Feb 19 16:52:27 crc kubenswrapper[4810]: I0219 16:52:27.191851 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-944q7/must-gather-452lm"] Feb 19 16:52:27 crc kubenswrapper[4810]: I0219 16:52:27.778126 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-944q7/must-gather-452lm" event={"ID":"0adbd447-568e-48c8-ab76-3d2f20e3f315","Type":"ContainerStarted","Data":"35e06e2cca1fb991fa940df76a9f88f0b3d758223a060db89405e9f9e28e0bdb"} Feb 19 16:52:27 crc kubenswrapper[4810]: I0219 16:52:27.778497 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-944q7/must-gather-452lm" event={"ID":"0adbd447-568e-48c8-ab76-3d2f20e3f315","Type":"ContainerStarted","Data":"07153056d19231ca3d7ab4d1dd2cd03f17f8e8578ea20e8a31486c974d0f0b3d"} Feb 19 16:52:27 crc kubenswrapper[4810]: I0219 16:52:27.778512 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-944q7/must-gather-452lm" event={"ID":"0adbd447-568e-48c8-ab76-3d2f20e3f315","Type":"ContainerStarted","Data":"cf36fcfbf554bf4a4c84825c8d6c3208469e142f948d23106d4b61d616d08799"} Feb 19 16:52:27 crc kubenswrapper[4810]: I0219 16:52:27.803256 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-944q7/must-gather-452lm" podStartSLOduration=1.80323674 podStartE2EDuration="1.80323674s" podCreationTimestamp="2026-02-19 16:52:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 16:52:27.795921209 +0000 UTC m=+6177.277951333" watchObservedRunningTime="2026-02-19 16:52:27.80323674 +0000 UTC m=+6177.285266854" Feb 19 16:52:31 crc kubenswrapper[4810]: I0219 16:52:31.227587 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-944q7/crc-debug-98k9d"] Feb 19 16:52:31 crc kubenswrapper[4810]: I0219 16:52:31.229540 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-944q7/crc-debug-98k9d" Feb 19 16:52:31 crc kubenswrapper[4810]: I0219 16:52:31.329556 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7e7afe8a-09e9-464c-b8ba-ed36963c58af-host\") pod \"crc-debug-98k9d\" (UID: \"7e7afe8a-09e9-464c-b8ba-ed36963c58af\") " pod="openshift-must-gather-944q7/crc-debug-98k9d" Feb 19 16:52:31 crc kubenswrapper[4810]: I0219 16:52:31.329646 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbj5v\" (UniqueName: \"kubernetes.io/projected/7e7afe8a-09e9-464c-b8ba-ed36963c58af-kube-api-access-fbj5v\") pod \"crc-debug-98k9d\" (UID: \"7e7afe8a-09e9-464c-b8ba-ed36963c58af\") " pod="openshift-must-gather-944q7/crc-debug-98k9d" Feb 19 16:52:31 crc kubenswrapper[4810]: I0219 16:52:31.431912 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7e7afe8a-09e9-464c-b8ba-ed36963c58af-host\") pod \"crc-debug-98k9d\" (UID: \"7e7afe8a-09e9-464c-b8ba-ed36963c58af\") " pod="openshift-must-gather-944q7/crc-debug-98k9d" Feb 19 16:52:31 crc kubenswrapper[4810]: I0219 16:52:31.432268 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbj5v\" (UniqueName: \"kubernetes.io/projected/7e7afe8a-09e9-464c-b8ba-ed36963c58af-kube-api-access-fbj5v\") pod \"crc-debug-98k9d\" (UID: \"7e7afe8a-09e9-464c-b8ba-ed36963c58af\") " pod="openshift-must-gather-944q7/crc-debug-98k9d" Feb 19 16:52:31 crc kubenswrapper[4810]: I0219 16:52:31.432103 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7e7afe8a-09e9-464c-b8ba-ed36963c58af-host\") pod \"crc-debug-98k9d\" (UID: \"7e7afe8a-09e9-464c-b8ba-ed36963c58af\") " pod="openshift-must-gather-944q7/crc-debug-98k9d" Feb 19 16:52:31 crc kubenswrapper[4810]: I0219 16:52:31.461106 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbj5v\" (UniqueName: \"kubernetes.io/projected/7e7afe8a-09e9-464c-b8ba-ed36963c58af-kube-api-access-fbj5v\") pod \"crc-debug-98k9d\" (UID: \"7e7afe8a-09e9-464c-b8ba-ed36963c58af\") " pod="openshift-must-gather-944q7/crc-debug-98k9d" Feb 19 16:52:31 crc kubenswrapper[4810]: I0219 16:52:31.559660 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-944q7/crc-debug-98k9d" Feb 19 16:52:31 crc kubenswrapper[4810]: I0219 16:52:31.828065 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-944q7/crc-debug-98k9d" event={"ID":"7e7afe8a-09e9-464c-b8ba-ed36963c58af","Type":"ContainerStarted","Data":"3fcdfa3b9cf34af832a3d571b41bf6ce1daea3be21eb9d89797e41fab2452dae"} Feb 19 16:52:32 crc kubenswrapper[4810]: I0219 16:52:32.839441 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-944q7/crc-debug-98k9d" event={"ID":"7e7afe8a-09e9-464c-b8ba-ed36963c58af","Type":"ContainerStarted","Data":"1827e095c322c4555855b6ea50a05730da42c19ada929bb9656b95a872f9917c"} Feb 19 16:52:32 crc kubenswrapper[4810]: I0219 16:52:32.853374 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-944q7/crc-debug-98k9d" podStartSLOduration=1.853352103 podStartE2EDuration="1.853352103s" podCreationTimestamp="2026-02-19 16:52:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 16:52:32.852092762 +0000 UTC m=+6182.334122876" watchObservedRunningTime="2026-02-19 16:52:32.853352103 +0000 UTC m=+6182.335382237" Feb 19 16:52:49 crc kubenswrapper[4810]: I0219 16:52:49.537293 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 16:52:49 crc kubenswrapper[4810]: I0219 16:52:49.537747 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 16:53:13 crc kubenswrapper[4810]: I0219 16:53:13.248081 4810 generic.go:334] "Generic (PLEG): container finished" podID="7e7afe8a-09e9-464c-b8ba-ed36963c58af" containerID="1827e095c322c4555855b6ea50a05730da42c19ada929bb9656b95a872f9917c" exitCode=0 Feb 19 16:53:13 crc kubenswrapper[4810]: I0219 16:53:13.248168 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-944q7/crc-debug-98k9d" event={"ID":"7e7afe8a-09e9-464c-b8ba-ed36963c58af","Type":"ContainerDied","Data":"1827e095c322c4555855b6ea50a05730da42c19ada929bb9656b95a872f9917c"} Feb 19 16:53:14 crc kubenswrapper[4810]: I0219 16:53:14.377896 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-944q7/crc-debug-98k9d" Feb 19 16:53:14 crc kubenswrapper[4810]: I0219 16:53:14.414394 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-944q7/crc-debug-98k9d"] Feb 19 16:53:14 crc kubenswrapper[4810]: I0219 16:53:14.424282 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-944q7/crc-debug-98k9d"] Feb 19 16:53:14 crc kubenswrapper[4810]: I0219 16:53:14.446552 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7e7afe8a-09e9-464c-b8ba-ed36963c58af-host\") pod \"7e7afe8a-09e9-464c-b8ba-ed36963c58af\" (UID: \"7e7afe8a-09e9-464c-b8ba-ed36963c58af\") " Feb 19 16:53:14 crc kubenswrapper[4810]: I0219 16:53:14.446605 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbj5v\" (UniqueName: \"kubernetes.io/projected/7e7afe8a-09e9-464c-b8ba-ed36963c58af-kube-api-access-fbj5v\") pod \"7e7afe8a-09e9-464c-b8ba-ed36963c58af\" (UID: \"7e7afe8a-09e9-464c-b8ba-ed36963c58af\") " Feb 19 16:53:14 crc kubenswrapper[4810]: I0219 16:53:14.447981 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e7afe8a-09e9-464c-b8ba-ed36963c58af-host" (OuterVolumeSpecName: "host") pod "7e7afe8a-09e9-464c-b8ba-ed36963c58af" (UID: "7e7afe8a-09e9-464c-b8ba-ed36963c58af"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 16:53:14 crc kubenswrapper[4810]: I0219 16:53:14.468573 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e7afe8a-09e9-464c-b8ba-ed36963c58af-kube-api-access-fbj5v" (OuterVolumeSpecName: "kube-api-access-fbj5v") pod "7e7afe8a-09e9-464c-b8ba-ed36963c58af" (UID: "7e7afe8a-09e9-464c-b8ba-ed36963c58af"). InnerVolumeSpecName "kube-api-access-fbj5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:53:14 crc kubenswrapper[4810]: I0219 16:53:14.549469 4810 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7e7afe8a-09e9-464c-b8ba-ed36963c58af-host\") on node \"crc\" DevicePath \"\"" Feb 19 16:53:14 crc kubenswrapper[4810]: I0219 16:53:14.549507 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbj5v\" (UniqueName: \"kubernetes.io/projected/7e7afe8a-09e9-464c-b8ba-ed36963c58af-kube-api-access-fbj5v\") on node \"crc\" DevicePath \"\"" Feb 19 16:53:15 crc kubenswrapper[4810]: I0219 16:53:15.270521 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fcdfa3b9cf34af832a3d571b41bf6ce1daea3be21eb9d89797e41fab2452dae" Feb 19 16:53:15 crc kubenswrapper[4810]: I0219 16:53:15.270887 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-944q7/crc-debug-98k9d" Feb 19 16:53:15 crc kubenswrapper[4810]: I0219 16:53:15.452845 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e7afe8a-09e9-464c-b8ba-ed36963c58af" path="/var/lib/kubelet/pods/7e7afe8a-09e9-464c-b8ba-ed36963c58af/volumes" Feb 19 16:53:15 crc kubenswrapper[4810]: I0219 16:53:15.624194 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-944q7/crc-debug-ck97z"] Feb 19 16:53:15 crc kubenswrapper[4810]: E0219 16:53:15.624623 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e7afe8a-09e9-464c-b8ba-ed36963c58af" containerName="container-00" Feb 19 16:53:15 crc kubenswrapper[4810]: I0219 16:53:15.624641 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e7afe8a-09e9-464c-b8ba-ed36963c58af" containerName="container-00" Feb 19 16:53:15 crc kubenswrapper[4810]: I0219 16:53:15.624845 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e7afe8a-09e9-464c-b8ba-ed36963c58af" containerName="container-00" Feb 19 16:53:15 crc kubenswrapper[4810]: I0219 16:53:15.625573 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-944q7/crc-debug-ck97z" Feb 19 16:53:15 crc kubenswrapper[4810]: I0219 16:53:15.793076 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0809137-426c-4833-a028-76093d3d92f5-host\") pod \"crc-debug-ck97z\" (UID: \"f0809137-426c-4833-a028-76093d3d92f5\") " pod="openshift-must-gather-944q7/crc-debug-ck97z" Feb 19 16:53:15 crc kubenswrapper[4810]: I0219 16:53:15.793640 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72khb\" (UniqueName: \"kubernetes.io/projected/f0809137-426c-4833-a028-76093d3d92f5-kube-api-access-72khb\") pod \"crc-debug-ck97z\" (UID: \"f0809137-426c-4833-a028-76093d3d92f5\") " pod="openshift-must-gather-944q7/crc-debug-ck97z" Feb 19 16:53:15 crc kubenswrapper[4810]: I0219 16:53:15.895587 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0809137-426c-4833-a028-76093d3d92f5-host\") pod \"crc-debug-ck97z\" (UID: \"f0809137-426c-4833-a028-76093d3d92f5\") " pod="openshift-must-gather-944q7/crc-debug-ck97z" Feb 19 16:53:15 crc kubenswrapper[4810]: I0219 16:53:15.895883 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72khb\" (UniqueName: \"kubernetes.io/projected/f0809137-426c-4833-a028-76093d3d92f5-kube-api-access-72khb\") pod \"crc-debug-ck97z\" (UID: \"f0809137-426c-4833-a028-76093d3d92f5\") " pod="openshift-must-gather-944q7/crc-debug-ck97z" Feb 19 16:53:15 crc kubenswrapper[4810]: I0219 16:53:15.896303 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0809137-426c-4833-a028-76093d3d92f5-host\") pod \"crc-debug-ck97z\" (UID: \"f0809137-426c-4833-a028-76093d3d92f5\") " pod="openshift-must-gather-944q7/crc-debug-ck97z" Feb 19 16:53:15 crc kubenswrapper[4810]: I0219 16:53:15.919930 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72khb\" (UniqueName: \"kubernetes.io/projected/f0809137-426c-4833-a028-76093d3d92f5-kube-api-access-72khb\") pod \"crc-debug-ck97z\" (UID: \"f0809137-426c-4833-a028-76093d3d92f5\") " 
pod="openshift-must-gather-944q7/crc-debug-ck97z" Feb 19 16:53:15 crc kubenswrapper[4810]: I0219 16:53:15.942951 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-944q7/crc-debug-ck97z" Feb 19 16:53:16 crc kubenswrapper[4810]: I0219 16:53:16.282645 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-944q7/crc-debug-ck97z" event={"ID":"f0809137-426c-4833-a028-76093d3d92f5","Type":"ContainerStarted","Data":"c35e5c9c4b5ddac624677f837531b6c053c31b4d14fb41f66bcb860eccf31d9d"} Feb 19 16:53:16 crc kubenswrapper[4810]: I0219 16:53:16.282968 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-944q7/crc-debug-ck97z" event={"ID":"f0809137-426c-4833-a028-76093d3d92f5","Type":"ContainerStarted","Data":"4956638e7759c91b229eb1bf7e713d3c122f23d048d6e53e36a4b7bdbf9b5761"} Feb 19 16:53:16 crc kubenswrapper[4810]: I0219 16:53:16.300836 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-944q7/crc-debug-ck97z" podStartSLOduration=1.300807834 podStartE2EDuration="1.300807834s" podCreationTimestamp="2026-02-19 16:53:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 16:53:16.293967185 +0000 UTC m=+6225.775997309" watchObservedRunningTime="2026-02-19 16:53:16.300807834 +0000 UTC m=+6225.782837958" Feb 19 16:53:17 crc kubenswrapper[4810]: I0219 16:53:17.299375 4810 generic.go:334] "Generic (PLEG): container finished" podID="f0809137-426c-4833-a028-76093d3d92f5" containerID="c35e5c9c4b5ddac624677f837531b6c053c31b4d14fb41f66bcb860eccf31d9d" exitCode=0 Feb 19 16:53:17 crc kubenswrapper[4810]: I0219 16:53:17.299815 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-944q7/crc-debug-ck97z" event={"ID":"f0809137-426c-4833-a028-76093d3d92f5","Type":"ContainerDied","Data":"c35e5c9c4b5ddac624677f837531b6c053c31b4d14fb41f66bcb860eccf31d9d"} Feb 19 16:53:18 crc kubenswrapper[4810]: I0219 16:53:18.419103 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-944q7/crc-debug-ck97z" Feb 19 16:53:18 crc kubenswrapper[4810]: I0219 16:53:18.452391 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-944q7/crc-debug-ck97z"] Feb 19 16:53:18 crc kubenswrapper[4810]: I0219 16:53:18.460491 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-944q7/crc-debug-ck97z"] Feb 19 16:53:18 crc kubenswrapper[4810]: I0219 16:53:18.563901 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0809137-426c-4833-a028-76093d3d92f5-host\") pod \"f0809137-426c-4833-a028-76093d3d92f5\" (UID: \"f0809137-426c-4833-a028-76093d3d92f5\") " Feb 19 16:53:18 crc kubenswrapper[4810]: I0219 16:53:18.564156 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72khb\" (UniqueName: \"kubernetes.io/projected/f0809137-426c-4833-a028-76093d3d92f5-kube-api-access-72khb\") pod \"f0809137-426c-4833-a028-76093d3d92f5\" (UID: \"f0809137-426c-4833-a028-76093d3d92f5\") " Feb 19 16:53:18 crc kubenswrapper[4810]: I0219 16:53:18.565290 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f0809137-426c-4833-a028-76093d3d92f5-host" (OuterVolumeSpecName: "host") pod "f0809137-426c-4833-a028-76093d3d92f5" (UID: "f0809137-426c-4833-a028-76093d3d92f5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 16:53:18 crc kubenswrapper[4810]: I0219 16:53:18.577542 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0809137-426c-4833-a028-76093d3d92f5-kube-api-access-72khb" (OuterVolumeSpecName: "kube-api-access-72khb") pod "f0809137-426c-4833-a028-76093d3d92f5" (UID: "f0809137-426c-4833-a028-76093d3d92f5"). InnerVolumeSpecName "kube-api-access-72khb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:53:18 crc kubenswrapper[4810]: I0219 16:53:18.666669 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72khb\" (UniqueName: \"kubernetes.io/projected/f0809137-426c-4833-a028-76093d3d92f5-kube-api-access-72khb\") on node \"crc\" DevicePath \"\"" Feb 19 16:53:18 crc kubenswrapper[4810]: I0219 16:53:18.666705 4810 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0809137-426c-4833-a028-76093d3d92f5-host\") on node \"crc\" DevicePath \"\"" Feb 19 16:53:19 crc kubenswrapper[4810]: I0219 16:53:19.318019 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4956638e7759c91b229eb1bf7e713d3c122f23d048d6e53e36a4b7bdbf9b5761" Feb 19 16:53:19 crc kubenswrapper[4810]: I0219 16:53:19.318069 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-944q7/crc-debug-ck97z" Feb 19 16:53:19 crc kubenswrapper[4810]: I0219 16:53:19.450501 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0809137-426c-4833-a028-76093d3d92f5" path="/var/lib/kubelet/pods/f0809137-426c-4833-a028-76093d3d92f5/volumes" Feb 19 16:53:19 crc kubenswrapper[4810]: I0219 16:53:19.618627 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 16:53:19 crc kubenswrapper[4810]: I0219 16:53:19.618674 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 16:53:19 crc kubenswrapper[4810]: I0219 16:53:19.799276 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-944q7/crc-debug-qjwzn"] Feb 19 16:53:19 crc kubenswrapper[4810]: E0219 16:53:19.799667 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0809137-426c-4833-a028-76093d3d92f5" containerName="container-00" Feb 19 16:53:19 crc kubenswrapper[4810]: I0219 16:53:19.799685 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0809137-426c-4833-a028-76093d3d92f5" containerName="container-00" Feb 19 16:53:19 crc kubenswrapper[4810]: I0219 16:53:19.799929 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0809137-426c-4833-a028-76093d3d92f5" containerName="container-00" Feb 19 16:53:19 crc kubenswrapper[4810]: I0219 16:53:19.800563 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-944q7/crc-debug-qjwzn" Feb 19 16:53:19 crc kubenswrapper[4810]: I0219 16:53:19.924187 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkh4m\" (UniqueName: \"kubernetes.io/projected/3aae9737-d017-4a11-8323-cd0354ba09aa-kube-api-access-dkh4m\") pod \"crc-debug-qjwzn\" (UID: \"3aae9737-d017-4a11-8323-cd0354ba09aa\") " pod="openshift-must-gather-944q7/crc-debug-qjwzn" Feb 19 16:53:19 crc kubenswrapper[4810]: I0219 16:53:19.924262 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3aae9737-d017-4a11-8323-cd0354ba09aa-host\") pod \"crc-debug-qjwzn\" (UID: \"3aae9737-d017-4a11-8323-cd0354ba09aa\") " pod="openshift-must-gather-944q7/crc-debug-qjwzn" Feb 19 16:53:20 crc kubenswrapper[4810]: I0219 16:53:20.026387 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkh4m\" (UniqueName: \"kubernetes.io/projected/3aae9737-d017-4a11-8323-cd0354ba09aa-kube-api-access-dkh4m\") pod \"crc-debug-qjwzn\" (UID: \"3aae9737-d017-4a11-8323-cd0354ba09aa\") " pod="openshift-must-gather-944q7/crc-debug-qjwzn" Feb 19 16:53:20 crc kubenswrapper[4810]: I0219 16:53:20.026460 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3aae9737-d017-4a11-8323-cd0354ba09aa-host\") pod \"crc-debug-qjwzn\" (UID: \"3aae9737-d017-4a11-8323-cd0354ba09aa\") " pod="openshift-must-gather-944q7/crc-debug-qjwzn" Feb 19 16:53:20 crc kubenswrapper[4810]: I0219 16:53:20.026624 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3aae9737-d017-4a11-8323-cd0354ba09aa-host\") pod \"crc-debug-qjwzn\" (UID: \"3aae9737-d017-4a11-8323-cd0354ba09aa\") " pod="openshift-must-gather-944q7/crc-debug-qjwzn" Feb 19 16:53:20 crc kubenswrapper[4810]: I0219 16:53:20.070238 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkh4m\" (UniqueName: \"kubernetes.io/projected/3aae9737-d017-4a11-8323-cd0354ba09aa-kube-api-access-dkh4m\") pod \"crc-debug-qjwzn\" (UID: \"3aae9737-d017-4a11-8323-cd0354ba09aa\") " pod="openshift-must-gather-944q7/crc-debug-qjwzn" Feb 19 16:53:20 crc kubenswrapper[4810]: I0219 16:53:20.117565 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-944q7/crc-debug-qjwzn" Feb 19 16:53:20 crc kubenswrapper[4810]: W0219 16:53:20.144966 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3aae9737_d017_4a11_8323_cd0354ba09aa.slice/crio-9656164cf92bf8b01c9f1e2af11c6643c46568e292fcd76132e42a9d4a87139c WatchSource:0}: Error finding container 9656164cf92bf8b01c9f1e2af11c6643c46568e292fcd76132e42a9d4a87139c: Status 404 returned error can't find the container with id 9656164cf92bf8b01c9f1e2af11c6643c46568e292fcd76132e42a9d4a87139c Feb 19 16:53:20 crc kubenswrapper[4810]: I0219 16:53:20.335069 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-944q7/crc-debug-qjwzn" event={"ID":"3aae9737-d017-4a11-8323-cd0354ba09aa","Type":"ContainerStarted","Data":"9656164cf92bf8b01c9f1e2af11c6643c46568e292fcd76132e42a9d4a87139c"} Feb 19 16:53:21 crc kubenswrapper[4810]: I0219 16:53:21.345785 4810 generic.go:334] "Generic (PLEG): container finished" podID="3aae9737-d017-4a11-8323-cd0354ba09aa" containerID="e99e76486ae25fbcf4eaf9167a8e92681978470106cc7b5ff79341a65e62afbd" exitCode=0 Feb 19 16:53:21 crc kubenswrapper[4810]: I0219 16:53:21.345856 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-944q7/crc-debug-qjwzn" event={"ID":"3aae9737-d017-4a11-8323-cd0354ba09aa","Type":"ContainerDied","Data":"e99e76486ae25fbcf4eaf9167a8e92681978470106cc7b5ff79341a65e62afbd"} Feb 19 16:53:21 crc kubenswrapper[4810]: I0219 16:53:21.394173 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-944q7/crc-debug-qjwzn"] Feb 19 16:53:21 crc kubenswrapper[4810]: I0219 16:53:21.404286 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-944q7/crc-debug-qjwzn"] Feb 19 16:53:22 crc kubenswrapper[4810]: I0219 16:53:22.477257 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-944q7/crc-debug-qjwzn" Feb 19 16:53:22 crc kubenswrapper[4810]: I0219 16:53:22.583026 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkh4m\" (UniqueName: \"kubernetes.io/projected/3aae9737-d017-4a11-8323-cd0354ba09aa-kube-api-access-dkh4m\") pod \"3aae9737-d017-4a11-8323-cd0354ba09aa\" (UID: \"3aae9737-d017-4a11-8323-cd0354ba09aa\") " Feb 19 16:53:22 crc kubenswrapper[4810]: I0219 16:53:22.583222 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3aae9737-d017-4a11-8323-cd0354ba09aa-host\") pod \"3aae9737-d017-4a11-8323-cd0354ba09aa\" (UID: \"3aae9737-d017-4a11-8323-cd0354ba09aa\") " Feb 19 16:53:22 crc kubenswrapper[4810]: I0219 16:53:22.583591 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3aae9737-d017-4a11-8323-cd0354ba09aa-host" (OuterVolumeSpecName: "host") pod "3aae9737-d017-4a11-8323-cd0354ba09aa" (UID: "3aae9737-d017-4a11-8323-cd0354ba09aa"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 16:53:22 crc kubenswrapper[4810]: I0219 16:53:22.589569 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aae9737-d017-4a11-8323-cd0354ba09aa-kube-api-access-dkh4m" (OuterVolumeSpecName: "kube-api-access-dkh4m") pod "3aae9737-d017-4a11-8323-cd0354ba09aa" (UID: "3aae9737-d017-4a11-8323-cd0354ba09aa"). 
InnerVolumeSpecName "kube-api-access-dkh4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:53:22 crc kubenswrapper[4810]: I0219 16:53:22.684746 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkh4m\" (UniqueName: \"kubernetes.io/projected/3aae9737-d017-4a11-8323-cd0354ba09aa-kube-api-access-dkh4m\") on node \"crc\" DevicePath \"\"" Feb 19 16:53:22 crc kubenswrapper[4810]: I0219 16:53:22.684788 4810 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3aae9737-d017-4a11-8323-cd0354ba09aa-host\") on node \"crc\" DevicePath \"\"" Feb 19 16:53:23 crc kubenswrapper[4810]: I0219 16:53:23.369209 4810 scope.go:117] "RemoveContainer" containerID="e99e76486ae25fbcf4eaf9167a8e92681978470106cc7b5ff79341a65e62afbd" Feb 19 16:53:23 crc kubenswrapper[4810]: I0219 16:53:23.369260 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-944q7/crc-debug-qjwzn" Feb 19 16:53:23 crc kubenswrapper[4810]: I0219 16:53:23.455683 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aae9737-d017-4a11-8323-cd0354ba09aa" path="/var/lib/kubelet/pods/3aae9737-d017-4a11-8323-cd0354ba09aa/volumes" Feb 19 16:53:49 crc kubenswrapper[4810]: I0219 16:53:49.537839 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 16:53:49 crc kubenswrapper[4810]: I0219 16:53:49.538391 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 16:53:49 crc kubenswrapper[4810]: I0219 16:53:49.538436 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t499d" Feb 19 16:53:49 crc kubenswrapper[4810]: I0219 16:53:49.539216 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"30b913c55f9740186a8530347003d7b1c641faf95ecc5adefecfaffe54fb5ed2"} pod="openshift-machine-config-operator/machine-config-daemon-t499d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 16:53:49 crc kubenswrapper[4810]: I0219 16:53:49.539275 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" containerID="cri-o://30b913c55f9740186a8530347003d7b1c641faf95ecc5adefecfaffe54fb5ed2" gracePeriod=600 Feb 19 16:53:49 crc kubenswrapper[4810]: I0219 16:53:49.715226 4810 generic.go:334] "Generic (PLEG): container finished" podID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerID="30b913c55f9740186a8530347003d7b1c641faf95ecc5adefecfaffe54fb5ed2" exitCode=0 Feb 19 16:53:49 crc kubenswrapper[4810]: I0219 16:53:49.715502 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" 
event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerDied","Data":"30b913c55f9740186a8530347003d7b1c641faf95ecc5adefecfaffe54fb5ed2"} Feb 19 16:53:49 crc kubenswrapper[4810]: I0219 16:53:49.715535 4810 scope.go:117] "RemoveContainer" containerID="63bd1770c325dfdb5ccd64220060f108b50a35fa6325f46acd3bb9d2e6c06e3e" Feb 19 16:53:50 crc kubenswrapper[4810]: I0219 16:53:50.725681 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerStarted","Data":"a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b"} Feb 19 16:54:09 crc kubenswrapper[4810]: I0219 16:54:09.270289 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6b886df68b-htd57_8d391303-b5ee-4f63-8035-12f123f35e65/barbican-api/0.log" Feb 19 16:54:09 crc kubenswrapper[4810]: I0219 16:54:09.366996 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6b886df68b-htd57_8d391303-b5ee-4f63-8035-12f123f35e65/barbican-api-log/0.log" Feb 19 16:54:09 crc kubenswrapper[4810]: I0219 16:54:09.490570 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-75f99f68b4-d7hj4_c008ffcd-bb96-47dd-a311-fdc58f6d8918/barbican-keystone-listener/0.log" Feb 19 16:54:09 crc kubenswrapper[4810]: I0219 16:54:09.569666 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-75f99f68b4-d7hj4_c008ffcd-bb96-47dd-a311-fdc58f6d8918/barbican-keystone-listener-log/0.log" Feb 19 16:54:09 crc kubenswrapper[4810]: I0219 16:54:09.737409 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-58f8775989-n9rgr_f277c31b-ff97-4f3b-aec3-c5cfe9293d60/barbican-worker/0.log" Feb 19 16:54:09 crc kubenswrapper[4810]: I0219 16:54:09.753131 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-58f8775989-n9rgr_f277c31b-ff97-4f3b-aec3-c5cfe9293d60/barbican-worker-log/0.log" Feb 19 16:54:09 crc kubenswrapper[4810]: I0219 16:54:09.985158 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-sdzkx_c4a9ca21-e1c7-490d-8078-14407b530301/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:54:10 crc kubenswrapper[4810]: I0219 16:54:10.045724 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7d91a4d-5b61-404e-a58b-cb426722f883/ceilometer-notification-agent/0.log" Feb 19 16:54:10 crc kubenswrapper[4810]: I0219 16:54:10.131860 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7d91a4d-5b61-404e-a58b-cb426722f883/ceilometer-central-agent/0.log" Feb 19 16:54:10 crc kubenswrapper[4810]: I0219 16:54:10.254240 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7d91a4d-5b61-404e-a58b-cb426722f883/sg-core/0.log" Feb 19 16:54:10 crc kubenswrapper[4810]: I0219 16:54:10.256626 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7d91a4d-5b61-404e-a58b-cb426722f883/proxy-httpd/0.log" Feb 19 16:54:10 crc kubenswrapper[4810]: I0219 16:54:10.504127 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_1723b820-73ac-49f3-8716-283bf2c05925/cinder-api-log/0.log" Feb 19 16:54:10 crc kubenswrapper[4810]: I0219 16:54:10.776766 4810 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-backup-0_f66b86b2-b164-4380-8a89-bb0cf5f833ef/probe/0.log" Feb 19 16:54:10 crc kubenswrapper[4810]: I0219 16:54:10.874628 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_f66b86b2-b164-4380-8a89-bb0cf5f833ef/cinder-backup/0.log" Feb 19 16:54:10 crc kubenswrapper[4810]: I0219 16:54:10.939724 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_1723b820-73ac-49f3-8716-283bf2c05925/cinder-api/0.log" Feb 19 16:54:11 crc kubenswrapper[4810]: I0219 16:54:11.061532 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_48d5e3e9-853c-4988-8746-a6f74e1fe209/cinder-scheduler/0.log" Feb 19 16:54:11 crc kubenswrapper[4810]: I0219 16:54:11.135939 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_48d5e3e9-853c-4988-8746-a6f74e1fe209/probe/0.log" Feb 19 16:54:11 crc kubenswrapper[4810]: I0219 16:54:11.266886 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_20a46eb8-508d-45be-bf13-31aed23d1582/cinder-volume/0.log" Feb 19 16:54:11 crc kubenswrapper[4810]: I0219 16:54:11.533701 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_20a46eb8-508d-45be-bf13-31aed23d1582/probe/0.log" Feb 19 16:54:11 crc kubenswrapper[4810]: I0219 16:54:11.721916 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_74a12495-8d82-4296-9328-430af6d923b2/probe/0.log" Feb 19 16:54:11 crc kubenswrapper[4810]: I0219 16:54:11.767628 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_74a12495-8d82-4296-9328-430af6d923b2/cinder-volume/0.log" Feb 19 16:54:11 crc kubenswrapper[4810]: I0219 16:54:11.834286 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-6q498_2cff3a3e-0543-4fec-8f5b-5421be276386/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:54:11 crc kubenswrapper[4810]: I0219 16:54:11.987187 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-cm25d_7e1f4472-242a-40a0-a574-9c3119fdb705/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:54:12 crc kubenswrapper[4810]: I0219 16:54:12.150137 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-685d6df875-6hghq_7c074feb-2f7c-4f84-9ea8-5a9062e6b10a/init/0.log" Feb 19 16:54:12 crc kubenswrapper[4810]: I0219 16:54:12.279786 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-685d6df875-6hghq_7c074feb-2f7c-4f84-9ea8-5a9062e6b10a/init/0.log" Feb 19 16:54:12 crc kubenswrapper[4810]: I0219 16:54:12.365502 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-flmfl_e6255c5c-26d4-421f-9156-1bdd2f5adcc6/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:54:12 crc kubenswrapper[4810]: I0219 16:54:12.490968 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-685d6df875-6hghq_7c074feb-2f7c-4f84-9ea8-5a9062e6b10a/dnsmasq-dns/0.log" Feb 19 16:54:12 crc kubenswrapper[4810]: I0219 16:54:12.652372 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_41a4af93-6f80-4097-a964-2e3f3055fd3b/glance-httpd/0.log" Feb 19 16:54:12 crc 
kubenswrapper[4810]: I0219 16:54:12.676303 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_41a4af93-6f80-4097-a964-2e3f3055fd3b/glance-log/0.log" Feb 19 16:54:12 crc kubenswrapper[4810]: I0219 16:54:12.826920 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4/glance-httpd/0.log" Feb 19 16:54:12 crc kubenswrapper[4810]: I0219 16:54:12.843835 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6cc215f9-4ba7-4f38-aea0-bdb68abb2ad4/glance-log/0.log" Feb 19 16:54:13 crc kubenswrapper[4810]: I0219 16:54:13.027348 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5f56498b8d-9gwmf_737d6629-747f-4d16-a545-d0070c20fe5d/horizon/0.log" Feb 19 16:54:13 crc kubenswrapper[4810]: I0219 16:54:13.293774 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-n7jh2_31bd8fe5-f0b6-4463-a545-bdeb0c33b182/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:54:13 crc kubenswrapper[4810]: I0219 16:54:13.380519 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-s8kk5_12c3c35e-69ec-4150-8f1e-7eeb6cc01d9b/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:54:13 crc kubenswrapper[4810]: I0219 16:54:13.648694 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29525281-26qqv_8984eff3-6c82-4e2f-8bd6-1e820a450874/keystone-cron/0.log" Feb 19 16:54:13 crc kubenswrapper[4810]: I0219 16:54:13.661580 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5f56498b8d-9gwmf_737d6629-747f-4d16-a545-d0070c20fe5d/horizon-log/0.log" Feb 19 16:54:13 crc kubenswrapper[4810]: I0219 16:54:13.873752 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_9358dbee-2e5b-432d-98e0-6945d2e0d44b/kube-state-metrics/0.log" Feb 19 16:54:13 crc kubenswrapper[4810]: I0219 16:54:13.999719 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-mxd44_b0d687e9-21b0-4abe-b7ec-4fb050926f6c/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:54:14 crc kubenswrapper[4810]: I0219 16:54:14.135255 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6cd8bf58f4-ktsjk_95165d88-ea72-4785-8c1a-eea4d54466fb/keystone-api/0.log" Feb 19 16:54:14 crc kubenswrapper[4810]: I0219 16:54:14.496040 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6dfcf65577-bd5w2_6528bdfd-3389-4776-826e-164fc5117682/neutron-httpd/0.log" Feb 19 16:54:14 crc kubenswrapper[4810]: I0219 16:54:14.504787 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-wpxhs_6650a3db-fdc1-4342-b8a8-cb91376e75c5/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:54:14 crc kubenswrapper[4810]: I0219 16:54:14.595150 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6dfcf65577-bd5w2_6528bdfd-3389-4776-826e-164fc5117682/neutron-api/0.log" Feb 19 16:54:14 crc kubenswrapper[4810]: I0219 16:54:14.676167 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c/setup-container/0.log" Feb 19 
16:54:14 crc kubenswrapper[4810]: I0219 16:54:14.948683 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c/setup-container/0.log" Feb 19 16:54:15 crc kubenswrapper[4810]: I0219 16:54:15.189362 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_4ad875c7-9e7b-4cc8-b6f7-e6fbeb95573c/rabbitmq/0.log" Feb 19 16:54:15 crc kubenswrapper[4810]: I0219 16:54:15.840578 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_65e6588c-3b7f-4719-beb6-90229629820f/nova-cell0-conductor-conductor/0.log" Feb 19 16:54:16 crc kubenswrapper[4810]: I0219 16:54:16.066621 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_f93aa728-7924-4a75-ad48-cc174764cf3e/nova-cell1-conductor-conductor/0.log" Feb 19 16:54:16 crc kubenswrapper[4810]: I0219 16:54:16.597681 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_5a7915d4-6c3f-4bc7-b21d-7d51b675640f/nova-cell1-novncproxy-novncproxy/0.log" Feb 19 16:54:16 crc kubenswrapper[4810]: I0219 16:54:16.705533 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-nv8wh_cc5014f8-e5aa-47ad-8787-c187b0f7f0e1/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:54:16 crc kubenswrapper[4810]: I0219 16:54:16.799630 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6397af05-d030-46c2-8a0f-a90beb9b2502/nova-api-log/0.log" Feb 19 16:54:16 crc kubenswrapper[4810]: I0219 16:54:16.932868 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6397af05-d030-46c2-8a0f-a90beb9b2502/nova-api-api/0.log" Feb 19 16:54:17 crc kubenswrapper[4810]: I0219 16:54:17.052518 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f36ad344-e946-4221-892d-3ffe8fbdd59b/nova-metadata-log/0.log" Feb 19 16:54:17 crc kubenswrapper[4810]: I0219 16:54:17.306783 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_30d11a24-9722-4e7a-9be5-f2bd00128167/mysql-bootstrap/0.log" Feb 19 16:54:17 crc kubenswrapper[4810]: I0219 16:54:17.482615 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d70a0a1b-ed2d-46f1-aeb9-a335de9b06d4/nova-scheduler-scheduler/0.log" Feb 19 16:54:17 crc kubenswrapper[4810]: I0219 16:54:17.517065 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_30d11a24-9722-4e7a-9be5-f2bd00128167/mysql-bootstrap/0.log" Feb 19 16:54:17 crc kubenswrapper[4810]: I0219 16:54:17.549359 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_30d11a24-9722-4e7a-9be5-f2bd00128167/galera/0.log" Feb 19 16:54:17 crc kubenswrapper[4810]: I0219 16:54:17.734733 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c0ffb8ce-a356-4416-b96c-49db30ff1947/mysql-bootstrap/0.log" Feb 19 16:54:17 crc kubenswrapper[4810]: I0219 16:54:17.948451 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c0ffb8ce-a356-4416-b96c-49db30ff1947/mysql-bootstrap/0.log" Feb 19 16:54:17 crc kubenswrapper[4810]: I0219 16:54:17.963905 4810 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_c0ffb8ce-a356-4416-b96c-49db30ff1947/galera/0.log" Feb 19 16:54:18 crc kubenswrapper[4810]: I0219 16:54:18.204119 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_ca8eb29b-bb26-446f-8a22-5da13ff9d5fa/openstackclient/0.log" Feb 19 16:54:18 crc kubenswrapper[4810]: I0219 16:54:18.287503 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-tbt28_c03aad2b-8ca1-4310-8c11-3287fafcd66f/openstack-network-exporter/0.log" Feb 19 16:54:18 crc kubenswrapper[4810]: I0219 16:54:18.510825 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5t6ds_542da555-4f39-4dff-b378-5306135244db/ovsdb-server-init/0.log" Feb 19 16:54:18 crc kubenswrapper[4810]: I0219 16:54:18.696668 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5t6ds_542da555-4f39-4dff-b378-5306135244db/ovsdb-server-init/0.log" Feb 19 16:54:18 crc kubenswrapper[4810]: I0219 16:54:18.722719 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5t6ds_542da555-4f39-4dff-b378-5306135244db/ovsdb-server/0.log" Feb 19 16:54:18 crc kubenswrapper[4810]: I0219 16:54:18.917122 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-s5488_4a4fa57b-aa00-4866-b31e-df29f7f86480/ovn-controller/0.log" Feb 19 16:54:19 crc kubenswrapper[4810]: I0219 16:54:19.163728 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-5gjxx_4defb710-c07f-4e63-9baf-45f51085abdc/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:54:19 crc kubenswrapper[4810]: I0219 16:54:19.176946 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5t6ds_542da555-4f39-4dff-b378-5306135244db/ovs-vswitchd/0.log" Feb 19 16:54:19 crc kubenswrapper[4810]: I0219 16:54:19.402378 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_22facf67-088b-410b-986a-c9e09b3d8feb/openstack-network-exporter/0.log" Feb 19 16:54:19 crc kubenswrapper[4810]: I0219 16:54:19.402885 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_22facf67-088b-410b-986a-c9e09b3d8feb/ovn-northd/0.log" Feb 19 16:54:19 crc kubenswrapper[4810]: I0219 16:54:19.577986 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f36ad344-e946-4221-892d-3ffe8fbdd59b/nova-metadata-metadata/0.log" Feb 19 16:54:19 crc kubenswrapper[4810]: I0219 16:54:19.590095 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_bdffb5e6-13bb-4c08-ad3c-52d8ded85431/openstack-network-exporter/0.log" Feb 19 16:54:19 crc kubenswrapper[4810]: I0219 16:54:19.608074 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_bdffb5e6-13bb-4c08-ad3c-52d8ded85431/ovsdbserver-nb/0.log" Feb 19 16:54:19 crc kubenswrapper[4810]: I0219 16:54:19.881842 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5b985124-01b7-430c-b5ea-b9fd095e5f5e/openstack-network-exporter/0.log" Feb 19 16:54:19 crc kubenswrapper[4810]: I0219 16:54:19.889553 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5b985124-01b7-430c-b5ea-b9fd095e5f5e/ovsdbserver-sb/0.log" Feb 19 16:54:20 crc kubenswrapper[4810]: I0219 16:54:20.153582 4810 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_placement-7555d68ddd-xqj8c_565eac29-daec-4b40-bcb7-751696560c3a/placement-api/0.log" Feb 19 16:54:20 crc kubenswrapper[4810]: I0219 16:54:20.168427 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bf65af35-1e80-49a0-ada2-3bd027193193/init-config-reloader/0.log" Feb 19 16:54:20 crc kubenswrapper[4810]: I0219 16:54:20.340999 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bf65af35-1e80-49a0-ada2-3bd027193193/init-config-reloader/0.log" Feb 19 16:54:20 crc kubenswrapper[4810]: I0219 16:54:20.356167 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7555d68ddd-xqj8c_565eac29-daec-4b40-bcb7-751696560c3a/placement-log/0.log" Feb 19 16:54:20 crc kubenswrapper[4810]: I0219 16:54:20.364936 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bf65af35-1e80-49a0-ada2-3bd027193193/config-reloader/0.log" Feb 19 16:54:20 crc kubenswrapper[4810]: I0219 16:54:20.389131 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bf65af35-1e80-49a0-ada2-3bd027193193/prometheus/0.log" Feb 19 16:54:20 crc kubenswrapper[4810]: I0219 16:54:20.588515 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_03247cdb-4055-4d47-b433-848e363768ab/setup-container/0.log" Feb 19 16:54:20 crc kubenswrapper[4810]: I0219 16:54:20.616128 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bf65af35-1e80-49a0-ada2-3bd027193193/thanos-sidecar/0.log" Feb 19 16:54:20 crc kubenswrapper[4810]: I0219 16:54:20.772704 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_03247cdb-4055-4d47-b433-848e363768ab/setup-container/0.log" Feb 19 16:54:20 crc kubenswrapper[4810]: I0219 16:54:20.891242 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b86448c3-669a-4132-b8ab-4db06347fa10/setup-container/0.log" Feb 19 16:54:20 crc kubenswrapper[4810]: I0219 16:54:20.936683 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_03247cdb-4055-4d47-b433-848e363768ab/rabbitmq/0.log" Feb 19 16:54:21 crc kubenswrapper[4810]: I0219 16:54:21.052207 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b86448c3-669a-4132-b8ab-4db06347fa10/setup-container/0.log" Feb 19 16:54:21 crc kubenswrapper[4810]: I0219 16:54:21.061196 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b86448c3-669a-4132-b8ab-4db06347fa10/rabbitmq/0.log" Feb 19 16:54:21 crc kubenswrapper[4810]: I0219 16:54:21.176156 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-ps669_69d67433-38d6-4368-a621-254a97b0c619/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:54:21 crc kubenswrapper[4810]: I0219 16:54:21.382758 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-x748x_32dc9563-791b-421e-a807-41cc1e775b3a/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:54:21 crc kubenswrapper[4810]: I0219 16:54:21.475672 4810 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-484qb_8c05e8c7-82f6-4ef1-a576-3c84e70dc570/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:54:21 crc kubenswrapper[4810]: I0219 16:54:21.618220 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-rf56v_e77512a1-b460-4008-9e59-5b38f3e9f925/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:54:21 crc kubenswrapper[4810]: I0219 16:54:21.663762 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-gw579_e3132ed5-687d-4cd1-a539-35c4766a27c1/ssh-known-hosts-edpm-deployment/0.log" Feb 19 16:54:21 crc kubenswrapper[4810]: I0219 16:54:21.906954 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-78bc5d479f-k79xx_9190a865-226b-487c-b0f9-2573f50f0eab/proxy-server/0.log" Feb 19 16:54:22 crc kubenswrapper[4810]: I0219 16:54:22.055525 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-78bc5d479f-k79xx_9190a865-226b-487c-b0f9-2573f50f0eab/proxy-httpd/0.log" Feb 19 16:54:22 crc kubenswrapper[4810]: I0219 16:54:22.105285 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-hrdll_6c36f3e5-f790-4eda-9486-174f8624dad1/swift-ring-rebalance/0.log" Feb 19 16:54:22 crc kubenswrapper[4810]: I0219 16:54:22.328467 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/account-auditor/0.log" Feb 19 16:54:22 crc kubenswrapper[4810]: I0219 16:54:22.468592 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/account-reaper/0.log" Feb 19 16:54:22 crc kubenswrapper[4810]: I0219 16:54:22.486954 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/account-replicator/0.log" Feb 19 16:54:22 crc kubenswrapper[4810]: I0219 16:54:22.497389 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/container-auditor/0.log" Feb 19 16:54:22 crc kubenswrapper[4810]: I0219 16:54:22.588314 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/account-server/0.log" Feb 19 16:54:22 crc kubenswrapper[4810]: I0219 16:54:22.683552 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/container-server/0.log" Feb 19 16:54:22 crc kubenswrapper[4810]: I0219 16:54:22.687084 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/container-replicator/0.log" Feb 19 16:54:22 crc kubenswrapper[4810]: I0219 16:54:22.690993 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/container-updater/0.log" Feb 19 16:54:22 crc kubenswrapper[4810]: I0219 16:54:22.834105 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/object-auditor/0.log" Feb 19 16:54:22 crc kubenswrapper[4810]: I0219 16:54:22.909092 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/object-expirer/0.log" Feb 19 16:54:22 crc kubenswrapper[4810]: 
I0219 16:54:22.933700 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/object-replicator/0.log" Feb 19 16:54:22 crc kubenswrapper[4810]: I0219 16:54:22.964055 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/object-server/0.log" Feb 19 16:54:23 crc kubenswrapper[4810]: I0219 16:54:23.049364 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/object-updater/0.log" Feb 19 16:54:23 crc kubenswrapper[4810]: I0219 16:54:23.123031 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/swift-recon-cron/0.log" Feb 19 16:54:23 crc kubenswrapper[4810]: I0219 16:54:23.143584 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_37e2af25-5b30-4fb9-801e-f4a84d665540/rsync/0.log" Feb 19 16:54:23 crc kubenswrapper[4810]: I0219 16:54:23.299393 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-bfhfc_f7ca8c9a-db61-400f-9319-21590462f929/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:54:23 crc kubenswrapper[4810]: I0219 16:54:23.351268 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_a4c017a9-c049-4baa-acc0-e08a25437c90/tempest-tests-tempest-tests-runner/0.log" Feb 19 16:54:23 crc kubenswrapper[4810]: I0219 16:54:23.487160 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_d3094cb4-a81a-4f4f-b1d8-040c32bd3b2c/test-operator-logs-container/0.log" Feb 19 16:54:23 crc kubenswrapper[4810]: I0219 16:54:23.697726 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-zbkm7_412dc62a-d25e-4820-947b-582e310ddff1/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 16:54:24 crc kubenswrapper[4810]: I0219 16:54:24.422635 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_79f3ef20-3f3d-4fa2-8888-36d421303dfd/watcher-applier/0.log" Feb 19 16:54:24 crc kubenswrapper[4810]: I0219 16:54:24.971815 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_c11a7f60-4839-44aa-8615-98de657221f4/watcher-api-log/0.log" Feb 19 16:54:27 crc kubenswrapper[4810]: I0219 16:54:27.644402 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_fbbd48c8-49fb-4e51-9ba7-f7b37f681b3d/watcher-decision-engine/0.log" Feb 19 16:54:28 crc kubenswrapper[4810]: I0219 16:54:28.965597 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_c11a7f60-4839-44aa-8615-98de657221f4/watcher-api/0.log" Feb 19 16:54:31 crc kubenswrapper[4810]: I0219 16:54:31.080660 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_eb773d46-7b9f-4ca4-b33c-9b800b9eafd7/memcached/0.log" Feb 19 16:54:51 crc kubenswrapper[4810]: I0219 16:54:51.638692 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4_124e176a-b011-4a5c-8e7c-ca027d881aea/util/0.log" Feb 19 16:54:53 crc kubenswrapper[4810]: I0219 16:54:53.137401 4810 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4_124e176a-b011-4a5c-8e7c-ca027d881aea/util/0.log" Feb 19 16:54:53 crc kubenswrapper[4810]: I0219 16:54:53.140255 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4_124e176a-b011-4a5c-8e7c-ca027d881aea/pull/0.log" Feb 19 16:54:53 crc kubenswrapper[4810]: I0219 16:54:53.143103 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4_124e176a-b011-4a5c-8e7c-ca027d881aea/pull/0.log" Feb 19 16:54:53 crc kubenswrapper[4810]: I0219 16:54:53.286207 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4_124e176a-b011-4a5c-8e7c-ca027d881aea/util/0.log" Feb 19 16:54:53 crc kubenswrapper[4810]: I0219 16:54:53.325039 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4_124e176a-b011-4a5c-8e7c-ca027d881aea/pull/0.log" Feb 19 16:54:53 crc kubenswrapper[4810]: I0219 16:54:53.361978 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ae2a058ea32e5d02c736557a2a81dfccb5f5284d9451ce18a33356457cfrzl4_124e176a-b011-4a5c-8e7c-ca027d881aea/extract/0.log" Feb 19 16:54:53 crc kubenswrapper[4810]: I0219 16:54:53.769121 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-z5fb9_52bb990c-eff0-4673-be27-d55d433bef0d/manager/0.log" Feb 19 16:54:54 crc kubenswrapper[4810]: I0219 16:54:54.119713 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-qz68t_2106e7b5-bb83-464a-a43f-943f22b55078/manager/0.log" Feb 19 16:54:54 crc kubenswrapper[4810]: I0219 16:54:54.312183 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-ffm66_e2942952-ce19-4053-91da-05623c954167/manager/0.log" Feb 19 16:54:54 crc kubenswrapper[4810]: I0219 16:54:54.520031 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-gnmlp_f0ab3643-d267-4902-af1f-cbcbdd7e5e41/manager/0.log" Feb 19 16:54:54 crc kubenswrapper[4810]: I0219 16:54:54.956868 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-nnps5_602535d1-0abe-471e-8409-31319af7bd4b/manager/0.log" Feb 19 16:54:55 crc kubenswrapper[4810]: I0219 16:54:55.381038 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-2kkhl_4898d4eb-d474-44bc-9a38-e36f300d132f/manager/0.log" Feb 19 16:54:55 crc kubenswrapper[4810]: I0219 16:54:55.527604 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-mkfsc_2126b31b-0444-43e4-a250-837f37d476aa/manager/0.log" Feb 19 16:54:55 crc kubenswrapper[4810]: I0219 16:54:55.687969 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-vc7cw_942f40af-0498-4865-99da-bdcd068ef449/manager/0.log" Feb 19 16:54:55 crc kubenswrapper[4810]: I0219 16:54:55.983993 4810 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-6fqmd_e4a54646-39cf-4e42-9367-487ea4f7d8a4/manager/0.log" Feb 19 16:54:56 crc kubenswrapper[4810]: I0219 16:54:56.135946 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-jxmt5_1217b757-0f1c-4c4e-9abe-55875992915d/manager/0.log" Feb 19 16:54:56 crc kubenswrapper[4810]: I0219 16:54:56.222378 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-jjqv2_4aeb7cba-c1db-4dd6-92f7-dae7bd2e3f65/manager/0.log" Feb 19 16:54:56 crc kubenswrapper[4810]: I0219 16:54:56.472496 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-l67cq_fef4ad67-ccc8-4b32-bfb8-38dd6aa8e07e/manager/0.log" Feb 19 16:54:56 crc kubenswrapper[4810]: I0219 16:54:56.690081 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9c9hlpt_c677bdd0-7248-4b02-9ab4-035c034a976a/manager/0.log" Feb 19 16:54:57 crc kubenswrapper[4810]: I0219 16:54:57.073660 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-69cffcd4f6-27gzn_e84ef702-2f13-42e9-ae2b-6f1465b67ff3/operator/0.log" Feb 19 16:54:57 crc kubenswrapper[4810]: I0219 16:54:57.218675 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-gkft8_09f49ae7-b6fb-4ca5-9238-8bcf8d15ea95/registry-server/0.log" Feb 19 16:54:57 crc kubenswrapper[4810]: I0219 16:54:57.501874 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-5xnwd_e163eac0-ea1f-4002-9469-844240d7a44c/manager/0.log" Feb 19 16:54:57 crc kubenswrapper[4810]: I0219 16:54:57.717845 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-7tzvr_9f5779a5-4cda-40dc-831d-950f97eae317/manager/0.log" Feb 19 16:54:57 crc kubenswrapper[4810]: I0219 16:54:57.909613 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-k98c8_64ed590e-59b6-44c8-baee-324162d099b8/operator/0.log" Feb 19 16:54:58 crc kubenswrapper[4810]: I0219 16:54:58.126896 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-t44nb_aa5063d7-2358-4149-a3b9-ef2ce138faf4/manager/0.log" Feb 19 16:54:58 crc kubenswrapper[4810]: I0219 16:54:58.500825 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-pw9kt_69b7e96d-bce6-4653-998e-3bf5d159ae5a/manager/0.log" Feb 19 16:54:58 crc kubenswrapper[4810]: I0219 16:54:58.631608 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-px9zx_eaed166e-39b5-45ca-8a65-a22710d5fe37/manager/0.log" Feb 19 16:54:58 crc kubenswrapper[4810]: I0219 16:54:58.979478 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-798847869b-dlmvg_9c5af548-c722-4e6b-9309-1420838257e0/manager/0.log" Feb 19 16:54:59 crc kubenswrapper[4810]: I0219 16:54:59.061971 4810 log.go:25] "Finished parsing log file" 
Feb 19 16:54:59 crc kubenswrapper[4810]: I0219 16:54:59.348447 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-vcbwg_3cff7e0c-86c7-4029-aefe-a7f7e8e2d76d/manager/0.log"
Feb 19 16:55:05 crc kubenswrapper[4810]: I0219 16:55:05.088557 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-mzslt_91002269-9fe0-44d2-9dbd-9e4cf58274bf/manager/0.log"
Feb 19 16:55:19 crc kubenswrapper[4810]: I0219 16:55:19.765369 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-895xv_64dc0d58-11d4-456b-97ab-a4d3ec28225b/control-plane-machine-set-operator/0.log"
Feb 19 16:55:19 crc kubenswrapper[4810]: I0219 16:55:19.982743 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-l66cb_9a7776ca-1a56-4eca-9e44-ba1b7b15510f/kube-rbac-proxy/0.log"
Feb 19 16:55:20 crc kubenswrapper[4810]: I0219 16:55:20.006147 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-l66cb_9a7776ca-1a56-4eca-9e44-ba1b7b15510f/machine-api-operator/0.log"
Feb 19 16:55:35 crc kubenswrapper[4810]: I0219 16:55:35.625421 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-fwh4x_02206e32-6f49-407e-a02b-ce61e3daabf6/cert-manager-controller/0.log"
Feb 19 16:55:35 crc kubenswrapper[4810]: I0219 16:55:35.796394 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-x4csq_54e755f0-9c2f-4d47-9979-b7b92996bab6/cert-manager-cainjector/0.log"
Feb 19 16:55:35 crc kubenswrapper[4810]: I0219 16:55:35.841322 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-7lspv_1224bbe4-6d8e-410e-8990-3813efdd2003/cert-manager-webhook/0.log"
Feb 19 16:55:49 crc kubenswrapper[4810]: I0219 16:55:49.537449 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 16:55:49 crc kubenswrapper[4810]: I0219 16:55:49.537939 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 16:55:50 crc kubenswrapper[4810]: I0219 16:55:50.988229 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-kdwwx_35fc682a-0cc9-4922-a2f2-60da1ddb1eb9/nmstate-console-plugin/0.log"
Feb 19 16:55:51 crc kubenswrapper[4810]: I0219 16:55:51.150483 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-bhhvv_c0eb0835-6df5-4a21-b309-f178a032d027/nmstate-handler/0.log"
Feb 19 16:55:51 crc kubenswrapper[4810]: I0219 16:55:51.209205 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-4g952_ce589619-7c2f-43db-ae4f-fb43be7b07f4/kube-rbac-proxy/0.log"
Feb 19 16:55:51 crc kubenswrapper[4810]: I0219 16:55:51.290585 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-4g952_ce589619-7c2f-43db-ae4f-fb43be7b07f4/nmstate-metrics/0.log"
Feb 19 16:55:51 crc kubenswrapper[4810]: I0219 16:55:51.451123 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-p4hwg_f8300a06-7526-4da5-89a6-7fff8ff284c9/nmstate-operator/0.log"
Feb 19 16:55:51 crc kubenswrapper[4810]: I0219 16:55:51.539259 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-ckvvq_db05e782-a3d7-4cbe-be3f-f6226d894864/nmstate-webhook/0.log"
Feb 19 16:56:07 crc kubenswrapper[4810]: I0219 16:56:07.847551 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-6jkkz_1656f52d-7771-4bbb-9642-b296d16b791e/prometheus-operator/0.log"
Feb 19 16:56:08 crc kubenswrapper[4810]: I0219 16:56:08.032814 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6d6bf9b6-qpvdt_d5debcf2-9629-4bb2-9133-f4b81748ff7d/prometheus-operator-admission-webhook/0.log"
Feb 19 16:56:08 crc kubenswrapper[4810]: I0219 16:56:08.203343 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6d6bf9b6-vrfvr_408628c0-0b2c-48f9-b849-ee1b124499e1/prometheus-operator-admission-webhook/0.log"
Feb 19 16:56:08 crc kubenswrapper[4810]: I0219 16:56:08.233348 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-dk9c4_8bdf030e-92d8-45dc-ab6c-a7b241444677/operator/0.log"
Feb 19 16:56:08 crc kubenswrapper[4810]: I0219 16:56:08.408635 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-2fdxm_c5968625-c396-4ae0-9846-c2ceb6baf655/perses-operator/0.log"
Feb 19 16:56:19 crc kubenswrapper[4810]: I0219 16:56:19.538038 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 16:56:19 crc kubenswrapper[4810]: I0219 16:56:19.538534 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 16:56:24 crc kubenswrapper[4810]: I0219 16:56:24.193068 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-jngcz_781d467e-8522-43a3-a552-1ceebc40cddd/kube-rbac-proxy/0.log"
Feb 19 16:56:24 crc kubenswrapper[4810]: I0219 16:56:24.308253 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-jngcz_781d467e-8522-43a3-a552-1ceebc40cddd/controller/0.log"
Feb 19 16:56:24 crc kubenswrapper[4810]: I0219 16:56:24.413474 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/cp-frr-files/0.log"
Feb 19 16:56:24 crc kubenswrapper[4810]: I0219 16:56:24.641730 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/cp-frr-files/0.log"
Feb 19 16:56:24 crc kubenswrapper[4810]: I0219 16:56:24.642671 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/cp-reloader/0.log"
Feb 19 16:56:24 crc kubenswrapper[4810]: I0219 16:56:24.659993 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/cp-metrics/0.log"
Feb 19 16:56:24 crc kubenswrapper[4810]: I0219 16:56:24.684912 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/cp-reloader/0.log"
Feb 19 16:56:24 crc kubenswrapper[4810]: I0219 16:56:24.796179 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/cp-reloader/0.log"
Feb 19 16:56:24 crc kubenswrapper[4810]: I0219 16:56:24.833432 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/cp-frr-files/0.log"
Feb 19 16:56:24 crc kubenswrapper[4810]: I0219 16:56:24.859965 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/cp-metrics/0.log"
Feb 19 16:56:24 crc kubenswrapper[4810]: I0219 16:56:24.861733 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/cp-metrics/0.log"
Feb 19 16:56:25 crc kubenswrapper[4810]: I0219 16:56:25.155088 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/cp-reloader/0.log"
Feb 19 16:56:25 crc kubenswrapper[4810]: I0219 16:56:25.155198 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/cp-frr-files/0.log"
Feb 19 16:56:25 crc kubenswrapper[4810]: I0219 16:56:25.160639 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/cp-metrics/0.log"
Feb 19 16:56:25 crc kubenswrapper[4810]: I0219 16:56:25.203038 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/controller/0.log"
Feb 19 16:56:25 crc kubenswrapper[4810]: I0219 16:56:25.352908 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/kube-rbac-proxy/0.log"
Feb 19 16:56:25 crc kubenswrapper[4810]: I0219 16:56:25.362684 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/frr-metrics/0.log"
Feb 19 16:56:25 crc kubenswrapper[4810]: I0219 16:56:25.458481 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/kube-rbac-proxy-frr/0.log"
Feb 19 16:56:25 crc kubenswrapper[4810]: I0219 16:56:25.860114 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/reloader/0.log"
Feb 19 16:56:25 crc kubenswrapper[4810]: I0219 16:56:25.915274 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-cwj24_1ee9f8f3-05a8-4648-b48d-4975285346d7/frr-k8s-webhook-server/0.log"
Feb 19 16:56:26 crc kubenswrapper[4810]: I0219 16:56:26.118608 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-75f48c59dc-m5vm8_f26047c7-b8cc-4ce2-8a48-4b380ab225c0/manager/0.log"
Feb 19 16:56:26 crc kubenswrapper[4810]: I0219 16:56:26.166037 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-595d5f7545-vfb4c_3d62866f-b047-419d-8eb0-848b0df84e63/webhook-server/0.log"
Feb 19 16:56:26 crc kubenswrapper[4810]: I0219 16:56:26.317942 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-hllgd_c9d97974-67d2-42e5-89fe-b6db106a47c4/kube-rbac-proxy/0.log"
Feb 19 16:56:26 crc kubenswrapper[4810]: I0219 16:56:26.920589 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-hllgd_c9d97974-67d2-42e5-89fe-b6db106a47c4/speaker/0.log"
Feb 19 16:56:27 crc kubenswrapper[4810]: I0219 16:56:27.127203 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7rbxk_66c7e596-ffa3-4687-8c80-21acecbd8075/frr/0.log"
Feb 19 16:56:41 crc kubenswrapper[4810]: I0219 16:56:41.523524 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z_e54510d9-7d24-47bb-a55e-b50e7cff9fba/util/0.log"
Feb 19 16:56:41 crc kubenswrapper[4810]: I0219 16:56:41.658926 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z_e54510d9-7d24-47bb-a55e-b50e7cff9fba/util/0.log"
Feb 19 16:56:41 crc kubenswrapper[4810]: I0219 16:56:41.693460 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z_e54510d9-7d24-47bb-a55e-b50e7cff9fba/pull/0.log"
Feb 19 16:56:41 crc kubenswrapper[4810]: I0219 16:56:41.783222 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z_e54510d9-7d24-47bb-a55e-b50e7cff9fba/pull/0.log"
Feb 19 16:56:41 crc kubenswrapper[4810]: I0219 16:56:41.945239 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z_e54510d9-7d24-47bb-a55e-b50e7cff9fba/pull/0.log"
Feb 19 16:56:41 crc kubenswrapper[4810]: I0219 16:56:41.949069 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z_e54510d9-7d24-47bb-a55e-b50e7cff9fba/extract/0.log"
Feb 19 16:56:41 crc kubenswrapper[4810]: I0219 16:56:41.966142 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lgm4z_e54510d9-7d24-47bb-a55e-b50e7cff9fba/util/0.log"
Feb 19 16:56:42 crc kubenswrapper[4810]: I0219 16:56:42.124179 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb_a861f8a3-be34-4fc0-96cb-42502d0a3bab/util/0.log"
Feb 19 16:56:42 crc kubenswrapper[4810]: I0219 16:56:42.315747 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb_a861f8a3-be34-4fc0-96cb-42502d0a3bab/util/0.log"
Feb 19 16:56:42 crc kubenswrapper[4810]: I0219 16:56:42.360333 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb_a861f8a3-be34-4fc0-96cb-42502d0a3bab/pull/0.log"
Feb 19 16:56:42 crc kubenswrapper[4810]: I0219 16:56:42.375409 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb_a861f8a3-be34-4fc0-96cb-42502d0a3bab/pull/0.log"
Feb 19 16:56:42 crc kubenswrapper[4810]: I0219 16:56:42.542080 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb_a861f8a3-be34-4fc0-96cb-42502d0a3bab/util/0.log"
Feb 19 16:56:42 crc kubenswrapper[4810]: I0219 16:56:42.546852 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb_a861f8a3-be34-4fc0-96cb-42502d0a3bab/extract/0.log"
Feb 19 16:56:42 crc kubenswrapper[4810]: I0219 16:56:42.572346 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ccnrb_a861f8a3-be34-4fc0-96cb-42502d0a3bab/pull/0.log"
Feb 19 16:56:42 crc kubenswrapper[4810]: I0219 16:56:42.695037 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5n9gc_6dde1ea5-68be-4851-8816-3c7302dc2579/extract-utilities/0.log"
Feb 19 16:56:42 crc kubenswrapper[4810]: I0219 16:56:42.878220 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5n9gc_6dde1ea5-68be-4851-8816-3c7302dc2579/extract-utilities/0.log"
Feb 19 16:56:42 crc kubenswrapper[4810]: I0219 16:56:42.890688 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5n9gc_6dde1ea5-68be-4851-8816-3c7302dc2579/extract-content/0.log"
Feb 19 16:56:42 crc kubenswrapper[4810]: I0219 16:56:42.942278 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5n9gc_6dde1ea5-68be-4851-8816-3c7302dc2579/extract-content/0.log"
Feb 19 16:56:43 crc kubenswrapper[4810]: I0219 16:56:43.048206 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5n9gc_6dde1ea5-68be-4851-8816-3c7302dc2579/extract-utilities/0.log"
Feb 19 16:56:43 crc kubenswrapper[4810]: I0219 16:56:43.091408 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5n9gc_6dde1ea5-68be-4851-8816-3c7302dc2579/extract-content/0.log"
Feb 19 16:56:43 crc kubenswrapper[4810]: I0219 16:56:43.288097 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5n9gc_6dde1ea5-68be-4851-8816-3c7302dc2579/registry-server/0.log"
Feb 19 16:56:43 crc kubenswrapper[4810]: I0219 16:56:43.317444 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wtwq9_7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1/extract-utilities/0.log"
Feb 19 16:56:43 crc kubenswrapper[4810]: I0219 16:56:43.460077 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wtwq9_7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1/extract-content/0.log"
Feb 19 16:56:43 crc kubenswrapper[4810]: I0219 16:56:43.481429 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wtwq9_7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1/extract-utilities/0.log"
Feb 19 16:56:43 crc kubenswrapper[4810]: I0219 16:56:43.492873 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wtwq9_7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1/extract-content/0.log"
Feb 19 16:56:43 crc kubenswrapper[4810]: I0219 16:56:43.707706 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wtwq9_7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1/extract-utilities/0.log"
Feb 19 16:56:43 crc kubenswrapper[4810]: I0219 16:56:43.771018 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wtwq9_7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1/extract-content/0.log"
Feb 19 16:56:43 crc kubenswrapper[4810]: I0219 16:56:43.889689 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wtwq9_7f1a10dd-dddb-4bcc-ad11-42e2ab29a0f1/registry-server/0.log"
Feb 19 16:56:43 crc kubenswrapper[4810]: I0219 16:56:43.923348 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578_815191f4-9d3a-4003-a32f-de4f76c9c15f/util/0.log"
Feb 19 16:56:44 crc kubenswrapper[4810]: I0219 16:56:44.079471 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578_815191f4-9d3a-4003-a32f-de4f76c9c15f/util/0.log"
Feb 19 16:56:44 crc kubenswrapper[4810]: I0219 16:56:44.111123 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578_815191f4-9d3a-4003-a32f-de4f76c9c15f/pull/0.log"
Feb 19 16:56:44 crc kubenswrapper[4810]: I0219 16:56:44.111144 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578_815191f4-9d3a-4003-a32f-de4f76c9c15f/pull/0.log"
Feb 19 16:56:44 crc kubenswrapper[4810]: I0219 16:56:44.266180 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578_815191f4-9d3a-4003-a32f-de4f76c9c15f/pull/0.log"
Feb 19 16:56:44 crc kubenswrapper[4810]: I0219 16:56:44.291507 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578_815191f4-9d3a-4003-a32f-de4f76c9c15f/util/0.log"
Feb 19 16:56:44 crc kubenswrapper[4810]: I0219 16:56:44.314692 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawk578_815191f4-9d3a-4003-a32f-de4f76c9c15f/extract/0.log"
Feb 19 16:56:44 crc kubenswrapper[4810]: I0219 16:56:44.446423 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-sm9wk_41d27e40-a89e-4fd6-8106-824c5a257f25/marketplace-operator/0.log"
Feb 19 16:56:44 crc kubenswrapper[4810]: I0219 16:56:44.476216 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tgtg8_b3c2bc60-712d-4ef6-b461-ad683f51f2e4/extract-utilities/0.log"
Feb 19 16:56:44 crc kubenswrapper[4810]: I0219 16:56:44.700705 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tgtg8_b3c2bc60-712d-4ef6-b461-ad683f51f2e4/extract-utilities/0.log"
Feb 19 16:56:44 crc kubenswrapper[4810]: I0219 16:56:44.704789 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tgtg8_b3c2bc60-712d-4ef6-b461-ad683f51f2e4/extract-content/0.log"
Feb 19 16:56:44 crc kubenswrapper[4810]: I0219 16:56:44.705161 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tgtg8_b3c2bc60-712d-4ef6-b461-ad683f51f2e4/extract-content/0.log"
Feb 19 16:56:44 crc kubenswrapper[4810]: I0219 16:56:44.859149 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tgtg8_b3c2bc60-712d-4ef6-b461-ad683f51f2e4/extract-content/0.log"
Feb 19 16:56:44 crc kubenswrapper[4810]: I0219 16:56:44.861486 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tgtg8_b3c2bc60-712d-4ef6-b461-ad683f51f2e4/extract-utilities/0.log"
Feb 19 16:56:45 crc kubenswrapper[4810]: I0219 16:56:45.071242 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bwbpv_05b0324a-36c1-419b-8bdd-e41ad42a6a3f/extract-utilities/0.log"
Feb 19 16:56:45 crc kubenswrapper[4810]: I0219 16:56:45.071923 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tgtg8_b3c2bc60-712d-4ef6-b461-ad683f51f2e4/registry-server/0.log"
Feb 19 16:56:45 crc kubenswrapper[4810]: I0219 16:56:45.237005 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bwbpv_05b0324a-36c1-419b-8bdd-e41ad42a6a3f/extract-content/0.log"
Feb 19 16:56:45 crc kubenswrapper[4810]: I0219 16:56:45.265993 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bwbpv_05b0324a-36c1-419b-8bdd-e41ad42a6a3f/extract-content/0.log"
Feb 19 16:56:45 crc kubenswrapper[4810]: I0219 16:56:45.289678 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bwbpv_05b0324a-36c1-419b-8bdd-e41ad42a6a3f/extract-utilities/0.log"
Feb 19 16:56:45 crc kubenswrapper[4810]: I0219 16:56:45.533350 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bwbpv_05b0324a-36c1-419b-8bdd-e41ad42a6a3f/extract-utilities/0.log"
Feb 19 16:56:45 crc kubenswrapper[4810]: I0219 16:56:45.534451 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bwbpv_05b0324a-36c1-419b-8bdd-e41ad42a6a3f/extract-content/0.log"
Feb 19 16:56:46 crc kubenswrapper[4810]: I0219 16:56:46.048723 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bwbpv_05b0324a-36c1-419b-8bdd-e41ad42a6a3f/registry-server/0.log"
Feb 19 16:56:49 crc kubenswrapper[4810]: I0219 16:56:49.537319 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 16:56:49 crc kubenswrapper[4810]: I0219 16:56:49.537809 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 16:56:49 crc kubenswrapper[4810]: I0219 16:56:49.537881 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t499d"
Feb 19 16:56:49 crc kubenswrapper[4810]: I0219 16:56:49.539022 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b"} pod="openshift-machine-config-operator/machine-config-daemon-t499d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 16:56:49 crc kubenswrapper[4810]: I0219 16:56:49.539126 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" containerID="cri-o://a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b" gracePeriod=600
Feb 19 16:56:49 crc kubenswrapper[4810]: E0219 16:56:49.687986 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
Feb 19 16:56:50 crc kubenswrapper[4810]: I0219 16:56:50.171259 4810 generic.go:334] "Generic (PLEG): container finished" podID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerID="a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b" exitCode=0
Feb 19 16:56:50 crc kubenswrapper[4810]: I0219 16:56:50.171314 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerDied","Data":"a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b"}
Feb 19 16:56:50 crc kubenswrapper[4810]: I0219 16:56:50.171458 4810 scope.go:117] "RemoveContainer" containerID="30b913c55f9740186a8530347003d7b1c641faf95ecc5adefecfaffe54fb5ed2"
Feb 19 16:56:50 crc kubenswrapper[4810]: I0219 16:56:50.172963 4810 scope.go:117] "RemoveContainer" containerID="a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b"
Feb 19 16:56:50 crc kubenswrapper[4810]: E0219 16:56:50.173723 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
Feb 19 16:56:59 crc kubenswrapper[4810]: I0219 16:56:59.375938 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6d6bf9b6-qpvdt_d5debcf2-9629-4bb2-9133-f4b81748ff7d/prometheus-operator-admission-webhook/0.log"
Feb 19 16:56:59 crc kubenswrapper[4810]: I0219 16:56:59.449095 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-6jkkz_1656f52d-7771-4bbb-9642-b296d16b791e/prometheus-operator/0.log"
Feb 19 16:56:59 crc kubenswrapper[4810]: I0219 16:56:59.506421 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6d6bf9b6-vrfvr_408628c0-0b2c-48f9-b849-ee1b124499e1/prometheus-operator-admission-webhook/0.log"
Feb 19 16:56:59 crc kubenswrapper[4810]: I0219 16:56:59.624556 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-dk9c4_8bdf030e-92d8-45dc-ab6c-a7b241444677/operator/0.log"
Feb 19 16:56:59 crc kubenswrapper[4810]: I0219 16:56:59.657446 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-2fdxm_c5968625-c396-4ae0-9846-c2ceb6baf655/perses-operator/0.log"
Feb 19 16:57:03 crc kubenswrapper[4810]: I0219 16:57:03.440571 4810 scope.go:117] "RemoveContainer" containerID="a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b"
Feb 19 16:57:03 crc kubenswrapper[4810]: E0219 16:57:03.441127 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
Feb 19 16:57:17 crc kubenswrapper[4810]: I0219 16:57:17.439866 4810 scope.go:117] "RemoveContainer" containerID="a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b"
Feb 19 16:57:17 crc kubenswrapper[4810]: E0219 16:57:17.440639 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
Feb 19 16:57:32 crc kubenswrapper[4810]: I0219 16:57:32.440118 4810 scope.go:117] "RemoveContainer" containerID="a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b"
Feb 19 16:57:32 crc kubenswrapper[4810]: E0219 16:57:32.441219 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
Feb 19 16:57:45 crc kubenswrapper[4810]: I0219 16:57:45.441609 4810 scope.go:117] "RemoveContainer" containerID="a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b"
Feb 19 16:57:45 crc kubenswrapper[4810]: E0219 16:57:45.442708 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
Feb 19 16:57:59 crc kubenswrapper[4810]: I0219 16:57:59.439847 4810 scope.go:117] "RemoveContainer" containerID="a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b"
Feb 19 16:57:59 crc kubenswrapper[4810]: E0219 16:57:59.440734 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
Feb 19 16:58:13 crc kubenswrapper[4810]: I0219 16:58:13.440241 4810 scope.go:117] "RemoveContainer" containerID="a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b"
Feb 19 16:58:13 crc kubenswrapper[4810]: E0219 16:58:13.441553 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
Feb 19 16:58:26 crc kubenswrapper[4810]: I0219 16:58:26.439652 4810 scope.go:117] "RemoveContainer" containerID="a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b"
Feb 19 16:58:26 crc kubenswrapper[4810]: E0219 16:58:26.440952 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
Feb 19 16:58:38 crc kubenswrapper[4810]: I0219 16:58:38.440566 4810 scope.go:117] "RemoveContainer" containerID="a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b"
Feb 19 16:58:38 crc kubenswrapper[4810]: E0219 16:58:38.441402 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
Feb 19 16:58:45 crc kubenswrapper[4810]: I0219 16:58:45.645575 4810 scope.go:117] "RemoveContainer" containerID="1827e095c322c4555855b6ea50a05730da42c19ada929bb9656b95a872f9917c"
Feb 19 16:58:48 crc kubenswrapper[4810]: I0219 16:58:48.602075 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dbdz2"]
Feb 19 16:58:48 crc kubenswrapper[4810]: E0219 16:58:48.603013 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aae9737-d017-4a11-8323-cd0354ba09aa" containerName="container-00"
Feb 19 16:58:48 crc kubenswrapper[4810]: I0219 16:58:48.603034 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aae9737-d017-4a11-8323-cd0354ba09aa" containerName="container-00"
Feb 19 16:58:48 crc kubenswrapper[4810]: I0219 16:58:48.603284 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aae9737-d017-4a11-8323-cd0354ba09aa" containerName="container-00"
Feb 19 16:58:48 crc kubenswrapper[4810]: I0219 16:58:48.605271 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dbdz2"
Feb 19 16:58:48 crc kubenswrapper[4810]: I0219 16:58:48.638006 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dbdz2"]
Feb 19 16:58:48 crc kubenswrapper[4810]: I0219 16:58:48.785005 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4333000-6b13-4953-b6d8-94b72a034fa3-catalog-content\") pod \"certified-operators-dbdz2\" (UID: \"b4333000-6b13-4953-b6d8-94b72a034fa3\") " pod="openshift-marketplace/certified-operators-dbdz2"
Feb 19 16:58:48 crc kubenswrapper[4810]: I0219 16:58:48.785287 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4333000-6b13-4953-b6d8-94b72a034fa3-utilities\") pod \"certified-operators-dbdz2\" (UID: \"b4333000-6b13-4953-b6d8-94b72a034fa3\") " pod="openshift-marketplace/certified-operators-dbdz2"
Feb 19 16:58:48 crc kubenswrapper[4810]: I0219 16:58:48.785400 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jsfx\" (UniqueName: \"kubernetes.io/projected/b4333000-6b13-4953-b6d8-94b72a034fa3-kube-api-access-9jsfx\") pod \"certified-operators-dbdz2\" (UID: \"b4333000-6b13-4953-b6d8-94b72a034fa3\") " pod="openshift-marketplace/certified-operators-dbdz2"
Feb 19 16:58:48 crc kubenswrapper[4810]: I0219 16:58:48.887063 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jsfx\" (UniqueName: \"kubernetes.io/projected/b4333000-6b13-4953-b6d8-94b72a034fa3-kube-api-access-9jsfx\") pod \"certified-operators-dbdz2\" (UID: \"b4333000-6b13-4953-b6d8-94b72a034fa3\") " pod="openshift-marketplace/certified-operators-dbdz2"
Feb 19 16:58:48 crc kubenswrapper[4810]: I0219 16:58:48.887247 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4333000-6b13-4953-b6d8-94b72a034fa3-catalog-content\") pod \"certified-operators-dbdz2\" (UID: \"b4333000-6b13-4953-b6d8-94b72a034fa3\") " pod="openshift-marketplace/certified-operators-dbdz2"
Feb 19 16:58:48 crc kubenswrapper[4810]: I0219 16:58:48.887299 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4333000-6b13-4953-b6d8-94b72a034fa3-utilities\") pod \"certified-operators-dbdz2\" (UID: \"b4333000-6b13-4953-b6d8-94b72a034fa3\") " pod="openshift-marketplace/certified-operators-dbdz2"
Feb 19 16:58:48 crc kubenswrapper[4810]: I0219 16:58:48.888021 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4333000-6b13-4953-b6d8-94b72a034fa3-utilities\") pod \"certified-operators-dbdz2\" (UID: \"b4333000-6b13-4953-b6d8-94b72a034fa3\") " pod="openshift-marketplace/certified-operators-dbdz2"
Feb 19 16:58:48 crc kubenswrapper[4810]: I0219 16:58:48.888789 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4333000-6b13-4953-b6d8-94b72a034fa3-catalog-content\") pod \"certified-operators-dbdz2\" (UID: \"b4333000-6b13-4953-b6d8-94b72a034fa3\") " pod="openshift-marketplace/certified-operators-dbdz2"
Feb 19 16:58:48 crc kubenswrapper[4810]: I0219 16:58:48.925280 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jsfx\" (UniqueName: \"kubernetes.io/projected/b4333000-6b13-4953-b6d8-94b72a034fa3-kube-api-access-9jsfx\") pod \"certified-operators-dbdz2\" (UID: \"b4333000-6b13-4953-b6d8-94b72a034fa3\") " pod="openshift-marketplace/certified-operators-dbdz2"
Feb 19 16:58:48 crc kubenswrapper[4810]: I0219 16:58:48.947121 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dbdz2"
Feb 19 16:58:49 crc kubenswrapper[4810]: I0219 16:58:49.691081 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dbdz2"]
Feb 19 16:58:50 crc kubenswrapper[4810]: I0219 16:58:50.390376 4810 generic.go:334] "Generic (PLEG): container finished" podID="b4333000-6b13-4953-b6d8-94b72a034fa3" containerID="d15cf9afbb1bcf445a79cbff5a4a42a15c43d7239e3785870c3d7d15fdb487c7" exitCode=0
Feb 19 16:58:50 crc kubenswrapper[4810]: I0219 16:58:50.390948 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dbdz2" event={"ID":"b4333000-6b13-4953-b6d8-94b72a034fa3","Type":"ContainerDied","Data":"d15cf9afbb1bcf445a79cbff5a4a42a15c43d7239e3785870c3d7d15fdb487c7"}
Feb 19 16:58:50 crc kubenswrapper[4810]: I0219 16:58:50.390981 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dbdz2" event={"ID":"b4333000-6b13-4953-b6d8-94b72a034fa3","Type":"ContainerStarted","Data":"26f06d09242947bbcb3eda6d9dd4c0cb9850bbf8a6d4e737adaca0b70117f512"}
Feb 19 16:58:50 crc kubenswrapper[4810]: I0219 16:58:50.396238 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 19 16:58:51 crc kubenswrapper[4810]: I0219 16:58:51.448965 4810 scope.go:117] "RemoveContainer" containerID="a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b"
Feb 19 16:58:51 crc kubenswrapper[4810]: E0219 16:58:51.449244 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
Feb 19 16:58:55 crc kubenswrapper[4810]: I0219 16:58:55.450882 4810 generic.go:334] "Generic (PLEG): container finished" podID="b4333000-6b13-4953-b6d8-94b72a034fa3" containerID="e0323915d761d2f378bea3e13516dd792761c267e30344e8801a5e56c377084f" exitCode=0
Feb 19 16:58:55 crc kubenswrapper[4810]: I0219 16:58:55.479693 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dbdz2" event={"ID":"b4333000-6b13-4953-b6d8-94b72a034fa3","Type":"ContainerDied","Data":"e0323915d761d2f378bea3e13516dd792761c267e30344e8801a5e56c377084f"}
Feb 19 16:58:57 crc kubenswrapper[4810]: I0219 16:58:57.498756 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dbdz2" event={"ID":"b4333000-6b13-4953-b6d8-94b72a034fa3","Type":"ContainerStarted","Data":"43ae22d790c0f3771637fac0124f2bb32d5470c25cb7395e96bb10929efd5241"}
Feb 19 16:58:57 crc kubenswrapper[4810]: I0219 16:58:57.532428 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dbdz2" podStartSLOduration=3.950747642 podStartE2EDuration="9.532408155s" podCreationTimestamp="2026-02-19 16:58:48 +0000 UTC" firstStartedPulling="2026-02-19 16:58:50.395871408 +0000 UTC m=+6559.877901542" lastFinishedPulling="2026-02-19 16:58:55.977531941 +0000 UTC m=+6565.459562055" observedRunningTime="2026-02-19 16:58:57.524711636 +0000 UTC m=+6567.006741770" watchObservedRunningTime="2026-02-19 16:58:57.532408155 +0000 UTC m=+6567.014438299"
Feb 19 16:58:58 crc kubenswrapper[4810]: I0219 16:58:58.947627 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dbdz2"
Feb 19 16:58:58 crc kubenswrapper[4810]: I0219 16:58:58.947956 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dbdz2"
Feb 19 16:58:59 crc kubenswrapper[4810]: I0219 16:58:59.023239 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dbdz2"
Feb 19 16:59:05 crc kubenswrapper[4810]: I0219 16:59:05.602370 4810 generic.go:334] "Generic (PLEG): container finished" podID="0adbd447-568e-48c8-ab76-3d2f20e3f315" containerID="07153056d19231ca3d7ab4d1dd2cd03f17f8e8578ea20e8a31486c974d0f0b3d" exitCode=0
Feb 19 16:59:05 crc kubenswrapper[4810]: I0219 16:59:05.602406 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-944q7/must-gather-452lm" event={"ID":"0adbd447-568e-48c8-ab76-3d2f20e3f315","Type":"ContainerDied","Data":"07153056d19231ca3d7ab4d1dd2cd03f17f8e8578ea20e8a31486c974d0f0b3d"}
Feb 19 16:59:05 crc kubenswrapper[4810]: I0219 16:59:05.603953 4810 scope.go:117] "RemoveContainer" containerID="07153056d19231ca3d7ab4d1dd2cd03f17f8e8578ea20e8a31486c974d0f0b3d"
Feb 19 16:59:06 crc kubenswrapper[4810]: I0219 16:59:06.439646 4810 scope.go:117] "RemoveContainer" containerID="a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b"
Feb 19 16:59:06 crc kubenswrapper[4810]: E0219 16:59:06.440095 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
Feb 19 16:59:06 crc kubenswrapper[4810]: I0219 16:59:06.646851 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-944q7_must-gather-452lm_0adbd447-568e-48c8-ab76-3d2f20e3f315/gather/0.log"
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.015044 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dbdz2"
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.093446 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dbdz2"]
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.149232 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5n9gc"]
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.149474 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5n9gc" podUID="6dde1ea5-68be-4851-8816-3c7302dc2579" containerName="registry-server" containerID="cri-o://c46982b47a689551c84c8a94559da6c4aba508584e390cc906dbadf87e5b2444" gracePeriod=2
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.609911 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5n9gc"
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.651844 4810 generic.go:334] "Generic (PLEG): container finished" podID="6dde1ea5-68be-4851-8816-3c7302dc2579" containerID="c46982b47a689551c84c8a94559da6c4aba508584e390cc906dbadf87e5b2444" exitCode=0
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.653083 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5n9gc"
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.653857 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5n9gc" event={"ID":"6dde1ea5-68be-4851-8816-3c7302dc2579","Type":"ContainerDied","Data":"c46982b47a689551c84c8a94559da6c4aba508584e390cc906dbadf87e5b2444"}
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.653997 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5n9gc" event={"ID":"6dde1ea5-68be-4851-8816-3c7302dc2579","Type":"ContainerDied","Data":"280973c57a3f46eeb314d42cd9ad11d9e2b63939b685ed7e15c043b29db4262c"}
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.654104 4810 scope.go:117] "RemoveContainer" containerID="c46982b47a689551c84c8a94559da6c4aba508584e390cc906dbadf87e5b2444"
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.687099 4810 scope.go:117] "RemoveContainer" containerID="8046aa230ca2b4fe6d7939a4777943e6f17a788dfab6e0b652084234b84eaf5e"
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.757560 4810 scope.go:117] "RemoveContainer" containerID="c60a4f5da78e4d16d81eacb1c4cfea0a5df0810a9ff302334c88f13711140033"
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.790933 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dde1ea5-68be-4851-8816-3c7302dc2579-catalog-content\") pod \"6dde1ea5-68be-4851-8816-3c7302dc2579\" (UID: \"6dde1ea5-68be-4851-8816-3c7302dc2579\") "
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.791121 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dde1ea5-68be-4851-8816-3c7302dc2579-utilities\") pod \"6dde1ea5-68be-4851-8816-3c7302dc2579\" (UID: \"6dde1ea5-68be-4851-8816-3c7302dc2579\") "
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.791185 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j8ln\" (UniqueName: \"kubernetes.io/projected/6dde1ea5-68be-4851-8816-3c7302dc2579-kube-api-access-5j8ln\") pod \"6dde1ea5-68be-4851-8816-3c7302dc2579\" (UID: \"6dde1ea5-68be-4851-8816-3c7302dc2579\") "
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.800503 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dde1ea5-68be-4851-8816-3c7302dc2579-utilities" (OuterVolumeSpecName: "utilities") pod "6dde1ea5-68be-4851-8816-3c7302dc2579" (UID: "6dde1ea5-68be-4851-8816-3c7302dc2579"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.801606 4810 scope.go:117] "RemoveContainer" containerID="c46982b47a689551c84c8a94559da6c4aba508584e390cc906dbadf87e5b2444"
Feb 19 16:59:09 crc kubenswrapper[4810]: E0219 16:59:09.805560 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c46982b47a689551c84c8a94559da6c4aba508584e390cc906dbadf87e5b2444\": container with ID starting with c46982b47a689551c84c8a94559da6c4aba508584e390cc906dbadf87e5b2444 not found: ID does not exist" containerID="c46982b47a689551c84c8a94559da6c4aba508584e390cc906dbadf87e5b2444"
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.805603 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c46982b47a689551c84c8a94559da6c4aba508584e390cc906dbadf87e5b2444"} err="failed to get container status \"c46982b47a689551c84c8a94559da6c4aba508584e390cc906dbadf87e5b2444\": rpc error: code = NotFound desc = could not find container \"c46982b47a689551c84c8a94559da6c4aba508584e390cc906dbadf87e5b2444\": container with ID starting with c46982b47a689551c84c8a94559da6c4aba508584e390cc906dbadf87e5b2444 not found: ID does not exist"
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.805628 4810 scope.go:117] "RemoveContainer" containerID="8046aa230ca2b4fe6d7939a4777943e6f17a788dfab6e0b652084234b84eaf5e"
Feb 19 16:59:09 crc kubenswrapper[4810]: E0219 16:59:09.806814 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8046aa230ca2b4fe6d7939a4777943e6f17a788dfab6e0b652084234b84eaf5e\": container with ID starting with 8046aa230ca2b4fe6d7939a4777943e6f17a788dfab6e0b652084234b84eaf5e not found: ID does not exist" containerID="8046aa230ca2b4fe6d7939a4777943e6f17a788dfab6e0b652084234b84eaf5e"
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.806955 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8046aa230ca2b4fe6d7939a4777943e6f17a788dfab6e0b652084234b84eaf5e"} err="failed to get container status \"8046aa230ca2b4fe6d7939a4777943e6f17a788dfab6e0b652084234b84eaf5e\": rpc error: code = NotFound desc = could not find container \"8046aa230ca2b4fe6d7939a4777943e6f17a788dfab6e0b652084234b84eaf5e\": container with ID starting with 8046aa230ca2b4fe6d7939a4777943e6f17a788dfab6e0b652084234b84eaf5e not found: ID does not exist"
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.807076 4810 scope.go:117] "RemoveContainer" containerID="c60a4f5da78e4d16d81eacb1c4cfea0a5df0810a9ff302334c88f13711140033"
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.810571 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dde1ea5-68be-4851-8816-3c7302dc2579-kube-api-access-5j8ln" (OuterVolumeSpecName: "kube-api-access-5j8ln") pod "6dde1ea5-68be-4851-8816-3c7302dc2579" (UID: "6dde1ea5-68be-4851-8816-3c7302dc2579"). InnerVolumeSpecName "kube-api-access-5j8ln". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 16:59:09 crc kubenswrapper[4810]: E0219 16:59:09.811506 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c60a4f5da78e4d16d81eacb1c4cfea0a5df0810a9ff302334c88f13711140033\": container with ID starting with c60a4f5da78e4d16d81eacb1c4cfea0a5df0810a9ff302334c88f13711140033 not found: ID does not exist" containerID="c60a4f5da78e4d16d81eacb1c4cfea0a5df0810a9ff302334c88f13711140033"
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.811549 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c60a4f5da78e4d16d81eacb1c4cfea0a5df0810a9ff302334c88f13711140033"} err="failed to get container status \"c60a4f5da78e4d16d81eacb1c4cfea0a5df0810a9ff302334c88f13711140033\": rpc error: code = NotFound desc = could not find container \"c60a4f5da78e4d16d81eacb1c4cfea0a5df0810a9ff302334c88f13711140033\": container with ID starting with c60a4f5da78e4d16d81eacb1c4cfea0a5df0810a9ff302334c88f13711140033 not found: ID does not exist"
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.842441 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dde1ea5-68be-4851-8816-3c7302dc2579-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6dde1ea5-68be-4851-8816-3c7302dc2579" (UID: "6dde1ea5-68be-4851-8816-3c7302dc2579"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.894384 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dde1ea5-68be-4851-8816-3c7302dc2579-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.894414 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dde1ea5-68be-4851-8816-3c7302dc2579-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.894428 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j8ln\" (UniqueName: \"kubernetes.io/projected/6dde1ea5-68be-4851-8816-3c7302dc2579-kube-api-access-5j8ln\") on node \"crc\" DevicePath \"\""
Feb 19 16:59:09 crc kubenswrapper[4810]: I0219 16:59:09.997680 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5n9gc"]
Feb 19 16:59:10 crc kubenswrapper[4810]: I0219 16:59:10.007478 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5n9gc"]
Feb 19 16:59:11 crc kubenswrapper[4810]: I0219 16:59:11.451089 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dde1ea5-68be-4851-8816-3c7302dc2579" path="/var/lib/kubelet/pods/6dde1ea5-68be-4851-8816-3c7302dc2579/volumes"
Feb 19 16:59:17 crc kubenswrapper[4810]: I0219 16:59:17.440468 4810 scope.go:117] "RemoveContainer" containerID="a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b"
Feb 19 16:59:17 crc kubenswrapper[4810]: E0219 16:59:17.441963 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3"
Feb 19 16:59:20 crc kubenswrapper[4810]: I0219 16:59:20.511612 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-944q7/must-gather-452lm"]
Feb 19 16:59:20 crc kubenswrapper[4810]: I0219 16:59:20.511968 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-944q7/must-gather-452lm" podUID="0adbd447-568e-48c8-ab76-3d2f20e3f315" containerName="copy" containerID="cri-o://35e06e2cca1fb991fa940df76a9f88f0b3d758223a060db89405e9f9e28e0bdb" gracePeriod=2
Feb 19 16:59:20 crc kubenswrapper[4810]: I0219 16:59:20.521497 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-944q7/must-gather-452lm"]
Feb 19 16:59:20 crc kubenswrapper[4810]: I0219 16:59:20.992831 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-944q7_must-gather-452lm_0adbd447-568e-48c8-ab76-3d2f20e3f315/copy/0.log"
Feb 19 16:59:20 crc kubenswrapper[4810]: I0219 16:59:20.993542 4810 generic.go:334] "Generic (PLEG): container finished" podID="0adbd447-568e-48c8-ab76-3d2f20e3f315" containerID="35e06e2cca1fb991fa940df76a9f88f0b3d758223a060db89405e9f9e28e0bdb" exitCode=143
Feb 19 16:59:20 crc kubenswrapper[4810]: I0219 16:59:20.993602 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf36fcfbf554bf4a4c84825c8d6c3208469e142f948d23106d4b61d616d08799"
Feb 19 16:59:21 crc kubenswrapper[4810]: I0219 16:59:21.046312 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-944q7_must-gather-452lm_0adbd447-568e-48c8-ab76-3d2f20e3f315/copy/0.log"
Feb 19 16:59:21 crc kubenswrapper[4810]: I0219 16:59:21.046930 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-944q7/must-gather-452lm"
Feb 19 16:59:21 crc kubenswrapper[4810]: I0219 16:59:21.162678 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0adbd447-568e-48c8-ab76-3d2f20e3f315-must-gather-output\") pod \"0adbd447-568e-48c8-ab76-3d2f20e3f315\" (UID: \"0adbd447-568e-48c8-ab76-3d2f20e3f315\") "
Feb 19 16:59:21 crc kubenswrapper[4810]: I0219 16:59:21.162753 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6rmt\" (UniqueName: \"kubernetes.io/projected/0adbd447-568e-48c8-ab76-3d2f20e3f315-kube-api-access-f6rmt\") pod \"0adbd447-568e-48c8-ab76-3d2f20e3f315\" (UID: \"0adbd447-568e-48c8-ab76-3d2f20e3f315\") "
Feb 19 16:59:21 crc kubenswrapper[4810]: I0219 16:59:21.170498 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0adbd447-568e-48c8-ab76-3d2f20e3f315-kube-api-access-f6rmt" (OuterVolumeSpecName: "kube-api-access-f6rmt") pod "0adbd447-568e-48c8-ab76-3d2f20e3f315" (UID: "0adbd447-568e-48c8-ab76-3d2f20e3f315"). InnerVolumeSpecName "kube-api-access-f6rmt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 16:59:21 crc kubenswrapper[4810]: I0219 16:59:21.265953 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6rmt\" (UniqueName: \"kubernetes.io/projected/0adbd447-568e-48c8-ab76-3d2f20e3f315-kube-api-access-f6rmt\") on node \"crc\" DevicePath \"\""
Feb 19 16:59:21 crc kubenswrapper[4810]: I0219 16:59:21.370409 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0adbd447-568e-48c8-ab76-3d2f20e3f315-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0adbd447-568e-48c8-ab76-3d2f20e3f315" (UID: "0adbd447-568e-48c8-ab76-3d2f20e3f315"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 16:59:21 crc kubenswrapper[4810]: I0219 16:59:21.452978 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0adbd447-568e-48c8-ab76-3d2f20e3f315" path="/var/lib/kubelet/pods/0adbd447-568e-48c8-ab76-3d2f20e3f315/volumes"
Feb 19 16:59:21 crc kubenswrapper[4810]: I0219 16:59:21.469882 4810 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0adbd447-568e-48c8-ab76-3d2f20e3f315-must-gather-output\") on node \"crc\" DevicePath \"\""
Feb 19 16:59:22 crc kubenswrapper[4810]: I0219 16:59:22.002879 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-944q7/must-gather-452lm"
Feb 19 16:59:28 crc kubenswrapper[4810]: I0219 16:59:28.841207 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l88mg"]
Feb 19 16:59:28 crc kubenswrapper[4810]: E0219 16:59:28.842052 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dde1ea5-68be-4851-8816-3c7302dc2579" containerName="registry-server"
Feb 19 16:59:28 crc kubenswrapper[4810]: I0219 16:59:28.842069 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dde1ea5-68be-4851-8816-3c7302dc2579" containerName="registry-server"
Feb 19 16:59:28 crc kubenswrapper[4810]: E0219 16:59:28.842088 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dde1ea5-68be-4851-8816-3c7302dc2579" containerName="extract-content"
Feb 19 16:59:28 crc kubenswrapper[4810]: I0219 16:59:28.842096 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dde1ea5-68be-4851-8816-3c7302dc2579" containerName="extract-content"
Feb 19 16:59:28 crc kubenswrapper[4810]: E0219 16:59:28.842122 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dde1ea5-68be-4851-8816-3c7302dc2579" containerName="extract-utilities"
Feb 19 16:59:28 crc kubenswrapper[4810]: I0219 16:59:28.842132 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dde1ea5-68be-4851-8816-3c7302dc2579" containerName="extract-utilities"
Feb 19 16:59:28 crc kubenswrapper[4810]: E0219 16:59:28.842155 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0adbd447-568e-48c8-ab76-3d2f20e3f315" containerName="copy"
Feb 19 16:59:28 crc kubenswrapper[4810]: I0219 16:59:28.842163 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="0adbd447-568e-48c8-ab76-3d2f20e3f315" containerName="copy"
Feb 19 16:59:28 crc kubenswrapper[4810]: E0219 16:59:28.842181 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0adbd447-568e-48c8-ab76-3d2f20e3f315" containerName="gather"
Feb 19 16:59:28 crc kubenswrapper[4810]: I0219 16:59:28.842188 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="0adbd447-568e-48c8-ab76-3d2f20e3f315" containerName="gather"
Feb 19 16:59:28 crc kubenswrapper[4810]: I0219 16:59:28.843713 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dde1ea5-68be-4851-8816-3c7302dc2579" containerName="registry-server"
Feb 19 16:59:28 crc kubenswrapper[4810]: I0219 16:59:28.843746 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="0adbd447-568e-48c8-ab76-3d2f20e3f315" containerName="copy"
Feb 19 16:59:28 crc kubenswrapper[4810]: I0219 16:59:28.843794 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="0adbd447-568e-48c8-ab76-3d2f20e3f315" containerName="gather"
Feb 19 16:59:28 crc kubenswrapper[4810]: I0219 16:59:28.846543 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l88mg"
Feb 19 16:59:28 crc kubenswrapper[4810]: I0219 16:59:28.878173 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l88mg"]
Feb 19 16:59:28 crc kubenswrapper[4810]: I0219 16:59:28.936569 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb1500ae-62df-4106-92a9-292e4a530f9b-utilities\") pod \"redhat-marketplace-l88mg\" (UID: \"fb1500ae-62df-4106-92a9-292e4a530f9b\") " pod="openshift-marketplace/redhat-marketplace-l88mg"
Feb 19 16:59:28 crc kubenswrapper[4810]: I0219 16:59:28.936663 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb1500ae-62df-4106-92a9-292e4a530f9b-catalog-content\") pod \"redhat-marketplace-l88mg\" (UID: \"fb1500ae-62df-4106-92a9-292e4a530f9b\") " pod="openshift-marketplace/redhat-marketplace-l88mg"
Feb 19 16:59:28 crc kubenswrapper[4810]: I0219 16:59:28.936803 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4449\" (UniqueName: \"kubernetes.io/projected/fb1500ae-62df-4106-92a9-292e4a530f9b-kube-api-access-d4449\") pod \"redhat-marketplace-l88mg\" (UID: \"fb1500ae-62df-4106-92a9-292e4a530f9b\") " pod="openshift-marketplace/redhat-marketplace-l88mg"
Feb 19 16:59:29 crc kubenswrapper[4810]: I0219 16:59:29.040181 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb1500ae-62df-4106-92a9-292e4a530f9b-utilities\") pod \"redhat-marketplace-l88mg\" (UID: \"fb1500ae-62df-4106-92a9-292e4a530f9b\") " pod="openshift-marketplace/redhat-marketplace-l88mg"
Feb 19 16:59:29 crc kubenswrapper[4810]: I0219 16:59:29.040388 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb1500ae-62df-4106-92a9-292e4a530f9b-catalog-content\") pod \"redhat-marketplace-l88mg\" (UID: \"fb1500ae-62df-4106-92a9-292e4a530f9b\") " pod="openshift-marketplace/redhat-marketplace-l88mg"
Feb 19 16:59:29 crc kubenswrapper[4810]: I0219 16:59:29.040510 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4449\" (UniqueName: \"kubernetes.io/projected/fb1500ae-62df-4106-92a9-292e4a530f9b-kube-api-access-d4449\") pod \"redhat-marketplace-l88mg\" (UID: \"fb1500ae-62df-4106-92a9-292e4a530f9b\") " pod="openshift-marketplace/redhat-marketplace-l88mg"
Feb 19 16:59:29 crc kubenswrapper[4810]: I0219 16:59:29.041038 4810
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb1500ae-62df-4106-92a9-292e4a530f9b-catalog-content\") pod \"redhat-marketplace-l88mg\" (UID: \"fb1500ae-62df-4106-92a9-292e4a530f9b\") " pod="openshift-marketplace/redhat-marketplace-l88mg" Feb 19 16:59:29 crc kubenswrapper[4810]: I0219 16:59:29.041077 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb1500ae-62df-4106-92a9-292e4a530f9b-utilities\") pod \"redhat-marketplace-l88mg\" (UID: \"fb1500ae-62df-4106-92a9-292e4a530f9b\") " pod="openshift-marketplace/redhat-marketplace-l88mg" Feb 19 16:59:29 crc kubenswrapper[4810]: I0219 16:59:29.050911 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5mt74"] Feb 19 16:59:29 crc kubenswrapper[4810]: I0219 16:59:29.074110 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5mt74" Feb 19 16:59:29 crc kubenswrapper[4810]: I0219 16:59:29.119254 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4449\" (UniqueName: \"kubernetes.io/projected/fb1500ae-62df-4106-92a9-292e4a530f9b-kube-api-access-d4449\") pod \"redhat-marketplace-l88mg\" (UID: \"fb1500ae-62df-4106-92a9-292e4a530f9b\") " pod="openshift-marketplace/redhat-marketplace-l88mg" Feb 19 16:59:29 crc kubenswrapper[4810]: I0219 16:59:29.177687 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5mt74"] Feb 19 16:59:29 crc kubenswrapper[4810]: I0219 16:59:29.184848 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l88mg" Feb 19 16:59:29 crc kubenswrapper[4810]: I0219 16:59:29.250571 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/808052e2-d521-47dd-afa0-09caec857462-catalog-content\") pod \"community-operators-5mt74\" (UID: \"808052e2-d521-47dd-afa0-09caec857462\") " pod="openshift-marketplace/community-operators-5mt74" Feb 19 16:59:29 crc kubenswrapper[4810]: I0219 16:59:29.250671 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gqvs\" (UniqueName: \"kubernetes.io/projected/808052e2-d521-47dd-afa0-09caec857462-kube-api-access-5gqvs\") pod \"community-operators-5mt74\" (UID: \"808052e2-d521-47dd-afa0-09caec857462\") " pod="openshift-marketplace/community-operators-5mt74" Feb 19 16:59:29 crc kubenswrapper[4810]: I0219 16:59:29.250793 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/808052e2-d521-47dd-afa0-09caec857462-utilities\") pod \"community-operators-5mt74\" (UID: \"808052e2-d521-47dd-afa0-09caec857462\") " pod="openshift-marketplace/community-operators-5mt74" Feb 19 16:59:29 crc kubenswrapper[4810]: I0219 16:59:29.352447 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gqvs\" (UniqueName: \"kubernetes.io/projected/808052e2-d521-47dd-afa0-09caec857462-kube-api-access-5gqvs\") pod \"community-operators-5mt74\" (UID: \"808052e2-d521-47dd-afa0-09caec857462\") " pod="openshift-marketplace/community-operators-5mt74" Feb 19 16:59:29 crc kubenswrapper[4810]: I0219 16:59:29.352847 4810 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/808052e2-d521-47dd-afa0-09caec857462-utilities\") pod \"community-operators-5mt74\" (UID: \"808052e2-d521-47dd-afa0-09caec857462\") " pod="openshift-marketplace/community-operators-5mt74" Feb 19 16:59:29 crc kubenswrapper[4810]: I0219 16:59:29.352902 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/808052e2-d521-47dd-afa0-09caec857462-catalog-content\") pod \"community-operators-5mt74\" (UID: \"808052e2-d521-47dd-afa0-09caec857462\") " pod="openshift-marketplace/community-operators-5mt74" Feb 19 16:59:29 crc kubenswrapper[4810]: I0219 16:59:29.353540 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/808052e2-d521-47dd-afa0-09caec857462-catalog-content\") pod \"community-operators-5mt74\" (UID: \"808052e2-d521-47dd-afa0-09caec857462\") " pod="openshift-marketplace/community-operators-5mt74" Feb 19 16:59:29 crc kubenswrapper[4810]: I0219 16:59:29.353742 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/808052e2-d521-47dd-afa0-09caec857462-utilities\") pod \"community-operators-5mt74\" (UID: \"808052e2-d521-47dd-afa0-09caec857462\") " pod="openshift-marketplace/community-operators-5mt74" Feb 19 16:59:29 crc kubenswrapper[4810]: I0219 16:59:29.375079 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gqvs\" (UniqueName: \"kubernetes.io/projected/808052e2-d521-47dd-afa0-09caec857462-kube-api-access-5gqvs\") pod \"community-operators-5mt74\" (UID: \"808052e2-d521-47dd-afa0-09caec857462\") " pod="openshift-marketplace/community-operators-5mt74" Feb 19 16:59:29 crc kubenswrapper[4810]: I0219 16:59:29.441554 4810 scope.go:117] "RemoveContainer" containerID="a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b" Feb 19 16:59:29 crc kubenswrapper[4810]: E0219 16:59:29.441923 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:59:29 crc kubenswrapper[4810]: I0219 16:59:29.497795 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5mt74" Feb 19 16:59:29 crc kubenswrapper[4810]: I0219 16:59:29.746407 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l88mg"] Feb 19 16:59:29 crc kubenswrapper[4810]: W0219 16:59:29.773472 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb1500ae_62df_4106_92a9_292e4a530f9b.slice/crio-102f4f274c72147c1a6a8c6c1f5977a776f1840f276ae532338949bb1735f657 WatchSource:0}: Error finding container 102f4f274c72147c1a6a8c6c1f5977a776f1840f276ae532338949bb1735f657: Status 404 returned error can't find the container with id 102f4f274c72147c1a6a8c6c1f5977a776f1840f276ae532338949bb1735f657 Feb 19 16:59:30 crc kubenswrapper[4810]: I0219 16:59:30.030449 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5mt74"] Feb 19 16:59:30 crc kubenswrapper[4810]: I0219 16:59:30.160484 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5mt74" event={"ID":"808052e2-d521-47dd-afa0-09caec857462","Type":"ContainerStarted","Data":"6e87026d9bcbe6b26d8857e5d141664c1b740983c8945dad943f6e6a6bd89c0e"} Feb 19 16:59:30 crc kubenswrapper[4810]: I0219 16:59:30.163039 4810 generic.go:334] "Generic (PLEG): container finished" podID="fb1500ae-62df-4106-92a9-292e4a530f9b" containerID="184b8410364efc8dd2bbe2c66a5da731716998db41b931676b3f37f11a279a52" exitCode=0 Feb 19 16:59:30 crc kubenswrapper[4810]: I0219 16:59:30.163072 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l88mg" event={"ID":"fb1500ae-62df-4106-92a9-292e4a530f9b","Type":"ContainerDied","Data":"184b8410364efc8dd2bbe2c66a5da731716998db41b931676b3f37f11a279a52"} Feb 19 16:59:30 crc kubenswrapper[4810]: I0219 16:59:30.163091 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l88mg" event={"ID":"fb1500ae-62df-4106-92a9-292e4a530f9b","Type":"ContainerStarted","Data":"102f4f274c72147c1a6a8c6c1f5977a776f1840f276ae532338949bb1735f657"} Feb 19 16:59:31 crc kubenswrapper[4810]: I0219 16:59:31.188304 4810 generic.go:334] "Generic (PLEG): container finished" podID="808052e2-d521-47dd-afa0-09caec857462" containerID="5cfcb4b39748777ff912a8c1578b8da3c2c348b0f537f5a9d085ed4073373a1c" exitCode=0 Feb 19 16:59:31 crc kubenswrapper[4810]: I0219 16:59:31.188469 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5mt74" event={"ID":"808052e2-d521-47dd-afa0-09caec857462","Type":"ContainerDied","Data":"5cfcb4b39748777ff912a8c1578b8da3c2c348b0f537f5a9d085ed4073373a1c"} Feb 19 16:59:32 crc kubenswrapper[4810]: I0219 16:59:32.207779 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l88mg" event={"ID":"fb1500ae-62df-4106-92a9-292e4a530f9b","Type":"ContainerStarted","Data":"087c65650316a2f00c7c8a503624ebcae641fced016361f533643bbd4682f836"} Feb 19 16:59:33 crc kubenswrapper[4810]: I0219 16:59:33.222837 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5mt74" event={"ID":"808052e2-d521-47dd-afa0-09caec857462","Type":"ContainerStarted","Data":"00b0bfe8666df0ae7322bfa1858d66088b60c8375202d05eb06bee981a6fa745"}
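
The "SyncLoop (PLEG)" lines above are the pod lifecycle event generator surfacing runtime state changes into the sync loop as ContainerStarted/ContainerDied events keyed by pod UID and container ID. A toy relist-style diff showing the idea, not the kubelet's actual implementation:

package main

import "fmt"

// PodLifecycleEvent mirrors the shape visible in the entries above:
// a pod UID, an event type, and a container ID as data.
type PodLifecycleEvent struct {
	PodID string
	Type  string // "ContainerStarted" or "ContainerDied"
	Data  string // container ID
}

// relist diffs two snapshots of containerID -> running and emits events,
// the same idea behind the generic.go "container finished" lines.
func relist(podID string, prev, cur map[string]bool) []PodLifecycleEvent {
	var events []PodLifecycleEvent
	for id, running := range cur {
		if running && !prev[id] {
			events = append(events, PodLifecycleEvent{podID, "ContainerStarted", id})
		}
		if !running && prev[id] {
			events = append(events, PodLifecycleEvent{podID, "ContainerDied", id})
		}
	}
	return events
}

func main() {
	prev := map[string]bool{"5cfcb4b3": true}
	cur := map[string]bool{"5cfcb4b3": false, "00b0bfe8": true}
	for _, e := range relist("808052e2-d521-47dd-afa0-09caec857462", prev, cur) {
		fmt.Printf("%s %s %s\n", e.PodID, e.Type, e.Data)
	}
}

Feb 19 16:59:34 crc kubenswrapper[4810]: I0219 16:59:34.233156 4810 generic.go:334] "Generic (PLEG): container finished"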
podID="fb1500ae-62df-4106-92a9-292e4a530f9b" containerID="087c65650316a2f00c7c8a503624ebcae641fced016361f533643bbd4682f836" exitCode=0 Feb 19 16:59:34 crc kubenswrapper[4810]: I0219 16:59:34.233222 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l88mg" event={"ID":"fb1500ae-62df-4106-92a9-292e4a530f9b","Type":"ContainerDied","Data":"087c65650316a2f00c7c8a503624ebcae641fced016361f533643bbd4682f836"} Feb 19 16:59:35 crc kubenswrapper[4810]: I0219 16:59:35.246929 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l88mg" event={"ID":"fb1500ae-62df-4106-92a9-292e4a530f9b","Type":"ContainerStarted","Data":"3e1cb4584544667e6e0ebcdf16d948da751d43efcbc033f5ccb4b9f4f67cb6a2"} Feb 19 16:59:35 crc kubenswrapper[4810]: I0219 16:59:35.276571 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l88mg" podStartSLOduration=2.6360307499999998 podStartE2EDuration="7.276553779s" podCreationTimestamp="2026-02-19 16:59:28 +0000 UTC" firstStartedPulling="2026-02-19 16:59:30.165141546 +0000 UTC m=+6599.647171670" lastFinishedPulling="2026-02-19 16:59:34.805664545 +0000 UTC m=+6604.287694699" observedRunningTime="2026-02-19 16:59:35.270171962 +0000 UTC m=+6604.752202106" watchObservedRunningTime="2026-02-19 16:59:35.276553779 +0000 UTC m=+6604.758583903" Feb 19 16:59:36 crc kubenswrapper[4810]: I0219 16:59:36.258990 4810 generic.go:334] "Generic (PLEG): container finished" podID="808052e2-d521-47dd-afa0-09caec857462" containerID="00b0bfe8666df0ae7322bfa1858d66088b60c8375202d05eb06bee981a6fa745" exitCode=0 Feb 19 16:59:36 crc kubenswrapper[4810]: I0219 16:59:36.259066 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5mt74" event={"ID":"808052e2-d521-47dd-afa0-09caec857462","Type":"ContainerDied","Data":"00b0bfe8666df0ae7322bfa1858d66088b60c8375202d05eb06bee981a6fa745"} Feb 19 16:59:38 crc kubenswrapper[4810]: I0219 16:59:38.297546 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5mt74" event={"ID":"808052e2-d521-47dd-afa0-09caec857462","Type":"ContainerStarted","Data":"af9a979525ef507298249d3ca6e031c44578c0f6537e1c2586a5d6e69cd1a9cd"} Feb 19 16:59:38 crc kubenswrapper[4810]: I0219 16:59:38.336960 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5mt74" podStartSLOduration=3.550946484 podStartE2EDuration="9.336930263s" podCreationTimestamp="2026-02-19 16:59:29 +0000 UTC" firstStartedPulling="2026-02-19 16:59:31.190590875 +0000 UTC m=+6600.672620999" lastFinishedPulling="2026-02-19 16:59:36.976574614 +0000 UTC m=+6606.458604778" observedRunningTime="2026-02-19 16:59:38.320647172 +0000 UTC m=+6607.802677316" watchObservedRunningTime="2026-02-19 16:59:38.336930263 +0000 UTC m=+6607.818960397" Feb 19 16:59:39 crc kubenswrapper[4810]: I0219 16:59:39.185608 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l88mg" Feb 19 16:59:39 crc kubenswrapper[4810]: I0219 16:59:39.187292 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l88mg" Feb 19 16:59:39 crc kubenswrapper[4810]: I0219 16:59:39.255772 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l88mg" Feb 19 16:59:39 crc 
kubenswrapper[4810]: I0219 16:59:39.499537 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5mt74" Feb 19 16:59:39 crc kubenswrapper[4810]: I0219 16:59:39.499588 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5mt74" Feb 19 16:59:40 crc kubenswrapper[4810]: I0219 16:59:40.549167 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-5mt74" podUID="808052e2-d521-47dd-afa0-09caec857462" containerName="registry-server" probeResult="failure" output=< Feb 19 16:59:40 crc kubenswrapper[4810]: timeout: failed to connect service ":50051" within 1s Feb 19 16:59:40 crc kubenswrapper[4810]: > Feb 19 16:59:44 crc kubenswrapper[4810]: I0219 16:59:44.440415 4810 scope.go:117] "RemoveContainer" containerID="a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b" Feb 19 16:59:44 crc kubenswrapper[4810]: E0219 16:59:44.441272 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 16:59:45 crc kubenswrapper[4810]: I0219 16:59:45.755299 4810 scope.go:117] "RemoveContainer" containerID="35e06e2cca1fb991fa940df76a9f88f0b3d758223a060db89405e9f9e28e0bdb" Feb 19 16:59:45 crc kubenswrapper[4810]: I0219 16:59:45.787518 4810 scope.go:117] "RemoveContainer" containerID="07153056d19231ca3d7ab4d1dd2cd03f17f8e8578ea20e8a31486c974d0f0b3d" Feb 19 16:59:46 crc kubenswrapper[4810]: I0219 16:59:46.177583 4810 scope.go:117] "RemoveContainer" containerID="c35e5c9c4b5ddac624677f837531b6c053c31b4d14fb41f66bcb860eccf31d9d" Feb 19 16:59:49 crc kubenswrapper[4810]: I0219 16:59:49.257586 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l88mg" Feb 19 16:59:49 crc kubenswrapper[4810]: I0219 16:59:49.327102 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l88mg"] Feb 19 16:59:49 crc kubenswrapper[4810]: I0219 16:59:49.433189 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l88mg" podUID="fb1500ae-62df-4106-92a9-292e4a530f9b" containerName="registry-server" containerID="cri-o://3e1cb4584544667e6e0ebcdf16d948da751d43efcbc033f5ccb4b9f4f67cb6a2" gracePeriod=2 Feb 19 16:59:49 crc kubenswrapper[4810]: I0219 16:59:49.555126 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5mt74" Feb 19 16:59:49 crc kubenswrapper[4810]: I0219 16:59:49.623386 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5mt74"
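
The startup-probe failure above (timeout: failed to connect service ":50051" within 1s) is a gRPC-style check against the registry-server port that must succeed inside a 1s budget. A minimal client doing the equivalent check, assuming the server exposes the standard gRPC health service and a recent grpc-go (grpc.NewClient needs v1.63+):

package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	healthpb "google.golang.org/grpc/health/grpc_health_v1"
)

// probe dials addr and asks the standard gRPC health service whether it is
// SERVING, failing if the whole exchange exceeds timeout -- the same 1s
// budget the failed startup probe above ran out of.
func probe(addr string, timeout time.Duration) error {
	ctx, cancel := context.WithTimeout(context.Background(), timeout)
	defer cancel()
	conn, err := grpc.NewClient(addr, grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		return err
	}
	defer conn.Close()
	resp, err := healthpb.NewHealthClient(conn).Check(ctx, &healthpb.HealthCheckRequest{})
	if err != nil {
		return err // includes context deadline exceeded, i.e. the 1s timeout
	}
	if resp.Status != healthpb.HealthCheckResponse_SERVING {
		return fmt.Errorf("service not serving: %s", resp.Status)
	}
	return nil
}

func main() {
	fmt.Println(probe("localhost:50051", time.Second))
}

A catalog pod right after start usually fails this once or twice while it loads its index, which is why the log shows "unhealthy" followed by "started" a few seconds later.

Feb 19 16:59:49 crc kubenswrapper[4810]: I0219 16:59:49.954942 4810 util.go:48] "No ready sandbox for pod can be found.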
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l88mg" Feb 19 16:59:50 crc kubenswrapper[4810]: I0219 16:59:50.052422 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4449\" (UniqueName: \"kubernetes.io/projected/fb1500ae-62df-4106-92a9-292e4a530f9b-kube-api-access-d4449\") pod \"fb1500ae-62df-4106-92a9-292e4a530f9b\" (UID: \"fb1500ae-62df-4106-92a9-292e4a530f9b\") " Feb 19 16:59:50 crc kubenswrapper[4810]: I0219 16:59:50.052682 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb1500ae-62df-4106-92a9-292e4a530f9b-utilities\") pod \"fb1500ae-62df-4106-92a9-292e4a530f9b\" (UID: \"fb1500ae-62df-4106-92a9-292e4a530f9b\") " Feb 19 16:59:50 crc kubenswrapper[4810]: I0219 16:59:50.052965 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb1500ae-62df-4106-92a9-292e4a530f9b-catalog-content\") pod \"fb1500ae-62df-4106-92a9-292e4a530f9b\" (UID: \"fb1500ae-62df-4106-92a9-292e4a530f9b\") " Feb 19 16:59:50 crc kubenswrapper[4810]: I0219 16:59:50.057173 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb1500ae-62df-4106-92a9-292e4a530f9b-utilities" (OuterVolumeSpecName: "utilities") pod "fb1500ae-62df-4106-92a9-292e4a530f9b" (UID: "fb1500ae-62df-4106-92a9-292e4a530f9b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:59:50 crc kubenswrapper[4810]: I0219 16:59:50.063868 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb1500ae-62df-4106-92a9-292e4a530f9b-kube-api-access-d4449" (OuterVolumeSpecName: "kube-api-access-d4449") pod "fb1500ae-62df-4106-92a9-292e4a530f9b" (UID: "fb1500ae-62df-4106-92a9-292e4a530f9b"). InnerVolumeSpecName "kube-api-access-d4449". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:59:50 crc kubenswrapper[4810]: I0219 16:59:50.078277 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb1500ae-62df-4106-92a9-292e4a530f9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb1500ae-62df-4106-92a9-292e4a530f9b" (UID: "fb1500ae-62df-4106-92a9-292e4a530f9b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:59:50 crc kubenswrapper[4810]: I0219 16:59:50.155737 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4449\" (UniqueName: \"kubernetes.io/projected/fb1500ae-62df-4106-92a9-292e4a530f9b-kube-api-access-d4449\") on node \"crc\" DevicePath \"\"" Feb 19 16:59:50 crc kubenswrapper[4810]: I0219 16:59:50.155985 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb1500ae-62df-4106-92a9-292e4a530f9b-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 16:59:50 crc kubenswrapper[4810]: I0219 16:59:50.156050 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb1500ae-62df-4106-92a9-292e4a530f9b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 16:59:50 crc kubenswrapper[4810]: I0219 16:59:50.452928 4810 generic.go:334] "Generic (PLEG): container finished" podID="fb1500ae-62df-4106-92a9-292e4a530f9b" containerID="3e1cb4584544667e6e0ebcdf16d948da751d43efcbc033f5ccb4b9f4f67cb6a2" exitCode=0 Feb 19 16:59:50 crc kubenswrapper[4810]: I0219 16:59:50.454557 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l88mg" event={"ID":"fb1500ae-62df-4106-92a9-292e4a530f9b","Type":"ContainerDied","Data":"3e1cb4584544667e6e0ebcdf16d948da751d43efcbc033f5ccb4b9f4f67cb6a2"} Feb 19 16:59:50 crc kubenswrapper[4810]: I0219 16:59:50.454618 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l88mg" Feb 19 16:59:50 crc kubenswrapper[4810]: I0219 16:59:50.454667 4810 scope.go:117] "RemoveContainer" containerID="3e1cb4584544667e6e0ebcdf16d948da751d43efcbc033f5ccb4b9f4f67cb6a2" Feb 19 16:59:50 crc kubenswrapper[4810]: I0219 16:59:50.454648 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l88mg" event={"ID":"fb1500ae-62df-4106-92a9-292e4a530f9b","Type":"ContainerDied","Data":"102f4f274c72147c1a6a8c6c1f5977a776f1840f276ae532338949bb1735f657"} Feb 19 16:59:50 crc kubenswrapper[4810]: I0219 16:59:50.500199 4810 scope.go:117] "RemoveContainer" containerID="087c65650316a2f00c7c8a503624ebcae641fced016361f533643bbd4682f836" Feb 19 16:59:50 crc kubenswrapper[4810]: I0219 16:59:50.511814 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l88mg"] Feb 19 16:59:50 crc kubenswrapper[4810]: I0219 16:59:50.530858 4810 scope.go:117] "RemoveContainer" containerID="184b8410364efc8dd2bbe2c66a5da731716998db41b931676b3f37f11a279a52" Feb 19 16:59:50 crc kubenswrapper[4810]: I0219 16:59:50.534687 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l88mg"] Feb 19 16:59:50 crc kubenswrapper[4810]: I0219 16:59:50.578209 4810 scope.go:117] "RemoveContainer" containerID="3e1cb4584544667e6e0ebcdf16d948da751d43efcbc033f5ccb4b9f4f67cb6a2" Feb 19 16:59:50 crc kubenswrapper[4810]: E0219 16:59:50.579068 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e1cb4584544667e6e0ebcdf16d948da751d43efcbc033f5ccb4b9f4f67cb6a2\": container with ID starting with 3e1cb4584544667e6e0ebcdf16d948da751d43efcbc033f5ccb4b9f4f67cb6a2 not found: ID does not exist" containerID="3e1cb4584544667e6e0ebcdf16d948da751d43efcbc033f5ccb4b9f4f67cb6a2" Feb 19 16:59:50 crc kubenswrapper[4810]: I0219 16:59:50.579107 4810 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e1cb4584544667e6e0ebcdf16d948da751d43efcbc033f5ccb4b9f4f67cb6a2"} err="failed to get container status \"3e1cb4584544667e6e0ebcdf16d948da751d43efcbc033f5ccb4b9f4f67cb6a2\": rpc error: code = NotFound desc = could not find container \"3e1cb4584544667e6e0ebcdf16d948da751d43efcbc033f5ccb4b9f4f67cb6a2\": container with ID starting with 3e1cb4584544667e6e0ebcdf16d948da751d43efcbc033f5ccb4b9f4f67cb6a2 not found: ID does not exist" Feb 19 16:59:50 crc kubenswrapper[4810]: I0219 16:59:50.579138 4810 scope.go:117] "RemoveContainer" containerID="087c65650316a2f00c7c8a503624ebcae641fced016361f533643bbd4682f836" Feb 19 16:59:50 crc kubenswrapper[4810]: E0219 16:59:50.580078 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"087c65650316a2f00c7c8a503624ebcae641fced016361f533643bbd4682f836\": container with ID starting with 087c65650316a2f00c7c8a503624ebcae641fced016361f533643bbd4682f836 not found: ID does not exist" containerID="087c65650316a2f00c7c8a503624ebcae641fced016361f533643bbd4682f836" Feb 19 16:59:50 crc kubenswrapper[4810]: I0219 16:59:50.580113 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"087c65650316a2f00c7c8a503624ebcae641fced016361f533643bbd4682f836"} err="failed to get container status \"087c65650316a2f00c7c8a503624ebcae641fced016361f533643bbd4682f836\": rpc error: code = NotFound desc = could not find container \"087c65650316a2f00c7c8a503624ebcae641fced016361f533643bbd4682f836\": container with ID starting with 087c65650316a2f00c7c8a503624ebcae641fced016361f533643bbd4682f836 not found: ID does not exist" Feb 19 16:59:50 crc kubenswrapper[4810]: I0219 16:59:50.580312 4810 scope.go:117] "RemoveContainer" containerID="184b8410364efc8dd2bbe2c66a5da731716998db41b931676b3f37f11a279a52" Feb 19 16:59:50 crc kubenswrapper[4810]: E0219 16:59:50.581236 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"184b8410364efc8dd2bbe2c66a5da731716998db41b931676b3f37f11a279a52\": container with ID starting with 184b8410364efc8dd2bbe2c66a5da731716998db41b931676b3f37f11a279a52 not found: ID does not exist" containerID="184b8410364efc8dd2bbe2c66a5da731716998db41b931676b3f37f11a279a52" Feb 19 16:59:50 crc kubenswrapper[4810]: I0219 16:59:50.581296 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"184b8410364efc8dd2bbe2c66a5da731716998db41b931676b3f37f11a279a52"} err="failed to get container status \"184b8410364efc8dd2bbe2c66a5da731716998db41b931676b3f37f11a279a52\": rpc error: code = NotFound desc = could not find container \"184b8410364efc8dd2bbe2c66a5da731716998db41b931676b3f37f11a279a52\": container with ID starting with 184b8410364efc8dd2bbe2c66a5da731716998db41b931676b3f37f11a279a52 not found: ID does not exist" Feb 19 16:59:51 crc kubenswrapper[4810]: I0219 16:59:51.459690 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb1500ae-62df-4106-92a9-292e4a530f9b" path="/var/lib/kubelet/pods/fb1500ae-62df-4106-92a9-292e4a530f9b/volumes" Feb 19 16:59:51 crc kubenswrapper[4810]: I0219 16:59:51.715454 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5mt74"] Feb 19 16:59:51 crc kubenswrapper[4810]: I0219 16:59:51.715966 4810 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/community-operators-5mt74" podUID="808052e2-d521-47dd-afa0-09caec857462" containerName="registry-server" containerID="cri-o://af9a979525ef507298249d3ca6e031c44578c0f6537e1c2586a5d6e69cd1a9cd" gracePeriod=2 Feb 19 16:59:52 crc kubenswrapper[4810]: I0219 16:59:52.288296 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5mt74" Feb 19 16:59:52 crc kubenswrapper[4810]: I0219 16:59:52.309354 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/808052e2-d521-47dd-afa0-09caec857462-utilities\") pod \"808052e2-d521-47dd-afa0-09caec857462\" (UID: \"808052e2-d521-47dd-afa0-09caec857462\") " Feb 19 16:59:52 crc kubenswrapper[4810]: I0219 16:59:52.309492 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/808052e2-d521-47dd-afa0-09caec857462-catalog-content\") pod \"808052e2-d521-47dd-afa0-09caec857462\" (UID: \"808052e2-d521-47dd-afa0-09caec857462\") " Feb 19 16:59:52 crc kubenswrapper[4810]: I0219 16:59:52.309632 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gqvs\" (UniqueName: \"kubernetes.io/projected/808052e2-d521-47dd-afa0-09caec857462-kube-api-access-5gqvs\") pod \"808052e2-d521-47dd-afa0-09caec857462\" (UID: \"808052e2-d521-47dd-afa0-09caec857462\") " Feb 19 16:59:52 crc kubenswrapper[4810]: I0219 16:59:52.315346 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/808052e2-d521-47dd-afa0-09caec857462-kube-api-access-5gqvs" (OuterVolumeSpecName: "kube-api-access-5gqvs") pod "808052e2-d521-47dd-afa0-09caec857462" (UID: "808052e2-d521-47dd-afa0-09caec857462"). InnerVolumeSpecName "kube-api-access-5gqvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 16:59:52 crc kubenswrapper[4810]: I0219 16:59:52.322401 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/808052e2-d521-47dd-afa0-09caec857462-utilities" (OuterVolumeSpecName: "utilities") pod "808052e2-d521-47dd-afa0-09caec857462" (UID: "808052e2-d521-47dd-afa0-09caec857462"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:59:52 crc kubenswrapper[4810]: I0219 16:59:52.374255 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/808052e2-d521-47dd-afa0-09caec857462-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "808052e2-d521-47dd-afa0-09caec857462" (UID: "808052e2-d521-47dd-afa0-09caec857462"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 16:59:52 crc kubenswrapper[4810]: I0219 16:59:52.412200 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/808052e2-d521-47dd-afa0-09caec857462-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 16:59:52 crc kubenswrapper[4810]: I0219 16:59:52.412242 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/808052e2-d521-47dd-afa0-09caec857462-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 16:59:52 crc kubenswrapper[4810]: I0219 16:59:52.412259 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gqvs\" (UniqueName: \"kubernetes.io/projected/808052e2-d521-47dd-afa0-09caec857462-kube-api-access-5gqvs\") on node \"crc\" DevicePath \"\"" Feb 19 16:59:52 crc kubenswrapper[4810]: I0219 16:59:52.488023 4810 generic.go:334] "Generic (PLEG): container finished" podID="808052e2-d521-47dd-afa0-09caec857462" containerID="af9a979525ef507298249d3ca6e031c44578c0f6537e1c2586a5d6e69cd1a9cd" exitCode=0 Feb 19 16:59:52 crc kubenswrapper[4810]: I0219 16:59:52.488087 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5mt74" event={"ID":"808052e2-d521-47dd-afa0-09caec857462","Type":"ContainerDied","Data":"af9a979525ef507298249d3ca6e031c44578c0f6537e1c2586a5d6e69cd1a9cd"} Feb 19 16:59:52 crc kubenswrapper[4810]: I0219 16:59:52.488101 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5mt74" Feb 19 16:59:52 crc kubenswrapper[4810]: I0219 16:59:52.488128 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5mt74" event={"ID":"808052e2-d521-47dd-afa0-09caec857462","Type":"ContainerDied","Data":"6e87026d9bcbe6b26d8857e5d141664c1b740983c8945dad943f6e6a6bd89c0e"} Feb 19 16:59:52 crc kubenswrapper[4810]: I0219 16:59:52.488157 4810 scope.go:117] "RemoveContainer" containerID="af9a979525ef507298249d3ca6e031c44578c0f6537e1c2586a5d6e69cd1a9cd" Feb 19 16:59:52 crc kubenswrapper[4810]: I0219 16:59:52.526726 4810 scope.go:117] "RemoveContainer" containerID="00b0bfe8666df0ae7322bfa1858d66088b60c8375202d05eb06bee981a6fa745" Feb 19 16:59:52 crc kubenswrapper[4810]: I0219 16:59:52.539700 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5mt74"] Feb 19 16:59:52 crc kubenswrapper[4810]: I0219 16:59:52.550033 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5mt74"] Feb 19 16:59:52 crc kubenswrapper[4810]: I0219 16:59:52.568167 4810 scope.go:117] "RemoveContainer" containerID="5cfcb4b39748777ff912a8c1578b8da3c2c348b0f537f5a9d085ed4073373a1c" Feb 19 16:59:52 crc kubenswrapper[4810]: I0219 16:59:52.637876 4810 scope.go:117] "RemoveContainer" containerID="af9a979525ef507298249d3ca6e031c44578c0f6537e1c2586a5d6e69cd1a9cd" Feb 19 16:59:52 crc kubenswrapper[4810]: E0219 16:59:52.638520 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af9a979525ef507298249d3ca6e031c44578c0f6537e1c2586a5d6e69cd1a9cd\": container with ID starting with af9a979525ef507298249d3ca6e031c44578c0f6537e1c2586a5d6e69cd1a9cd not found: ID does not exist" containerID="af9a979525ef507298249d3ca6e031c44578c0f6537e1c2586a5d6e69cd1a9cd" Feb 19 16:59:52 crc kubenswrapper[4810]: I0219 16:59:52.638561 
4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af9a979525ef507298249d3ca6e031c44578c0f6537e1c2586a5d6e69cd1a9cd"} err="failed to get container status \"af9a979525ef507298249d3ca6e031c44578c0f6537e1c2586a5d6e69cd1a9cd\": rpc error: code = NotFound desc = could not find container \"af9a979525ef507298249d3ca6e031c44578c0f6537e1c2586a5d6e69cd1a9cd\": container with ID starting with af9a979525ef507298249d3ca6e031c44578c0f6537e1c2586a5d6e69cd1a9cd not found: ID does not exist" Feb 19 16:59:52 crc kubenswrapper[4810]: I0219 16:59:52.638604 4810 scope.go:117] "RemoveContainer" containerID="00b0bfe8666df0ae7322bfa1858d66088b60c8375202d05eb06bee981a6fa745" Feb 19 16:59:52 crc kubenswrapper[4810]: E0219 16:59:52.639089 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00b0bfe8666df0ae7322bfa1858d66088b60c8375202d05eb06bee981a6fa745\": container with ID starting with 00b0bfe8666df0ae7322bfa1858d66088b60c8375202d05eb06bee981a6fa745 not found: ID does not exist" containerID="00b0bfe8666df0ae7322bfa1858d66088b60c8375202d05eb06bee981a6fa745" Feb 19 16:59:52 crc kubenswrapper[4810]: I0219 16:59:52.639123 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00b0bfe8666df0ae7322bfa1858d66088b60c8375202d05eb06bee981a6fa745"} err="failed to get container status \"00b0bfe8666df0ae7322bfa1858d66088b60c8375202d05eb06bee981a6fa745\": rpc error: code = NotFound desc = could not find container \"00b0bfe8666df0ae7322bfa1858d66088b60c8375202d05eb06bee981a6fa745\": container with ID starting with 00b0bfe8666df0ae7322bfa1858d66088b60c8375202d05eb06bee981a6fa745 not found: ID does not exist" Feb 19 16:59:52 crc kubenswrapper[4810]: I0219 16:59:52.639144 4810 scope.go:117] "RemoveContainer" containerID="5cfcb4b39748777ff912a8c1578b8da3c2c348b0f537f5a9d085ed4073373a1c" Feb 19 16:59:52 crc kubenswrapper[4810]: E0219 16:59:52.639648 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cfcb4b39748777ff912a8c1578b8da3c2c348b0f537f5a9d085ed4073373a1c\": container with ID starting with 5cfcb4b39748777ff912a8c1578b8da3c2c348b0f537f5a9d085ed4073373a1c not found: ID does not exist" containerID="5cfcb4b39748777ff912a8c1578b8da3c2c348b0f537f5a9d085ed4073373a1c" Feb 19 16:59:52 crc kubenswrapper[4810]: I0219 16:59:52.639682 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cfcb4b39748777ff912a8c1578b8da3c2c348b0f537f5a9d085ed4073373a1c"} err="failed to get container status \"5cfcb4b39748777ff912a8c1578b8da3c2c348b0f537f5a9d085ed4073373a1c\": rpc error: code = NotFound desc = could not find container \"5cfcb4b39748777ff912a8c1578b8da3c2c348b0f537f5a9d085ed4073373a1c\": container with ID starting with 5cfcb4b39748777ff912a8c1578b8da3c2c348b0f537f5a9d085ed4073373a1c not found: ID does not exist" Feb 19 16:59:53 crc kubenswrapper[4810]: I0219 16:59:53.496200 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="808052e2-d521-47dd-afa0-09caec857462" path="/var/lib/kubelet/pods/808052e2-d521-47dd-afa0-09caec857462/volumes" Feb 19 16:59:58 crc kubenswrapper[4810]: I0219 16:59:58.439285 4810 scope.go:117] "RemoveContainer" containerID="a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b" Feb 19 16:59:58 crc kubenswrapper[4810]: E0219 16:59:58.440292 4810 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 17:00:00 crc kubenswrapper[4810]: I0219 17:00:00.160692 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525340-89f56"] Feb 19 17:00:00 crc kubenswrapper[4810]: E0219 17:00:00.161404 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1500ae-62df-4106-92a9-292e4a530f9b" containerName="extract-content" Feb 19 17:00:00 crc kubenswrapper[4810]: I0219 17:00:00.161420 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1500ae-62df-4106-92a9-292e4a530f9b" containerName="extract-content" Feb 19 17:00:00 crc kubenswrapper[4810]: E0219 17:00:00.161463 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1500ae-62df-4106-92a9-292e4a530f9b" containerName="extract-utilities" Feb 19 17:00:00 crc kubenswrapper[4810]: I0219 17:00:00.161472 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1500ae-62df-4106-92a9-292e4a530f9b" containerName="extract-utilities" Feb 19 17:00:00 crc kubenswrapper[4810]: E0219 17:00:00.161495 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="808052e2-d521-47dd-afa0-09caec857462" containerName="extract-content" Feb 19 17:00:00 crc kubenswrapper[4810]: I0219 17:00:00.161503 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="808052e2-d521-47dd-afa0-09caec857462" containerName="extract-content" Feb 19 17:00:00 crc kubenswrapper[4810]: E0219 17:00:00.161528 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="808052e2-d521-47dd-afa0-09caec857462" containerName="registry-server" Feb 19 17:00:00 crc kubenswrapper[4810]: I0219 17:00:00.161538 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="808052e2-d521-47dd-afa0-09caec857462" containerName="registry-server" Feb 19 17:00:00 crc kubenswrapper[4810]: E0219 17:00:00.161553 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="808052e2-d521-47dd-afa0-09caec857462" containerName="extract-utilities" Feb 19 17:00:00 crc kubenswrapper[4810]: I0219 17:00:00.161560 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="808052e2-d521-47dd-afa0-09caec857462" containerName="extract-utilities" Feb 19 17:00:00 crc kubenswrapper[4810]: E0219 17:00:00.161574 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1500ae-62df-4106-92a9-292e4a530f9b" containerName="registry-server" Feb 19 17:00:00 crc kubenswrapper[4810]: I0219 17:00:00.161583 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1500ae-62df-4106-92a9-292e4a530f9b" containerName="registry-server" Feb 19 17:00:00 crc kubenswrapper[4810]: I0219 17:00:00.161810 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb1500ae-62df-4106-92a9-292e4a530f9b" containerName="registry-server" Feb 19 17:00:00 crc kubenswrapper[4810]: I0219 17:00:00.161836 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="808052e2-d521-47dd-afa0-09caec857462" containerName="registry-server" Feb 19 17:00:00 crc kubenswrapper[4810]: I0219 17:00:00.162614 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525340-89f56" Feb 19 17:00:00 crc kubenswrapper[4810]: I0219 17:00:00.164614 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 17:00:00 crc kubenswrapper[4810]: I0219 17:00:00.165676 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 17:00:00 crc kubenswrapper[4810]: I0219 17:00:00.173026 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525340-89f56"] Feb 19 17:00:00 crc kubenswrapper[4810]: I0219 17:00:00.291547 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bcf416e2-d4f9-491d-945e-4c1ae3bf3249-config-volume\") pod \"collect-profiles-29525340-89f56\" (UID: \"bcf416e2-d4f9-491d-945e-4c1ae3bf3249\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525340-89f56" Feb 19 17:00:00 crc kubenswrapper[4810]: I0219 17:00:00.291915 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bcf416e2-d4f9-491d-945e-4c1ae3bf3249-secret-volume\") pod \"collect-profiles-29525340-89f56\" (UID: \"bcf416e2-d4f9-491d-945e-4c1ae3bf3249\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525340-89f56" Feb 19 17:00:00 crc kubenswrapper[4810]: I0219 17:00:00.291980 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ksrp\" (UniqueName: \"kubernetes.io/projected/bcf416e2-d4f9-491d-945e-4c1ae3bf3249-kube-api-access-8ksrp\") pod \"collect-profiles-29525340-89f56\" (UID: \"bcf416e2-d4f9-491d-945e-4c1ae3bf3249\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525340-89f56" Feb 19 17:00:00 crc kubenswrapper[4810]: I0219 17:00:00.394386 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bcf416e2-d4f9-491d-945e-4c1ae3bf3249-secret-volume\") pod \"collect-profiles-29525340-89f56\" (UID: \"bcf416e2-d4f9-491d-945e-4c1ae3bf3249\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525340-89f56" Feb 19 17:00:00 crc kubenswrapper[4810]: I0219 17:00:00.394449 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ksrp\" (UniqueName: \"kubernetes.io/projected/bcf416e2-d4f9-491d-945e-4c1ae3bf3249-kube-api-access-8ksrp\") pod \"collect-profiles-29525340-89f56\" (UID: \"bcf416e2-d4f9-491d-945e-4c1ae3bf3249\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525340-89f56" Feb 19 17:00:00 crc kubenswrapper[4810]: I0219 17:00:00.394498 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bcf416e2-d4f9-491d-945e-4c1ae3bf3249-config-volume\") pod \"collect-profiles-29525340-89f56\" (UID: \"bcf416e2-d4f9-491d-945e-4c1ae3bf3249\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525340-89f56" Feb 19 17:00:00 crc kubenswrapper[4810]: I0219 17:00:00.396193 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bcf416e2-d4f9-491d-945e-4c1ae3bf3249-config-volume\") pod 
\"collect-profiles-29525340-89f56\" (UID: \"bcf416e2-d4f9-491d-945e-4c1ae3bf3249\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525340-89f56" Feb 19 17:00:00 crc kubenswrapper[4810]: I0219 17:00:00.406781 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bcf416e2-d4f9-491d-945e-4c1ae3bf3249-secret-volume\") pod \"collect-profiles-29525340-89f56\" (UID: \"bcf416e2-d4f9-491d-945e-4c1ae3bf3249\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525340-89f56" Feb 19 17:00:00 crc kubenswrapper[4810]: I0219 17:00:00.410008 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ksrp\" (UniqueName: \"kubernetes.io/projected/bcf416e2-d4f9-491d-945e-4c1ae3bf3249-kube-api-access-8ksrp\") pod \"collect-profiles-29525340-89f56\" (UID: \"bcf416e2-d4f9-491d-945e-4c1ae3bf3249\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525340-89f56" Feb 19 17:00:00 crc kubenswrapper[4810]: I0219 17:00:00.500965 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525340-89f56" Feb 19 17:00:01 crc kubenswrapper[4810]: I0219 17:00:01.048744 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525340-89f56"] Feb 19 17:00:01 crc kubenswrapper[4810]: I0219 17:00:01.589030 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525340-89f56" event={"ID":"bcf416e2-d4f9-491d-945e-4c1ae3bf3249","Type":"ContainerStarted","Data":"c6bd58f7b811b8f09702843fce0debaf048c5b58f77fccad890e4b669750caa7"} Feb 19 17:00:01 crc kubenswrapper[4810]: I0219 17:00:01.589404 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525340-89f56" event={"ID":"bcf416e2-d4f9-491d-945e-4c1ae3bf3249","Type":"ContainerStarted","Data":"71a910cf5f3b1bb8ed4465276f60e954d96bfc4e8d308aef1368dfbe0291668a"} Feb 19 17:00:01 crc kubenswrapper[4810]: I0219 17:00:01.628926 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525340-89f56" podStartSLOduration=1.6289033590000002 podStartE2EDuration="1.628903359s" podCreationTimestamp="2026-02-19 17:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 17:00:01.606873987 +0000 UTC m=+6631.088904111" watchObservedRunningTime="2026-02-19 17:00:01.628903359 +0000 UTC m=+6631.110933483" Feb 19 17:00:02 crc kubenswrapper[4810]: I0219 17:00:02.603961 4810 generic.go:334] "Generic (PLEG): container finished" podID="bcf416e2-d4f9-491d-945e-4c1ae3bf3249" containerID="c6bd58f7b811b8f09702843fce0debaf048c5b58f77fccad890e4b669750caa7" exitCode=0 Feb 19 17:00:02 crc kubenswrapper[4810]: I0219 17:00:02.604039 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525340-89f56" event={"ID":"bcf416e2-d4f9-491d-945e-4c1ae3bf3249","Type":"ContainerDied","Data":"c6bd58f7b811b8f09702843fce0debaf048c5b58f77fccad890e4b669750caa7"} Feb 19 17:00:04 crc kubenswrapper[4810]: I0219 17:00:04.117224 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525340-89f56" Feb 19 17:00:04 crc kubenswrapper[4810]: I0219 17:00:04.182082 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bcf416e2-d4f9-491d-945e-4c1ae3bf3249-secret-volume\") pod \"bcf416e2-d4f9-491d-945e-4c1ae3bf3249\" (UID: \"bcf416e2-d4f9-491d-945e-4c1ae3bf3249\") " Feb 19 17:00:04 crc kubenswrapper[4810]: I0219 17:00:04.182136 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bcf416e2-d4f9-491d-945e-4c1ae3bf3249-config-volume\") pod \"bcf416e2-d4f9-491d-945e-4c1ae3bf3249\" (UID: \"bcf416e2-d4f9-491d-945e-4c1ae3bf3249\") " Feb 19 17:00:04 crc kubenswrapper[4810]: I0219 17:00:04.182259 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ksrp\" (UniqueName: \"kubernetes.io/projected/bcf416e2-d4f9-491d-945e-4c1ae3bf3249-kube-api-access-8ksrp\") pod \"bcf416e2-d4f9-491d-945e-4c1ae3bf3249\" (UID: \"bcf416e2-d4f9-491d-945e-4c1ae3bf3249\") " Feb 19 17:00:04 crc kubenswrapper[4810]: I0219 17:00:04.183267 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcf416e2-d4f9-491d-945e-4c1ae3bf3249-config-volume" (OuterVolumeSpecName: "config-volume") pod "bcf416e2-d4f9-491d-945e-4c1ae3bf3249" (UID: "bcf416e2-d4f9-491d-945e-4c1ae3bf3249"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 17:00:04 crc kubenswrapper[4810]: I0219 17:00:04.188869 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcf416e2-d4f9-491d-945e-4c1ae3bf3249-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bcf416e2-d4f9-491d-945e-4c1ae3bf3249" (UID: "bcf416e2-d4f9-491d-945e-4c1ae3bf3249"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 17:00:04 crc kubenswrapper[4810]: I0219 17:00:04.196590 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcf416e2-d4f9-491d-945e-4c1ae3bf3249-kube-api-access-8ksrp" (OuterVolumeSpecName: "kube-api-access-8ksrp") pod "bcf416e2-d4f9-491d-945e-4c1ae3bf3249" (UID: "bcf416e2-d4f9-491d-945e-4c1ae3bf3249"). InnerVolumeSpecName "kube-api-access-8ksrp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 17:00:04 crc kubenswrapper[4810]: I0219 17:00:04.284274 4810 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bcf416e2-d4f9-491d-945e-4c1ae3bf3249-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 17:00:04 crc kubenswrapper[4810]: I0219 17:00:04.284309 4810 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bcf416e2-d4f9-491d-945e-4c1ae3bf3249-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 17:00:04 crc kubenswrapper[4810]: I0219 17:00:04.284319 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ksrp\" (UniqueName: \"kubernetes.io/projected/bcf416e2-d4f9-491d-945e-4c1ae3bf3249-kube-api-access-8ksrp\") on node \"crc\" DevicePath \"\"" Feb 19 17:00:04 crc kubenswrapper[4810]: I0219 17:00:04.583188 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525295-b9xtq"] Feb 19 17:00:04 crc kubenswrapper[4810]: I0219 17:00:04.596124 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525295-b9xtq"] Feb 19 17:00:04 crc kubenswrapper[4810]: I0219 17:00:04.629985 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525340-89f56" event={"ID":"bcf416e2-d4f9-491d-945e-4c1ae3bf3249","Type":"ContainerDied","Data":"71a910cf5f3b1bb8ed4465276f60e954d96bfc4e8d308aef1368dfbe0291668a"} Feb 19 17:00:04 crc kubenswrapper[4810]: I0219 17:00:04.630048 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71a910cf5f3b1bb8ed4465276f60e954d96bfc4e8d308aef1368dfbe0291668a" Feb 19 17:00:04 crc kubenswrapper[4810]: I0219 17:00:04.630080 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525340-89f56" Feb 19 17:00:05 crc kubenswrapper[4810]: I0219 17:00:05.452493 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99fb536c-bd62-47d3-87d6-9f56d3e51f72" path="/var/lib/kubelet/pods/99fb536c-bd62-47d3-87d6-9f56d3e51f72/volumes" Feb 19 17:00:10 crc kubenswrapper[4810]: I0219 17:00:10.439545 4810 scope.go:117] "RemoveContainer" containerID="a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b" Feb 19 17:00:10 crc kubenswrapper[4810]: E0219 17:00:10.440264 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 17:00:22 crc kubenswrapper[4810]: I0219 17:00:22.442808 4810 scope.go:117] "RemoveContainer" containerID="a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b" Feb 19 17:00:22 crc kubenswrapper[4810]: E0219 17:00:22.443953 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 17:00:33 crc kubenswrapper[4810]: I0219 17:00:33.439171 4810 scope.go:117] "RemoveContainer" containerID="a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b" Feb 19 17:00:33 crc kubenswrapper[4810]: E0219 17:00:33.439863 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 17:00:46 crc kubenswrapper[4810]: I0219 17:00:46.380246 4810 scope.go:117] "RemoveContainer" containerID="932479730c0a6c125151920519504a29cae36b5a433b12d52524833608c85c05" Feb 19 17:00:48 crc kubenswrapper[4810]: I0219 17:00:48.440001 4810 scope.go:117] "RemoveContainer" containerID="a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b" Feb 19 17:00:48 crc kubenswrapper[4810]: E0219 17:00:48.441444 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 17:00:53 crc kubenswrapper[4810]: I0219 17:00:53.891794 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lswlw"] Feb 19 17:00:53 crc kubenswrapper[4810]: E0219 17:00:53.893272 4810 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bcf416e2-d4f9-491d-945e-4c1ae3bf3249" containerName="collect-profiles" Feb 19 17:00:53 crc kubenswrapper[4810]: I0219 17:00:53.893294 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcf416e2-d4f9-491d-945e-4c1ae3bf3249" containerName="collect-profiles" Feb 19 17:00:53 crc kubenswrapper[4810]: I0219 17:00:53.893738 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcf416e2-d4f9-491d-945e-4c1ae3bf3249" containerName="collect-profiles" Feb 19 17:00:53 crc kubenswrapper[4810]: I0219 17:00:53.896472 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lswlw" Feb 19 17:00:53 crc kubenswrapper[4810]: I0219 17:00:53.930450 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lswlw"] Feb 19 17:00:54 crc kubenswrapper[4810]: I0219 17:00:54.027050 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8knm\" (UniqueName: \"kubernetes.io/projected/a8a71d42-1d52-4e61-a884-bb624373e783-kube-api-access-b8knm\") pod \"redhat-operators-lswlw\" (UID: \"a8a71d42-1d52-4e61-a884-bb624373e783\") " pod="openshift-marketplace/redhat-operators-lswlw" Feb 19 17:00:54 crc kubenswrapper[4810]: I0219 17:00:54.027285 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8a71d42-1d52-4e61-a884-bb624373e783-utilities\") pod \"redhat-operators-lswlw\" (UID: \"a8a71d42-1d52-4e61-a884-bb624373e783\") " pod="openshift-marketplace/redhat-operators-lswlw" Feb 19 17:00:54 crc kubenswrapper[4810]: I0219 17:00:54.027543 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8a71d42-1d52-4e61-a884-bb624373e783-catalog-content\") pod \"redhat-operators-lswlw\" (UID: \"a8a71d42-1d52-4e61-a884-bb624373e783\") " pod="openshift-marketplace/redhat-operators-lswlw" Feb 19 17:00:54 crc kubenswrapper[4810]: I0219 17:00:54.129859 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8knm\" (UniqueName: \"kubernetes.io/projected/a8a71d42-1d52-4e61-a884-bb624373e783-kube-api-access-b8knm\") pod \"redhat-operators-lswlw\" (UID: \"a8a71d42-1d52-4e61-a884-bb624373e783\") " pod="openshift-marketplace/redhat-operators-lswlw" Feb 19 17:00:54 crc kubenswrapper[4810]: I0219 17:00:54.129946 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8a71d42-1d52-4e61-a884-bb624373e783-utilities\") pod \"redhat-operators-lswlw\" (UID: \"a8a71d42-1d52-4e61-a884-bb624373e783\") " pod="openshift-marketplace/redhat-operators-lswlw" Feb 19 17:00:54 crc kubenswrapper[4810]: I0219 17:00:54.129985 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8a71d42-1d52-4e61-a884-bb624373e783-catalog-content\") pod \"redhat-operators-lswlw\" (UID: \"a8a71d42-1d52-4e61-a884-bb624373e783\") " pod="openshift-marketplace/redhat-operators-lswlw" Feb 19 17:00:54 crc kubenswrapper[4810]: I0219 17:00:54.130453 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8a71d42-1d52-4e61-a884-bb624373e783-catalog-content\") pod \"redhat-operators-lswlw\" (UID: 
\"a8a71d42-1d52-4e61-a884-bb624373e783\") " pod="openshift-marketplace/redhat-operators-lswlw" Feb 19 17:00:54 crc kubenswrapper[4810]: I0219 17:00:54.130554 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8a71d42-1d52-4e61-a884-bb624373e783-utilities\") pod \"redhat-operators-lswlw\" (UID: \"a8a71d42-1d52-4e61-a884-bb624373e783\") " pod="openshift-marketplace/redhat-operators-lswlw" Feb 19 17:00:54 crc kubenswrapper[4810]: I0219 17:00:54.148678 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8knm\" (UniqueName: \"kubernetes.io/projected/a8a71d42-1d52-4e61-a884-bb624373e783-kube-api-access-b8knm\") pod \"redhat-operators-lswlw\" (UID: \"a8a71d42-1d52-4e61-a884-bb624373e783\") " pod="openshift-marketplace/redhat-operators-lswlw" Feb 19 17:00:54 crc kubenswrapper[4810]: I0219 17:00:54.240279 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lswlw" Feb 19 17:00:54 crc kubenswrapper[4810]: I0219 17:00:54.752921 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lswlw"] Feb 19 17:00:55 crc kubenswrapper[4810]: I0219 17:00:55.206700 4810 generic.go:334] "Generic (PLEG): container finished" podID="a8a71d42-1d52-4e61-a884-bb624373e783" containerID="d4e42978c3a6a470b88c9495d08ef7718003b3f791602697fe71e2d3574a2e01" exitCode=0 Feb 19 17:00:55 crc kubenswrapper[4810]: I0219 17:00:55.206770 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lswlw" event={"ID":"a8a71d42-1d52-4e61-a884-bb624373e783","Type":"ContainerDied","Data":"d4e42978c3a6a470b88c9495d08ef7718003b3f791602697fe71e2d3574a2e01"} Feb 19 17:00:55 crc kubenswrapper[4810]: I0219 17:00:55.207005 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lswlw" event={"ID":"a8a71d42-1d52-4e61-a884-bb624373e783","Type":"ContainerStarted","Data":"855fe77aae20a3a916acc865c1e7cb25af7b033cecc87ffd8153ba29c46257f8"} Feb 19 17:00:56 crc kubenswrapper[4810]: I0219 17:00:56.218004 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lswlw" event={"ID":"a8a71d42-1d52-4e61-a884-bb624373e783","Type":"ContainerStarted","Data":"f8b61cb82e6bbf75f041346f19b5205afe4541708e98ac877a4ce752aa9482c7"} Feb 19 17:01:00 crc kubenswrapper[4810]: I0219 17:01:00.163176 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29525341-lgcg6"] Feb 19 17:01:00 crc kubenswrapper[4810]: I0219 17:01:00.166426 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29525341-lgcg6" Feb 19 17:01:00 crc kubenswrapper[4810]: I0219 17:01:00.177606 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29525341-lgcg6"] Feb 19 17:01:00 crc kubenswrapper[4810]: I0219 17:01:00.270847 4810 generic.go:334] "Generic (PLEG): container finished" podID="a8a71d42-1d52-4e61-a884-bb624373e783" containerID="f8b61cb82e6bbf75f041346f19b5205afe4541708e98ac877a4ce752aa9482c7" exitCode=0 Feb 19 17:01:00 crc kubenswrapper[4810]: I0219 17:01:00.270899 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lswlw" event={"ID":"a8a71d42-1d52-4e61-a884-bb624373e783","Type":"ContainerDied","Data":"f8b61cb82e6bbf75f041346f19b5205afe4541708e98ac877a4ce752aa9482c7"} Feb 19 17:01:00 crc kubenswrapper[4810]: I0219 17:01:00.271033 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55b3945c-6038-4494-bb56-0ec80ec66510-config-data\") pod \"keystone-cron-29525341-lgcg6\" (UID: \"55b3945c-6038-4494-bb56-0ec80ec66510\") " pod="openstack/keystone-cron-29525341-lgcg6" Feb 19 17:01:00 crc kubenswrapper[4810]: I0219 17:01:00.271276 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55b3945c-6038-4494-bb56-0ec80ec66510-fernet-keys\") pod \"keystone-cron-29525341-lgcg6\" (UID: \"55b3945c-6038-4494-bb56-0ec80ec66510\") " pod="openstack/keystone-cron-29525341-lgcg6" Feb 19 17:01:00 crc kubenswrapper[4810]: I0219 17:01:00.271355 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55b3945c-6038-4494-bb56-0ec80ec66510-combined-ca-bundle\") pod \"keystone-cron-29525341-lgcg6\" (UID: \"55b3945c-6038-4494-bb56-0ec80ec66510\") " pod="openstack/keystone-cron-29525341-lgcg6" Feb 19 17:01:00 crc kubenswrapper[4810]: I0219 17:01:00.271436 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm4tv\" (UniqueName: \"kubernetes.io/projected/55b3945c-6038-4494-bb56-0ec80ec66510-kube-api-access-gm4tv\") pod \"keystone-cron-29525341-lgcg6\" (UID: \"55b3945c-6038-4494-bb56-0ec80ec66510\") " pod="openstack/keystone-cron-29525341-lgcg6" Feb 19 17:01:00 crc kubenswrapper[4810]: I0219 17:01:00.373100 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55b3945c-6038-4494-bb56-0ec80ec66510-config-data\") pod \"keystone-cron-29525341-lgcg6\" (UID: \"55b3945c-6038-4494-bb56-0ec80ec66510\") " pod="openstack/keystone-cron-29525341-lgcg6" Feb 19 17:01:00 crc kubenswrapper[4810]: I0219 17:01:00.373565 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55b3945c-6038-4494-bb56-0ec80ec66510-fernet-keys\") pod \"keystone-cron-29525341-lgcg6\" (UID: \"55b3945c-6038-4494-bb56-0ec80ec66510\") " pod="openstack/keystone-cron-29525341-lgcg6" Feb 19 17:01:00 crc kubenswrapper[4810]: I0219 17:01:00.373604 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55b3945c-6038-4494-bb56-0ec80ec66510-combined-ca-bundle\") pod \"keystone-cron-29525341-lgcg6\" (UID: \"55b3945c-6038-4494-bb56-0ec80ec66510\") " 
pod="openstack/keystone-cron-29525341-lgcg6" Feb 19 17:01:00 crc kubenswrapper[4810]: I0219 17:01:00.373662 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm4tv\" (UniqueName: \"kubernetes.io/projected/55b3945c-6038-4494-bb56-0ec80ec66510-kube-api-access-gm4tv\") pod \"keystone-cron-29525341-lgcg6\" (UID: \"55b3945c-6038-4494-bb56-0ec80ec66510\") " pod="openstack/keystone-cron-29525341-lgcg6" Feb 19 17:01:00 crc kubenswrapper[4810]: I0219 17:01:00.380725 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55b3945c-6038-4494-bb56-0ec80ec66510-combined-ca-bundle\") pod \"keystone-cron-29525341-lgcg6\" (UID: \"55b3945c-6038-4494-bb56-0ec80ec66510\") " pod="openstack/keystone-cron-29525341-lgcg6" Feb 19 17:01:00 crc kubenswrapper[4810]: I0219 17:01:00.380988 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55b3945c-6038-4494-bb56-0ec80ec66510-config-data\") pod \"keystone-cron-29525341-lgcg6\" (UID: \"55b3945c-6038-4494-bb56-0ec80ec66510\") " pod="openstack/keystone-cron-29525341-lgcg6" Feb 19 17:01:00 crc kubenswrapper[4810]: I0219 17:01:00.396573 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55b3945c-6038-4494-bb56-0ec80ec66510-fernet-keys\") pod \"keystone-cron-29525341-lgcg6\" (UID: \"55b3945c-6038-4494-bb56-0ec80ec66510\") " pod="openstack/keystone-cron-29525341-lgcg6" Feb 19 17:01:00 crc kubenswrapper[4810]: I0219 17:01:00.398096 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm4tv\" (UniqueName: \"kubernetes.io/projected/55b3945c-6038-4494-bb56-0ec80ec66510-kube-api-access-gm4tv\") pod \"keystone-cron-29525341-lgcg6\" (UID: \"55b3945c-6038-4494-bb56-0ec80ec66510\") " pod="openstack/keystone-cron-29525341-lgcg6" Feb 19 17:01:00 crc kubenswrapper[4810]: I0219 17:01:00.500092 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29525341-lgcg6" Feb 19 17:01:01 crc kubenswrapper[4810]: I0219 17:01:01.112359 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29525341-lgcg6"] Feb 19 17:01:01 crc kubenswrapper[4810]: W0219 17:01:01.113615 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55b3945c_6038_4494_bb56_0ec80ec66510.slice/crio-6c211bd33da59f17b3fd9922d7f33b134093e0916a593721b360bdff400d312e WatchSource:0}: Error finding container 6c211bd33da59f17b3fd9922d7f33b134093e0916a593721b360bdff400d312e: Status 404 returned error can't find the container with id 6c211bd33da59f17b3fd9922d7f33b134093e0916a593721b360bdff400d312e Feb 19 17:01:01 crc kubenswrapper[4810]: I0219 17:01:01.285294 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525341-lgcg6" event={"ID":"55b3945c-6038-4494-bb56-0ec80ec66510","Type":"ContainerStarted","Data":"6c211bd33da59f17b3fd9922d7f33b134093e0916a593721b360bdff400d312e"} Feb 19 17:01:02 crc kubenswrapper[4810]: I0219 17:01:02.297773 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lswlw" event={"ID":"a8a71d42-1d52-4e61-a884-bb624373e783","Type":"ContainerStarted","Data":"13477726b2a82632cdf0e65bb570525ed9b08d31d20c93710785a2d9d52254a9"} Feb 19 17:01:02 crc kubenswrapper[4810]: I0219 17:01:02.299465 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525341-lgcg6" event={"ID":"55b3945c-6038-4494-bb56-0ec80ec66510","Type":"ContainerStarted","Data":"84a9eb8c272dc4fabb345cbeb55b5a7de0453811a5eb16c5fbd0f1464471b84a"} Feb 19 17:01:02 crc kubenswrapper[4810]: I0219 17:01:02.323533 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lswlw" podStartSLOduration=3.763602895 podStartE2EDuration="9.323510454s" podCreationTimestamp="2026-02-19 17:00:53 +0000 UTC" firstStartedPulling="2026-02-19 17:00:55.208315192 +0000 UTC m=+6684.690345316" lastFinishedPulling="2026-02-19 17:01:00.768222751 +0000 UTC m=+6690.250252875" observedRunningTime="2026-02-19 17:01:02.316523382 +0000 UTC m=+6691.798553526" watchObservedRunningTime="2026-02-19 17:01:02.323510454 +0000 UTC m=+6691.805540578" Feb 19 17:01:02 crc kubenswrapper[4810]: I0219 17:01:02.349856 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29525341-lgcg6" podStartSLOduration=2.34975703 podStartE2EDuration="2.34975703s" podCreationTimestamp="2026-02-19 17:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 17:01:02.341355083 +0000 UTC m=+6691.823385207" watchObservedRunningTime="2026-02-19 17:01:02.34975703 +0000 UTC m=+6691.831787164" Feb 19 17:01:03 crc kubenswrapper[4810]: I0219 17:01:03.440388 4810 scope.go:117] "RemoveContainer" containerID="a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b" Feb 19 17:01:03 crc kubenswrapper[4810]: E0219 17:01:03.441144 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" 
podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 17:01:04 crc kubenswrapper[4810]: I0219 17:01:04.241370 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lswlw" Feb 19 17:01:04 crc kubenswrapper[4810]: I0219 17:01:04.241805 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lswlw" Feb 19 17:01:05 crc kubenswrapper[4810]: I0219 17:01:05.297030 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lswlw" podUID="a8a71d42-1d52-4e61-a884-bb624373e783" containerName="registry-server" probeResult="failure" output=< Feb 19 17:01:05 crc kubenswrapper[4810]: timeout: failed to connect service ":50051" within 1s Feb 19 17:01:05 crc kubenswrapper[4810]: > Feb 19 17:01:07 crc kubenswrapper[4810]: I0219 17:01:07.354917 4810 generic.go:334] "Generic (PLEG): container finished" podID="55b3945c-6038-4494-bb56-0ec80ec66510" containerID="84a9eb8c272dc4fabb345cbeb55b5a7de0453811a5eb16c5fbd0f1464471b84a" exitCode=0 Feb 19 17:01:07 crc kubenswrapper[4810]: I0219 17:01:07.355362 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525341-lgcg6" event={"ID":"55b3945c-6038-4494-bb56-0ec80ec66510","Type":"ContainerDied","Data":"84a9eb8c272dc4fabb345cbeb55b5a7de0453811a5eb16c5fbd0f1464471b84a"} Feb 19 17:01:08 crc kubenswrapper[4810]: I0219 17:01:08.829622 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29525341-lgcg6" Feb 19 17:01:08 crc kubenswrapper[4810]: I0219 17:01:08.993566 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55b3945c-6038-4494-bb56-0ec80ec66510-config-data\") pod \"55b3945c-6038-4494-bb56-0ec80ec66510\" (UID: \"55b3945c-6038-4494-bb56-0ec80ec66510\") " Feb 19 17:01:08 crc kubenswrapper[4810]: I0219 17:01:08.994668 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55b3945c-6038-4494-bb56-0ec80ec66510-combined-ca-bundle\") pod \"55b3945c-6038-4494-bb56-0ec80ec66510\" (UID: \"55b3945c-6038-4494-bb56-0ec80ec66510\") " Feb 19 17:01:08 crc kubenswrapper[4810]: I0219 17:01:08.994755 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm4tv\" (UniqueName: \"kubernetes.io/projected/55b3945c-6038-4494-bb56-0ec80ec66510-kube-api-access-gm4tv\") pod \"55b3945c-6038-4494-bb56-0ec80ec66510\" (UID: \"55b3945c-6038-4494-bb56-0ec80ec66510\") " Feb 19 17:01:08 crc kubenswrapper[4810]: I0219 17:01:08.994842 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55b3945c-6038-4494-bb56-0ec80ec66510-fernet-keys\") pod \"55b3945c-6038-4494-bb56-0ec80ec66510\" (UID: \"55b3945c-6038-4494-bb56-0ec80ec66510\") " Feb 19 17:01:08 crc kubenswrapper[4810]: I0219 17:01:08.999529 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55b3945c-6038-4494-bb56-0ec80ec66510-kube-api-access-gm4tv" (OuterVolumeSpecName: "kube-api-access-gm4tv") pod "55b3945c-6038-4494-bb56-0ec80ec66510" (UID: "55b3945c-6038-4494-bb56-0ec80ec66510"). InnerVolumeSpecName "kube-api-access-gm4tv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 17:01:09 crc kubenswrapper[4810]: I0219 17:01:09.003886 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55b3945c-6038-4494-bb56-0ec80ec66510-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "55b3945c-6038-4494-bb56-0ec80ec66510" (UID: "55b3945c-6038-4494-bb56-0ec80ec66510"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 17:01:09 crc kubenswrapper[4810]: I0219 17:01:09.032990 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55b3945c-6038-4494-bb56-0ec80ec66510-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55b3945c-6038-4494-bb56-0ec80ec66510" (UID: "55b3945c-6038-4494-bb56-0ec80ec66510"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 17:01:09 crc kubenswrapper[4810]: I0219 17:01:09.098960 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55b3945c-6038-4494-bb56-0ec80ec66510-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 17:01:09 crc kubenswrapper[4810]: I0219 17:01:09.099022 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm4tv\" (UniqueName: \"kubernetes.io/projected/55b3945c-6038-4494-bb56-0ec80ec66510-kube-api-access-gm4tv\") on node \"crc\" DevicePath \"\"" Feb 19 17:01:09 crc kubenswrapper[4810]: I0219 17:01:09.099046 4810 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55b3945c-6038-4494-bb56-0ec80ec66510-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 17:01:09 crc kubenswrapper[4810]: I0219 17:01:09.100709 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55b3945c-6038-4494-bb56-0ec80ec66510-config-data" (OuterVolumeSpecName: "config-data") pod "55b3945c-6038-4494-bb56-0ec80ec66510" (UID: "55b3945c-6038-4494-bb56-0ec80ec66510"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 17:01:09 crc kubenswrapper[4810]: I0219 17:01:09.201683 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55b3945c-6038-4494-bb56-0ec80ec66510-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 17:01:09 crc kubenswrapper[4810]: I0219 17:01:09.379904 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525341-lgcg6" event={"ID":"55b3945c-6038-4494-bb56-0ec80ec66510","Type":"ContainerDied","Data":"6c211bd33da59f17b3fd9922d7f33b134093e0916a593721b360bdff400d312e"} Feb 19 17:01:09 crc kubenswrapper[4810]: I0219 17:01:09.379954 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c211bd33da59f17b3fd9922d7f33b134093e0916a593721b360bdff400d312e" Feb 19 17:01:09 crc kubenswrapper[4810]: I0219 17:01:09.380015 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29525341-lgcg6" Feb 19 17:01:14 crc kubenswrapper[4810]: I0219 17:01:14.325782 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lswlw" Feb 19 17:01:14 crc kubenswrapper[4810]: I0219 17:01:14.410626 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lswlw" Feb 19 17:01:14 crc kubenswrapper[4810]: I0219 17:01:14.445481 4810 scope.go:117] "RemoveContainer" containerID="a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b" Feb 19 17:01:14 crc kubenswrapper[4810]: E0219 17:01:14.448687 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 17:01:19 crc kubenswrapper[4810]: I0219 17:01:19.143867 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lswlw"] Feb 19 17:01:19 crc kubenswrapper[4810]: I0219 17:01:19.144975 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lswlw" podUID="a8a71d42-1d52-4e61-a884-bb624373e783" containerName="registry-server" containerID="cri-o://13477726b2a82632cdf0e65bb570525ed9b08d31d20c93710785a2d9d52254a9" gracePeriod=2 Feb 19 17:01:19 crc kubenswrapper[4810]: I0219 17:01:19.533051 4810 generic.go:334] "Generic (PLEG): container finished" podID="a8a71d42-1d52-4e61-a884-bb624373e783" containerID="13477726b2a82632cdf0e65bb570525ed9b08d31d20c93710785a2d9d52254a9" exitCode=0 Feb 19 17:01:19 crc kubenswrapper[4810]: I0219 17:01:19.533317 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lswlw" event={"ID":"a8a71d42-1d52-4e61-a884-bb624373e783","Type":"ContainerDied","Data":"13477726b2a82632cdf0e65bb570525ed9b08d31d20c93710785a2d9d52254a9"} Feb 19 17:01:19 crc kubenswrapper[4810]: I0219 17:01:19.533380 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lswlw" event={"ID":"a8a71d42-1d52-4e61-a884-bb624373e783","Type":"ContainerDied","Data":"855fe77aae20a3a916acc865c1e7cb25af7b033cecc87ffd8153ba29c46257f8"} Feb 19 17:01:19 crc kubenswrapper[4810]: I0219 17:01:19.533394 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="855fe77aae20a3a916acc865c1e7cb25af7b033cecc87ffd8153ba29c46257f8" Feb 19 17:01:19 crc kubenswrapper[4810]: I0219 17:01:19.604967 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lswlw" Feb 19 17:01:19 crc kubenswrapper[4810]: I0219 17:01:19.678912 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8a71d42-1d52-4e61-a884-bb624373e783-catalog-content\") pod \"a8a71d42-1d52-4e61-a884-bb624373e783\" (UID: \"a8a71d42-1d52-4e61-a884-bb624373e783\") " Feb 19 17:01:19 crc kubenswrapper[4810]: I0219 17:01:19.679280 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8a71d42-1d52-4e61-a884-bb624373e783-utilities\") pod \"a8a71d42-1d52-4e61-a884-bb624373e783\" (UID: \"a8a71d42-1d52-4e61-a884-bb624373e783\") " Feb 19 17:01:19 crc kubenswrapper[4810]: I0219 17:01:19.679831 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8a71d42-1d52-4e61-a884-bb624373e783-utilities" (OuterVolumeSpecName: "utilities") pod "a8a71d42-1d52-4e61-a884-bb624373e783" (UID: "a8a71d42-1d52-4e61-a884-bb624373e783"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 17:01:19 crc kubenswrapper[4810]: I0219 17:01:19.679814 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8knm\" (UniqueName: \"kubernetes.io/projected/a8a71d42-1d52-4e61-a884-bb624373e783-kube-api-access-b8knm\") pod \"a8a71d42-1d52-4e61-a884-bb624373e783\" (UID: \"a8a71d42-1d52-4e61-a884-bb624373e783\") " Feb 19 17:01:19 crc kubenswrapper[4810]: I0219 17:01:19.680846 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8a71d42-1d52-4e61-a884-bb624373e783-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 17:01:19 crc kubenswrapper[4810]: I0219 17:01:19.696615 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8a71d42-1d52-4e61-a884-bb624373e783-kube-api-access-b8knm" (OuterVolumeSpecName: "kube-api-access-b8knm") pod "a8a71d42-1d52-4e61-a884-bb624373e783" (UID: "a8a71d42-1d52-4e61-a884-bb624373e783"). InnerVolumeSpecName "kube-api-access-b8knm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 17:01:19 crc kubenswrapper[4810]: I0219 17:01:19.782394 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8knm\" (UniqueName: \"kubernetes.io/projected/a8a71d42-1d52-4e61-a884-bb624373e783-kube-api-access-b8knm\") on node \"crc\" DevicePath \"\"" Feb 19 17:01:19 crc kubenswrapper[4810]: I0219 17:01:19.809675 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8a71d42-1d52-4e61-a884-bb624373e783-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8a71d42-1d52-4e61-a884-bb624373e783" (UID: "a8a71d42-1d52-4e61-a884-bb624373e783"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 17:01:19 crc kubenswrapper[4810]: I0219 17:01:19.884581 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8a71d42-1d52-4e61-a884-bb624373e783-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 17:01:20 crc kubenswrapper[4810]: I0219 17:01:20.541680 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lswlw" Feb 19 17:01:20 crc kubenswrapper[4810]: I0219 17:01:20.583913 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lswlw"] Feb 19 17:01:20 crc kubenswrapper[4810]: I0219 17:01:20.593065 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lswlw"] Feb 19 17:01:21 crc kubenswrapper[4810]: I0219 17:01:21.464942 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8a71d42-1d52-4e61-a884-bb624373e783" path="/var/lib/kubelet/pods/a8a71d42-1d52-4e61-a884-bb624373e783/volumes" Feb 19 17:01:28 crc kubenswrapper[4810]: I0219 17:01:28.439183 4810 scope.go:117] "RemoveContainer" containerID="a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b" Feb 19 17:01:28 crc kubenswrapper[4810]: E0219 17:01:28.439940 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 17:01:41 crc kubenswrapper[4810]: I0219 17:01:41.453125 4810 scope.go:117] "RemoveContainer" containerID="a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b" Feb 19 17:01:41 crc kubenswrapper[4810]: E0219 17:01:41.453863 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t499d_openshift-machine-config-operator(dfbf74ef-1e94-4826-8583-42b2e246ccf3)\"" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" Feb 19 17:01:55 crc kubenswrapper[4810]: I0219 17:01:55.439666 4810 scope.go:117] "RemoveContainer" containerID="a0ba7d3250d70141c7fee7e449509a680dcfa78b2c2d7c2d1cd7f9464f6ba11b" Feb 19 17:01:56 crc kubenswrapper[4810]: I0219 17:01:56.016848 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t499d" event={"ID":"dfbf74ef-1e94-4826-8583-42b2e246ccf3","Type":"ContainerStarted","Data":"cb9191dca31827157c78f39124e1d9c08f5e7d9be848f27d1b0a60fa20d7dfe4"} Feb 19 17:04:19 crc kubenswrapper[4810]: I0219 17:04:19.537531 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 17:04:19 crc kubenswrapper[4810]: I0219 17:04:19.539377 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 17:04:49 crc kubenswrapper[4810]: I0219 17:04:49.537718 4810 patch_prober.go:28] interesting pod/machine-config-daemon-t499d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 17:04:49 crc kubenswrapper[4810]: I0219 17:04:49.538535 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t499d" podUID="dfbf74ef-1e94-4826-8583-42b2e246ccf3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" var/home/core/zuul-output/logs/crc-cloud-workdir-crc-all-logs.tar.gz0000644000175000000000000000005515145641300024444 0ustar coreroot  Om77'(var/home/core/zuul-output/logs/crc-cloud/0000755000175000000000000000000015145641301017362 5ustar corerootvar/home/core/zuul-output/artifacts/0000755000175000017500000000000015145623302016506 5ustar corecorevar/home/core/zuul-output/docs/0000755000175000017500000000000015145623303015457 5ustar corecore